Mar 20 15:23:01 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 20 15:23:01 crc restorecon[4684]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 15:23:01 crc restorecon[4684]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 20 15:23:01 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 20 15:23:01 crc restorecon[4684]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc 
restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 15:23:02 crc 
restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 
15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 
crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 
15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 15:23:02 crc 
restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc 
restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 
crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc 
restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc 
restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc 
restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc 
restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 15:23:02 crc 
restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 15:23:02 crc restorecon[4684]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 15:23:02 crc restorecon[4684]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 15:23:02 crc restorecon[4684]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 20 15:23:03 crc kubenswrapper[4779]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 15:23:03 crc kubenswrapper[4779]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 20 15:23:03 crc kubenswrapper[4779]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 15:23:03 crc kubenswrapper[4779]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 20 15:23:03 crc kubenswrapper[4779]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 20 15:23:03 crc kubenswrapper[4779]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.540282 4779 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548262 4779 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548298 4779 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548311 4779 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548323 4779 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548332 4779 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548341 4779 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548351 4779 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548359 4779 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548368 4779 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548375 4779 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548384 4779 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548393 4779 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548402 4779 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548411 4779 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548420 4779 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548429 4779 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548438 4779 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration 
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548446 4779 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548454 4779 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548462 4779 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548470 4779 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548478 4779 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548486 4779 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548494 4779 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548503 4779 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548514 4779 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548533 4779 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548542 4779 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548550 4779 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548557 4779 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548565 4779 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548573 4779 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548581 4779 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548589 4779 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548596 4779 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548604 4779 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548612 4779 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548620 4779 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548628 4779 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548637 4779 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548645 4779 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548653 4779 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548661 4779 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548669 4779 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548677 4779 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548685 4779 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548693 4779 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548701 4779 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548708 4779 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548716 4779 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548727 4779 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548734 4779 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548742 4779 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548750 4779 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548757 4779 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548765 4779 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548773 4779 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548781 4779 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548790 4779 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548801 4779 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548811 4779 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548820 4779 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548828 4779 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548840 4779 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548850 4779 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548859 4779 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548868 4779 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548876 4779 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548884 4779 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548892 4779 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.548900 4779 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.550863 4779 flags.go:64] FLAG: --address="0.0.0.0"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.550890 4779 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.550909 4779 flags.go:64] FLAG: --anonymous-auth="true"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.550922 4779 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.550933 4779 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.550943 4779 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.550956 4779 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.550967 4779 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.550977 4779 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.550986 4779 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.550996 4779 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551006 4779 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551015 4779 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551025 4779 flags.go:64] FLAG: --cgroup-root=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551034 4779 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551043 4779 flags.go:64] FLAG: --client-ca-file=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551052 4779 flags.go:64] FLAG: --cloud-config=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551061 4779 flags.go:64] FLAG: --cloud-provider=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551070 4779 flags.go:64] FLAG: --cluster-dns="[]"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551080 4779 flags.go:64] FLAG: --cluster-domain=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551089 4779 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551098 4779 flags.go:64] FLAG: --config-dir=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551147 4779 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551163 4779 flags.go:64] FLAG: --container-log-max-files="5"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551178 4779 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551188 4779 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551197 4779 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551207 4779 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551245 4779 flags.go:64] FLAG: --contention-profiling="false"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551255 4779 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551263 4779 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551273 4779 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551283 4779 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551295 4779 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551304 4779 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551313 4779 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551322 4779 flags.go:64] FLAG: --enable-load-reader="false"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551331 4779 flags.go:64] FLAG: --enable-server="true"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551339 4779 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551353 4779 flags.go:64] FLAG: --event-burst="100"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551362 4779 flags.go:64] FLAG: --event-qps="50"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551371 4779 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551380 4779 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551389 4779 flags.go:64] FLAG: --eviction-hard=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551400 4779 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551410 4779 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551418 4779 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551428 4779 flags.go:64] FLAG: --eviction-soft=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551438 4779 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551449 4779 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551459 4779 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551468 4779 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551477 4779 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551486 4779 flags.go:64] FLAG: --fail-swap-on="true"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551495 4779 flags.go:64] FLAG: --feature-gates=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551518 4779 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551528 4779 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551537 4779 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551546 4779 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551556 4779 flags.go:64] FLAG: --healthz-port="10248"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551565 4779 flags.go:64] FLAG: --help="false"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551574 4779 flags.go:64] FLAG: --hostname-override=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551583 4779 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551592 4779 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551604 4779 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551614 4779 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551623 4779 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551632 4779 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551640 4779 flags.go:64] FLAG: --image-service-endpoint=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551649 4779 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551659 4779 flags.go:64] FLAG: --kube-api-burst="100"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551669 4779 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551680 4779 flags.go:64] FLAG: --kube-api-qps="50"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551689 4779 flags.go:64] FLAG: --kube-reserved=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551699 4779 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551709 4779 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551718 4779 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551727 4779 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551737 4779 flags.go:64] FLAG: --lock-file=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551745 4779 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551756 4779 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551766 4779 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551779 4779 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551789 4779 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551798 4779 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551807 4779 flags.go:64] FLAG: --logging-format="text"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551816 4779 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551825 4779 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551834 4779 flags.go:64] FLAG: --manifest-url=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551842 4779 flags.go:64] FLAG: --manifest-url-header=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551854 4779 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551863 4779 flags.go:64] FLAG: --max-open-files="1000000"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551874 4779 flags.go:64] FLAG: --max-pods="110"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551884 4779 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551893 4779 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551902 4779 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551912 4779 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551921 4779 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551930 4779 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551939 4779 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551959 4779 flags.go:64] FLAG: --node-status-max-images="50"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551968 4779 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551977 4779 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551986 4779 flags.go:64] FLAG: --pod-cidr=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.551995 4779 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552011 4779 flags.go:64] FLAG: --pod-manifest-path=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552020 4779 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552028 4779 flags.go:64] FLAG: --pods-per-core="0"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552037 4779 flags.go:64] FLAG: --port="10250"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552047 4779 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552056 4779 flags.go:64] FLAG: --provider-id=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552065 4779 flags.go:64] FLAG: --qos-reserved=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552074 4779 flags.go:64] FLAG: --read-only-port="10255"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552084 4779 flags.go:64] FLAG: --register-node="true"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552092 4779 flags.go:64] FLAG: --register-schedulable="true"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552101 4779 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552155 4779 flags.go:64] FLAG: --registry-burst="10"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552164 4779 flags.go:64] FLAG: --registry-qps="5"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552173 4779 flags.go:64] FLAG: --reserved-cpus=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552183 4779 flags.go:64] FLAG: --reserved-memory=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552194 4779 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552203 4779 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552212 4779 flags.go:64] FLAG: --rotate-certificates="false"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552221 4779 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552230 4779 flags.go:64] FLAG: --runonce="false"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552239 4779 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552249 4779 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552258 4779 flags.go:64] FLAG: --seccomp-default="false"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552267 4779 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552276 4779 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552285 4779 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552294 4779 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552304 4779 flags.go:64] FLAG: --storage-driver-password="root"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552313 4779 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552321 4779 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552330 4779 flags.go:64] FLAG: --storage-driver-user="root"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552339 4779 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552349 4779 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552358 4779 flags.go:64] FLAG: --system-cgroups=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552367 4779 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552380 4779 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552389 4779 flags.go:64] FLAG: --tls-cert-file=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552398 4779 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552408 4779 flags.go:64] FLAG: --tls-min-version=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552417 4779 flags.go:64] FLAG: --tls-private-key-file=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552426 4779 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552436 4779 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552445 4779 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552454 4779 flags.go:64] FLAG: --v="2"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552467 4779 flags.go:64] FLAG: --version="false"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552529 4779 flags.go:64] FLAG: --vmodule=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552540 4779 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.552550 4779 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.552750 4779 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.552761 4779 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.552774 4779 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.552785 4779 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.552795 4779 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.552805 4779 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.552814 4779 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.552823 4779 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.552831 4779 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.552840 4779 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.552849 4779 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.552858 4779 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.552868 4779 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.552878 4779 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.552887 4779 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.552895 4779 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.552903 4779 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.552911 4779 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.552920 4779 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.552928 4779 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.552935 4779 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.552944 4779 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.552952 4779 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.552960 4779 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.552968 4779 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.552977 4779 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.552985 4779 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.552993 4779 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553002 4779 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553009 4779 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553017 4779 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553025 4779 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553032 4779 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553040 4779 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553048 4779 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553055 4779 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553063 4779 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553071 4779 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553080 4779 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553088 4779 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553095 4779 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553104 4779 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553135 4779 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553143 4779 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553152 4779 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553159 4779 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553170 4779 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553179 4779 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553189 4779 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553197 4779 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553214 4779 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553222 4779 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553231 4779 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553239 4779 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553248 4779 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553256 4779 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553264 4779 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553272 4779 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553283 4779 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553293 4779 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553302 4779 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553311 4779 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553320 4779 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553329 4779 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553337 4779 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553345 4779 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553353 4779 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553361 4779 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553369 4779 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553377 4779 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.553385 4779 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.554258 4779 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.569308 4779 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.569353 4779 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569481 4779 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569494 4779 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569503 4779 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569513 4779 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569523 4779 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569533 4779 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569541 4779 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569549 4779 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569558 4779 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569566 4779 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569575 4779 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569584 4779 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569593 4779 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569605 4779 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569614 4779 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569623 4779 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569631 4779 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569638 4779 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569646 4779 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569654 4779 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569661 4779 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569669 4779 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320
15:23:03.569677 4779 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569685 4779 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569693 4779 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569700 4779 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569709 4779 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569716 4779 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569724 4779 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569732 4779 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569740 4779 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569748 4779 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569756 4779 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569763 4779 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569773 4779 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569781 4779 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569788 4779 feature_gate.go:330] unrecognized feature 
gate: ClusterAPIInstallIBMCloud Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569796 4779 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569804 4779 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569812 4779 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569820 4779 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569828 4779 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569835 4779 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569843 4779 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569851 4779 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569858 4779 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569866 4779 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569876 4779 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569884 4779 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569892 4779 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569900 4779 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 15:23:03 crc 
kubenswrapper[4779]: W0320 15:23:03.569907 4779 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569915 4779 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569923 4779 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569931 4779 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569939 4779 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569947 4779 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569954 4779 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569962 4779 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569973 4779 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569984 4779 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.569996 4779 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570005 4779 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570016 4779 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570026 4779 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570035 4779 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570070 4779 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570081 4779 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570090 4779 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570099 4779 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570138 4779 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.570156 4779 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570393 4779 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570408 4779 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570417 4779 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570427 4779 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570435 4779 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570445 4779 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570453 4779 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570462 4779 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570472 4779 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570482 4779 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570490 4779 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570497 4779 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570506 4779 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570515 4779 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570524 4779 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570532 4779 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570540 4779 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570547 4779 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570555 4779 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570563 4779 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570571 4779 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570579 4779 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570586 4779 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570594 4779 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570602 4779 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570610 4779 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570617 4779 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570625 4779 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570633 4779 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570641 4779 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570649 4779 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570657 4779 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570665 4779 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570672 4779 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570681 4779 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570689 4779 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570697 4779 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570704 4779 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570712 4779 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570720 4779 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570728 4779 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570736 4779 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570746 4779 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570757 4779 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570766 4779 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570775 4779 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570783 4779 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570793 4779 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570801 4779 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570808 4779 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570816 4779 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570824 4779 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570832 4779 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570842 4779 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570853 4779 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570861 4779 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570871 4779 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570881 4779 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570892 4779 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570900 4779 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570909 4779 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570917 4779 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570927 4779 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570935 4779 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570943 4779 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570953 4779 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570963 4779 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570973 4779 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570980 4779 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570988 4779 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.570997 4779 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.571009 4779 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.571303 4779 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 20 15:23:03 crc kubenswrapper[4779]: E0320 15:23:03.575646 4779 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.580094 4779 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.580267 4779 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.582300 4779 server.go:997] "Starting client certificate rotation"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.582360 4779 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.582576 4779 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.610417 4779 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 20 15:23:03 crc kubenswrapper[4779]: E0320 15:23:03.613862 4779 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.219:6443: connect: connection refused" logger="UnhandledError"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.617830 4779 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.638798 4779 log.go:25] "Validated CRI v1 runtime API"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.680375 4779 log.go:25] "Validated CRI v1 image API"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.682716 4779 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.689279 4779 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-20-15-17-59-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.689408 4779 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.717341 4779 manager.go:217] Machine: {Timestamp:2026-03-20 15:23:03.713980334 +0000 UTC m=+0.676496184 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:7cc3d574-fb22-4342-af05-25b72a2fc8bc BootID:dffc0140-562f-4f14-a68f-8b97216f21d0 Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:88:0c:c3 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:88:0c:c3 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:6d:f3:c2 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:4f:63:be Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:a9:ca:82 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:17:eb:8a Speed:-1 Mtu:1496} {Name:eth10 MacAddress:b6:54:b0:44:cd:3b Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:8a:04:f5:60:29:57 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.717859 4779 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.718277 4779 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.720063 4779 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.720438 4779 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.720503 4779 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.720951 4779 topology_manager.go:138] "Creating topology manager with none policy"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.720971 4779 container_manager_linux.go:303] "Creating device plugin manager"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.721522 4779 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.721582 4779 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.722165 4779 state_mem.go:36] "Initialized new in-memory state store"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.722333 4779 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.725859 4779 kubelet.go:418] "Attempting to sync node with API server"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.725929 4779 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.726202 4779 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.726256 4779 kubelet.go:324] "Adding apiserver pod source"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.726288 4779 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.732287 4779 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.219:6443: connect: connection refused
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.732350 4779 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.219:6443: connect: connection refused
Mar 20 15:23:03 crc kubenswrapper[4779]: E0320 15:23:03.732403 4779 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.219:6443: connect: connection refused" logger="UnhandledError"
Mar 20 15:23:03 crc kubenswrapper[4779]: E0320 15:23:03.732452 4779 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.219:6443: connect: connection refused" logger="UnhandledError"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.734241 4779 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.735601 4779 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.738415 4779 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.740231 4779 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.740276 4779 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.740292 4779 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.740306 4779 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.740332 4779 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.740348 4779 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.740363 4779 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.740386 4779 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.740403 4779 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.740418 4779 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.740439 4779 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.740453 4779 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.741507 4779 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.742282 4779 server.go:1280] "Started kubelet" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.742702 4779 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.742744 4779 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.743755 4779 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 20 15:23:03 crc systemd[1]: Started Kubernetes Kubelet. Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.751317 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.219:6443: connect: connection refused Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.751565 4779 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.751625 4779 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 20 15:23:03 crc kubenswrapper[4779]: E0320 15:23:03.752986 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.753063 4779 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.753154 4779 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.753095 4779 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 20 15:23:03 crc kubenswrapper[4779]: E0320 15:23:03.753570 4779 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.219:6443: connect: connection refused" interval="200ms" Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.753932 4779 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.219:6443: connect: connection refused Mar 20 15:23:03 crc kubenswrapper[4779]: E0320 15:23:03.754064 4779 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.219:6443: connect: connection refused" logger="UnhandledError" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.754256 4779 server.go:460] "Adding debug handlers to kubelet server" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.757737 4779 factory.go:55] Registering systemd factory Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.757848 4779 factory.go:221] Registration of the systemd container factory successfully Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.760507 4779 factory.go:153] Registering CRI-O factory Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.760565 4779 factory.go:221] Registration of the crio container factory successfully Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.760735 4779 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.760784 4779 factory.go:103] Registering Raw factory Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 
15:23:03.760827 4779 manager.go:1196] Started watching for new ooms in manager Mar 20 15:23:03 crc kubenswrapper[4779]: E0320 15:23:03.759952 4779 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.219:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e95f53c544b98 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:03.742237592 +0000 UTC m=+0.704753422,LastTimestamp:2026-03-20 15:23:03.742237592 +0000 UTC m=+0.704753422,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.762293 4779 manager.go:319] Starting recovery of all containers Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.766443 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.766547 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.766571 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.766593 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.766614 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.766634 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.766652 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.766673 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.766697 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.766718 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.766738 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.766756 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.766775 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.766800 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.766820 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" 
seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.766838 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.766862 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.766888 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.766907 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.766932 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.766959 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.766986 4779 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767012 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767032 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767051 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767079 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767104 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767179 4779 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767199 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767221 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767240 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767294 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767321 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767346 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767373 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767396 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767421 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767444 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767483 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767506 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" 
volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767533 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767557 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767580 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767602 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767625 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767645 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767665 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767686 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767707 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767725 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767744 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767763 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" 
volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767792 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767814 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767834 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767854 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767876 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767897 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" 
seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767916 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767934 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767952 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767974 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.767992 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768013 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 
15:23:03.768033 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768052 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768070 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768093 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768142 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768162 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768181 4779 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768201 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768221 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768241 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768262 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768281 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768301 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768322 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768343 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768367 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768394 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768420 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768449 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768474 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768493 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768512 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768533 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768554 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768572 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768593 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768613 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768632 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768650 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768673 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768691 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768709 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768730 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768746 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768763 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768781 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768802 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" 
seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768820 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768842 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768860 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768887 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768907 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768929 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768948 
4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768970 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.768993 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769018 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769039 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769058 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769078 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769101 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769160 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769183 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769201 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769220 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769241 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769258 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769277 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769297 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769319 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769337 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769361 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" 
seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769385 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769411 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769432 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769450 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769469 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769487 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: 
I0320 15:23:03.769507 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769525 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769545 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769563 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769582 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769600 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769620 4779 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769639 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769658 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769681 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769702 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769721 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769743 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769765 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769799 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769823 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769848 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769873 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769894 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769915 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769933 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769953 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769972 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.769990 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770079 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" 
seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770103 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770153 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770172 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770191 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770211 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770229 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770248 4779 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770267 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770287 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770356 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770382 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770408 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770433 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770455 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770476 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770496 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770517 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770537 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770555 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770573 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770591 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770612 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770630 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770650 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770668 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770685 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770704 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770723 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770743 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770761 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770780 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770797 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770817 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770837 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770868 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770888 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770905 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770924 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770942 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770960 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.770980 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.771000 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.771018 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.773668 4779 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.773719 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.773747 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.773794 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.773822 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.773848 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.773872 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.774001 4779 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.774032 4779 reconstruct.go:97] "Volume reconstruction finished"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.774051 4779 reconciler.go:26] "Reconciler: start to sync state"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.792928 4779 manager.go:324] Recovery completed
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.803516 4779 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.807419 4779 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.807508 4779 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.807548 4779 kubelet.go:2335] "Starting kubelet main sync loop"
Mar 20 15:23:03 crc kubenswrapper[4779]: E0320 15:23:03.807630 4779 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.809246 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.809392 4779 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.219:6443: connect: connection refused
Mar 20 15:23:03 crc kubenswrapper[4779]: E0320 15:23:03.809484 4779 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.219:6443: connect: connection refused" logger="UnhandledError"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.813308 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.813337 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.813346 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.814366 4779 cpu_manager.go:225] "Starting CPU manager" policy="none"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.814380 4779 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.814397 4779 state_mem.go:36] "Initialized new in-memory state store"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.839851 4779 policy_none.go:49] "None policy: Start"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.842901 4779 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.842992 4779 state_mem.go:35] "Initializing new in-memory state store"
Mar 20 15:23:03 crc kubenswrapper[4779]: E0320 15:23:03.853991 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.904390 4779 manager.go:334] "Starting Device Plugin manager"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.904450 4779 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.904465 4779 server.go:79] "Starting device plugin registration server"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.905009 4779 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.905034 4779 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.905295 4779 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.905383 4779 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.905398 4779 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.907716 4779 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"]
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.907827 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.908891 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.908938 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.908949 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.909135 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.910257 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.910287 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.910296 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.914229 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.914245 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.914302 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.914330 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.914357 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.915583 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.915619 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.915632 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.915774 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.915798 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.915812 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.915826 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.915875 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.915901 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.915912 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.915916 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.915946 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:23:03 crc kubenswrapper[4779]: E0320 15:23:03.916645 4779 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.916708 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.916731 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.916740 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.916841 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.916868 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.916902 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.917095 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.917201 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.917238 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.917986 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.918010 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.918019 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.918281 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.918349 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.918368 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.918654 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.918731 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.919704 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.919725 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.919733 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:23:03 crc kubenswrapper[4779]: E0320 15:23:03.954244 4779 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.219:6443: connect: connection refused" interval="400ms"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.977646 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.977698 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.977721 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.977743 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.977776 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.977799 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.977818 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.977881 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.978320 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.978378 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.978421 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.978475 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.978505 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.978551 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 15:23:03 crc kubenswrapper[4779]: I0320 15:23:03.978571 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 15:23:03 crc kubenswrapper[4779]: W0320 15:23:03.984093 4779 helpers.go:245] readString: Failed to read "/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/cpuset.cpus.effective": read /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/cpuset.cpus.effective: no such device
Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.006175 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.007149 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.007183 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.007193 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.007215 4779 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 15:23:04 crc kubenswrapper[4779]: E0320 15:23:04.007758 4779 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.219:6443: connect: connection refused" node="crc"
Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.079998 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.080051 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.080082 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.080102 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.080143 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.080164 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.080182 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.080198 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.080220 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.080195 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.080265 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.080214 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.080326 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.080341 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.080377 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.080241
4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.080401 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.080304 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.080311 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.080354 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.080463 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.080511 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.080559 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.080567 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.080554 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.080600 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 
15:23:04.080605 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.080623 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.080712 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.080816 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.208548 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.209614 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.209654 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.209685 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.209708 4779 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 15:23:04 crc kubenswrapper[4779]: E0320 15:23:04.210172 4779 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.219:6443: connect: connection refused" node="crc" Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.281158 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.295492 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.302793 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.318802 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.321904 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 15:23:04 crc kubenswrapper[4779]: W0320 15:23:04.334158 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-d5823b0e58237c0879a39184fb6a6ddfac9117f2ee8296011e26ebabf104206d WatchSource:0}: Error finding container d5823b0e58237c0879a39184fb6a6ddfac9117f2ee8296011e26ebabf104206d: Status 404 returned error can't find the container with id d5823b0e58237c0879a39184fb6a6ddfac9117f2ee8296011e26ebabf104206d Mar 20 15:23:04 crc kubenswrapper[4779]: W0320 15:23:04.335559 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-1bba61da3af66af332eca5265af04cc1b54d6d138a98c9f65566db0d4034031a WatchSource:0}: Error finding container 1bba61da3af66af332eca5265af04cc1b54d6d138a98c9f65566db0d4034031a: Status 404 returned error can't find the container with id 1bba61da3af66af332eca5265af04cc1b54d6d138a98c9f65566db0d4034031a Mar 20 15:23:04 crc kubenswrapper[4779]: W0320 15:23:04.338293 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-49c98f18a2bd64ba061b64f24e71015584c145c7747e929ce3e9f8c3eb10e4dc WatchSource:0}: Error finding container 49c98f18a2bd64ba061b64f24e71015584c145c7747e929ce3e9f8c3eb10e4dc: Status 404 returned error can't find the container with id 49c98f18a2bd64ba061b64f24e71015584c145c7747e929ce3e9f8c3eb10e4dc Mar 20 15:23:04 crc kubenswrapper[4779]: W0320 15:23:04.340224 4779 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-eafa4709c78fe3f32cbb7d83091c9135c832297efb8f8fcce19ef9b8f93bf0b0 WatchSource:0}: Error finding container eafa4709c78fe3f32cbb7d83091c9135c832297efb8f8fcce19ef9b8f93bf0b0: Status 404 returned error can't find the container with id eafa4709c78fe3f32cbb7d83091c9135c832297efb8f8fcce19ef9b8f93bf0b0 Mar 20 15:23:04 crc kubenswrapper[4779]: W0320 15:23:04.342136 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-92e9db1f652ed9cf2c611cd7e9e39c49cbbb2afc7b791b74c7582485dd149890 WatchSource:0}: Error finding container 92e9db1f652ed9cf2c611cd7e9e39c49cbbb2afc7b791b74c7582485dd149890: Status 404 returned error can't find the container with id 92e9db1f652ed9cf2c611cd7e9e39c49cbbb2afc7b791b74c7582485dd149890 Mar 20 15:23:04 crc kubenswrapper[4779]: E0320 15:23:04.355781 4779 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.219:6443: connect: connection refused" interval="800ms" Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.610594 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.613065 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.613134 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.613150 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:04 crc kubenswrapper[4779]: 
I0320 15:23:04.613181 4779 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 15:23:04 crc kubenswrapper[4779]: E0320 15:23:04.613756 4779 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.219:6443: connect: connection refused" node="crc" Mar 20 15:23:04 crc kubenswrapper[4779]: W0320 15:23:04.622892 4779 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.219:6443: connect: connection refused Mar 20 15:23:04 crc kubenswrapper[4779]: E0320 15:23:04.622993 4779 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.219:6443: connect: connection refused" logger="UnhandledError" Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.752293 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.219:6443: connect: connection refused Mar 20 15:23:04 crc kubenswrapper[4779]: W0320 15:23:04.770021 4779 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.219:6443: connect: connection refused Mar 20 15:23:04 crc kubenswrapper[4779]: E0320 15:23:04.770102 4779 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.219:6443: connect: connection refused" logger="UnhandledError" Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.816601 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"1bba61da3af66af332eca5265af04cc1b54d6d138a98c9f65566db0d4034031a"} Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.817799 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"92e9db1f652ed9cf2c611cd7e9e39c49cbbb2afc7b791b74c7582485dd149890"} Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.819089 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"eafa4709c78fe3f32cbb7d83091c9135c832297efb8f8fcce19ef9b8f93bf0b0"} Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.820065 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"49c98f18a2bd64ba061b64f24e71015584c145c7747e929ce3e9f8c3eb10e4dc"} Mar 20 15:23:04 crc kubenswrapper[4779]: I0320 15:23:04.821215 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d5823b0e58237c0879a39184fb6a6ddfac9117f2ee8296011e26ebabf104206d"} Mar 20 15:23:04 crc kubenswrapper[4779]: W0320 15:23:04.909166 4779 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.219:6443: connect: connection refused Mar 20 15:23:04 crc kubenswrapper[4779]: E0320 15:23:04.909269 4779 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.219:6443: connect: connection refused" logger="UnhandledError" Mar 20 15:23:05 crc kubenswrapper[4779]: W0320 15:23:05.129583 4779 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.219:6443: connect: connection refused Mar 20 15:23:05 crc kubenswrapper[4779]: E0320 15:23:05.129672 4779 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.219:6443: connect: connection refused" logger="UnhandledError" Mar 20 15:23:05 crc kubenswrapper[4779]: E0320 15:23:05.157274 4779 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.219:6443: connect: connection refused" interval="1.6s" Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.414812 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.421153 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:05 
crc kubenswrapper[4779]: I0320 15:23:05.421202 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.421214 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.421243 4779 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 15:23:05 crc kubenswrapper[4779]: E0320 15:23:05.421967 4779 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.219:6443: connect: connection refused" node="crc" Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.631577 4779 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 15:23:05 crc kubenswrapper[4779]: E0320 15:23:05.632645 4779 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.219:6443: connect: connection refused" logger="UnhandledError" Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.752027 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.219:6443: connect: connection refused Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.825610 4779 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059" exitCode=0 Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.825791 4779 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.826228 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059"} Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.826610 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.826656 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.826668 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.827762 4779 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e7766091cc175f7d15265aa50cf436f2be24e8e6f06e3fe602c5ab467c99bb62" exitCode=0 Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.827798 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e7766091cc175f7d15265aa50cf436f2be24e8e6f06e3fe602c5ab467c99bb62"} Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.827864 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.828487 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.828517 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:05 
crc kubenswrapper[4779]: I0320 15:23:05.828529 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.828572 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.829102 4779 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="3aa138829db36ca4c889841913dbdb4440c6efcdc78edf45ff1379bbbb9c34ef" exitCode=0 Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.829185 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.829187 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"3aa138829db36ca4c889841913dbdb4440c6efcdc78edf45ff1379bbbb9c34ef"} Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.830008 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.830021 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.830043 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.830043 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.830054 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.830063 4779 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.831903 4779 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="3d8aee02e2c57ee39da68fb337e36cca5deee8d9c73dcd2334a7bc91a81af4b4" exitCode=0 Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.831952 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"3d8aee02e2c57ee39da68fb337e36cca5deee8d9c73dcd2334a7bc91a81af4b4"} Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.832024 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.832881 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.832917 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.832934 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.834813 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7940cf334fe66cc687c69dfff6e9f646c2f0a58c2d6a2e0ca68f19f7c85acfbf"} Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.834852 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f15a7c62edad15bdc2846e3876a24078dc7d1560104e7d6c65204c04a4b2cb09"} Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.834863 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f8af565427c7f7c9ba5b8239eaa0c17a3ef8dd5ff718c932a40556ab6b7fafbb"} Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.834871 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d1693a9d291973037577c1ddbdce083c7aa03b895f7260b45231b1563e1a97d6"} Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.834930 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.842998 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.843029 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:05 crc kubenswrapper[4779]: I0320 15:23:05.843039 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:06 crc kubenswrapper[4779]: I0320 15:23:06.752451 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.219:6443: connect: connection refused Mar 20 15:23:06 crc kubenswrapper[4779]: E0320 15:23:06.758362 4779 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.219:6443: connect: connection refused" interval="3.2s" Mar 20 15:23:06 crc kubenswrapper[4779]: I0320 15:23:06.840203 4779 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="2ffc474825ae3e6a18f565b6ff9bd95366a30163b869ecf58ea487a94f751ae4" exitCode=0 Mar 20 15:23:06 crc kubenswrapper[4779]: I0320 15:23:06.840274 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"2ffc474825ae3e6a18f565b6ff9bd95366a30163b869ecf58ea487a94f751ae4"} Mar 20 15:23:06 crc kubenswrapper[4779]: I0320 15:23:06.840321 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:06 crc kubenswrapper[4779]: I0320 15:23:06.841723 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:06 crc kubenswrapper[4779]: I0320 15:23:06.841747 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:06 crc kubenswrapper[4779]: I0320 15:23:06.841756 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:06 crc kubenswrapper[4779]: I0320 15:23:06.844925 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"709e2b897f3f6b02b9677d2324ca04d81ecb5bda684f05f2b3c56c3a0213456f"} Mar 20 15:23:06 crc kubenswrapper[4779]: I0320 15:23:06.844953 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"df476d42816b44acd553ca6e22a6791a58d43799e6ff1299bd22b5ac37a45372"} Mar 20 15:23:06 crc kubenswrapper[4779]: I0320 15:23:06.844963 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f952002920726e5f66027fd3a51286ade7cc455a744898b25dd876ddee262838"} Mar 20 15:23:06 crc kubenswrapper[4779]: I0320 15:23:06.845034 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:06 crc kubenswrapper[4779]: I0320 15:23:06.845764 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:06 crc kubenswrapper[4779]: I0320 15:23:06.845786 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:06 crc kubenswrapper[4779]: I0320 15:23:06.845794 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:06 crc kubenswrapper[4779]: I0320 15:23:06.846983 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"ca8bbdb2a7d1e8a987297e5b48e3a6abc5b3b7439b83a7dc6fd746f9c435adac"} Mar 20 15:23:06 crc kubenswrapper[4779]: I0320 15:23:06.847046 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:06 crc kubenswrapper[4779]: I0320 15:23:06.847590 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:06 crc kubenswrapper[4779]: I0320 15:23:06.847612 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 
20 15:23:06 crc kubenswrapper[4779]: I0320 15:23:06.847620 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:06 crc kubenswrapper[4779]: I0320 15:23:06.850213 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:06 crc kubenswrapper[4779]: I0320 15:23:06.850556 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:06 crc kubenswrapper[4779]: I0320 15:23:06.850803 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1b138927e44c26de103d66c52320bf862d2550b83bc2faedbd42f2c963222027"} Mar 20 15:23:06 crc kubenswrapper[4779]: I0320 15:23:06.850827 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577"} Mar 20 15:23:06 crc kubenswrapper[4779]: I0320 15:23:06.850837 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8"} Mar 20 15:23:06 crc kubenswrapper[4779]: I0320 15:23:06.850848 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890"} Mar 20 15:23:06 crc kubenswrapper[4779]: I0320 15:23:06.850860 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540"} Mar 20 15:23:06 crc kubenswrapper[4779]: I0320 15:23:06.851127 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:06 crc kubenswrapper[4779]: I0320 15:23:06.851148 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:06 crc kubenswrapper[4779]: I0320 15:23:06.851157 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:06 crc kubenswrapper[4779]: I0320 15:23:06.851420 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:06 crc kubenswrapper[4779]: I0320 15:23:06.851469 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:06 crc kubenswrapper[4779]: I0320 15:23:06.851484 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:07 crc kubenswrapper[4779]: I0320 15:23:07.022946 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:07 crc kubenswrapper[4779]: I0320 15:23:07.024266 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:07 crc kubenswrapper[4779]: I0320 15:23:07.024321 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:07 crc kubenswrapper[4779]: I0320 15:23:07.024332 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:07 crc kubenswrapper[4779]: I0320 15:23:07.024370 4779 kubelet_node_status.go:76] "Attempting to register node" node="crc" 
Mar 20 15:23:07 crc kubenswrapper[4779]: E0320 15:23:07.024905 4779 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.219:6443: connect: connection refused" node="crc" Mar 20 15:23:07 crc kubenswrapper[4779]: I0320 15:23:07.858661 4779 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="5a3252598f32260f11c73fe5379659b6c73447da41aa263c09e6098149e704ad" exitCode=0 Mar 20 15:23:07 crc kubenswrapper[4779]: I0320 15:23:07.858743 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"5a3252598f32260f11c73fe5379659b6c73447da41aa263c09e6098149e704ad"} Mar 20 15:23:07 crc kubenswrapper[4779]: I0320 15:23:07.858811 4779 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 15:23:07 crc kubenswrapper[4779]: I0320 15:23:07.858823 4779 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 15:23:07 crc kubenswrapper[4779]: I0320 15:23:07.858853 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:07 crc kubenswrapper[4779]: I0320 15:23:07.858872 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:07 crc kubenswrapper[4779]: I0320 15:23:07.858884 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:07 crc kubenswrapper[4779]: I0320 15:23:07.858930 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:07 crc kubenswrapper[4779]: I0320 15:23:07.861378 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:07 crc kubenswrapper[4779]: I0320 15:23:07.861437 4779 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:07 crc kubenswrapper[4779]: I0320 15:23:07.861462 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:07 crc kubenswrapper[4779]: I0320 15:23:07.861371 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:07 crc kubenswrapper[4779]: I0320 15:23:07.861503 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:07 crc kubenswrapper[4779]: I0320 15:23:07.861515 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:07 crc kubenswrapper[4779]: I0320 15:23:07.861556 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:07 crc kubenswrapper[4779]: I0320 15:23:07.861525 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:07 crc kubenswrapper[4779]: I0320 15:23:07.861573 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:07 crc kubenswrapper[4779]: I0320 15:23:07.861464 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:07 crc kubenswrapper[4779]: I0320 15:23:07.861744 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:07 crc kubenswrapper[4779]: I0320 15:23:07.861761 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:08 crc kubenswrapper[4779]: I0320 15:23:08.322442 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:23:08 crc kubenswrapper[4779]: I0320 15:23:08.435421 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 15:23:08 crc kubenswrapper[4779]: I0320 15:23:08.435749 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:08 crc kubenswrapper[4779]: I0320 15:23:08.437181 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:08 crc kubenswrapper[4779]: I0320 15:23:08.437256 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:08 crc kubenswrapper[4779]: I0320 15:23:08.437277 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:08 crc kubenswrapper[4779]: I0320 15:23:08.867300 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"24af932f21b1dbc1a2d96edd5cb5256eb0b4ddb672a001581f519bbb71b3f528"} Mar 20 15:23:08 crc kubenswrapper[4779]: I0320 15:23:08.867342 4779 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 15:23:08 crc kubenswrapper[4779]: I0320 15:23:08.867359 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f2a7bca00cbf9f2a1abbbcb015536c61a7fe6f26b37d289e182df0b2dd58d72a"} Mar 20 15:23:08 crc kubenswrapper[4779]: I0320 15:23:08.867377 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"cc1e37affb67c7652f578e79f104773967044884346ea2159f843bc27b53a324"} Mar 20 15:23:08 crc 
kubenswrapper[4779]: I0320 15:23:08.867386 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:08 crc kubenswrapper[4779]: I0320 15:23:08.867390 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"cee9526f89756d5aadafe6ce6c208353f3a24aa7126fd4e94c7154b5dee0b14f"} Mar 20 15:23:08 crc kubenswrapper[4779]: I0320 15:23:08.868144 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:08 crc kubenswrapper[4779]: I0320 15:23:08.868196 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:08 crc kubenswrapper[4779]: I0320 15:23:08.868205 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:09 crc kubenswrapper[4779]: I0320 15:23:09.846267 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:23:09 crc kubenswrapper[4779]: I0320 15:23:09.875006 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9392d565f3b41698f93b7dab52770b90590d7a010c0f29a89c91bcdc16131803"} Mar 20 15:23:09 crc kubenswrapper[4779]: I0320 15:23:09.875036 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:09 crc kubenswrapper[4779]: I0320 15:23:09.875073 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:09 crc kubenswrapper[4779]: I0320 15:23:09.876260 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:09 crc kubenswrapper[4779]: I0320 15:23:09.876302 
4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:09 crc kubenswrapper[4779]: I0320 15:23:09.876312 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:09 crc kubenswrapper[4779]: I0320 15:23:09.876337 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:09 crc kubenswrapper[4779]: I0320 15:23:09.876348 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:09 crc kubenswrapper[4779]: I0320 15:23:09.876314 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:09 crc kubenswrapper[4779]: I0320 15:23:09.893949 4779 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 15:23:10 crc kubenswrapper[4779]: I0320 15:23:10.225856 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:10 crc kubenswrapper[4779]: I0320 15:23:10.227626 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:10 crc kubenswrapper[4779]: I0320 15:23:10.227724 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:10 crc kubenswrapper[4779]: I0320 15:23:10.227744 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:10 crc kubenswrapper[4779]: I0320 15:23:10.227777 4779 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 15:23:10 crc kubenswrapper[4779]: I0320 15:23:10.877321 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:10 crc kubenswrapper[4779]: I0320 
15:23:10.878274 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:10 crc kubenswrapper[4779]: I0320 15:23:10.878306 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:10 crc kubenswrapper[4779]: I0320 15:23:10.878317 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:10 crc kubenswrapper[4779]: I0320 15:23:10.907982 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 15:23:10 crc kubenswrapper[4779]: I0320 15:23:10.908223 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:10 crc kubenswrapper[4779]: I0320 15:23:10.909290 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:10 crc kubenswrapper[4779]: I0320 15:23:10.909329 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:10 crc kubenswrapper[4779]: I0320 15:23:10.909338 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:11 crc kubenswrapper[4779]: I0320 15:23:11.154297 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:23:11 crc kubenswrapper[4779]: I0320 15:23:11.154525 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:11 crc kubenswrapper[4779]: I0320 15:23:11.155730 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:11 crc kubenswrapper[4779]: I0320 15:23:11.155765 4779 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:11 crc kubenswrapper[4779]: I0320 15:23:11.155777 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:11 crc kubenswrapper[4779]: I0320 15:23:11.607633 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 20 15:23:11 crc kubenswrapper[4779]: I0320 15:23:11.881525 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:11 crc kubenswrapper[4779]: I0320 15:23:11.882775 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:11 crc kubenswrapper[4779]: I0320 15:23:11.882808 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:11 crc kubenswrapper[4779]: I0320 15:23:11.882818 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:12 crc kubenswrapper[4779]: I0320 15:23:12.998486 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 15:23:12 crc kubenswrapper[4779]: I0320 15:23:12.998661 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:12 crc kubenswrapper[4779]: I0320 15:23:12.999668 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:12 crc kubenswrapper[4779]: I0320 15:23:12.999706 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:12 crc kubenswrapper[4779]: I0320 15:23:12.999716 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:13 crc kubenswrapper[4779]: E0320 15:23:13.917258 
4779 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 15:23:13 crc kubenswrapper[4779]: I0320 15:23:13.979215 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 15:23:13 crc kubenswrapper[4779]: I0320 15:23:13.979483 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:13 crc kubenswrapper[4779]: I0320 15:23:13.981289 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:13 crc kubenswrapper[4779]: I0320 15:23:13.981329 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:13 crc kubenswrapper[4779]: I0320 15:23:13.981341 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:13 crc kubenswrapper[4779]: I0320 15:23:13.985267 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 15:23:14 crc kubenswrapper[4779]: I0320 15:23:14.889652 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:14 crc kubenswrapper[4779]: I0320 15:23:14.890631 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:14 crc kubenswrapper[4779]: I0320 15:23:14.890678 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:14 crc kubenswrapper[4779]: I0320 15:23:14.890692 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:14 crc kubenswrapper[4779]: I0320 15:23:14.898196 4779 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 15:23:15 crc kubenswrapper[4779]: I0320 15:23:15.473693 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 15:23:15 crc kubenswrapper[4779]: I0320 15:23:15.892613 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:15 crc kubenswrapper[4779]: I0320 15:23:15.893573 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:15 crc kubenswrapper[4779]: I0320 15:23:15.893601 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:15 crc kubenswrapper[4779]: I0320 15:23:15.893613 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:16 crc kubenswrapper[4779]: I0320 15:23:16.895642 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:16 crc kubenswrapper[4779]: I0320 15:23:16.896816 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:16 crc kubenswrapper[4779]: I0320 15:23:16.896858 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:16 crc kubenswrapper[4779]: I0320 15:23:16.896870 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:17 crc kubenswrapper[4779]: W0320 15:23:17.315087 4779 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 20 
15:23:17 crc kubenswrapper[4779]: I0320 15:23:17.315203 4779 trace.go:236] Trace[260879259]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Mar-2026 15:23:07.313) (total time: 10001ms): Mar 20 15:23:17 crc kubenswrapper[4779]: Trace[260879259]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (15:23:17.315) Mar 20 15:23:17 crc kubenswrapper[4779]: Trace[260879259]: [10.00173874s] [10.00173874s] END Mar 20 15:23:17 crc kubenswrapper[4779]: E0320 15:23:17.315225 4779 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 20 15:23:17 crc kubenswrapper[4779]: W0320 15:23:17.390295 4779 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 20 15:23:17 crc kubenswrapper[4779]: I0320 15:23:17.390429 4779 trace.go:236] Trace[2026829613]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Mar-2026 15:23:07.388) (total time: 10002ms): Mar 20 15:23:17 crc kubenswrapper[4779]: Trace[2026829613]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (15:23:17.390) Mar 20 15:23:17 crc kubenswrapper[4779]: Trace[2026829613]: [10.00214636s] [10.00214636s] END Mar 20 15:23:17 crc kubenswrapper[4779]: E0320 15:23:17.390452 4779 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list 
*v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 20 15:23:17 crc kubenswrapper[4779]: I0320 15:23:17.753407 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 20 15:23:17 crc kubenswrapper[4779]: W0320 15:23:17.804016 4779 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 20 15:23:17 crc kubenswrapper[4779]: I0320 15:23:17.804168 4779 trace.go:236] Trace[1864069568]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Mar-2026 15:23:07.802) (total time: 10001ms): Mar 20 15:23:17 crc kubenswrapper[4779]: Trace[1864069568]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (15:23:17.804) Mar 20 15:23:17 crc kubenswrapper[4779]: Trace[1864069568]: [10.001666094s] [10.001666094s] END Mar 20 15:23:17 crc kubenswrapper[4779]: E0320 15:23:17.804192 4779 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 20 15:23:17 crc kubenswrapper[4779]: I0320 15:23:17.898799 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 
15:23:17 crc kubenswrapper[4779]: I0320 15:23:17.900316 4779 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1b138927e44c26de103d66c52320bf862d2550b83bc2faedbd42f2c963222027" exitCode=255 Mar 20 15:23:17 crc kubenswrapper[4779]: I0320 15:23:17.900363 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"1b138927e44c26de103d66c52320bf862d2550b83bc2faedbd42f2c963222027"} Mar 20 15:23:17 crc kubenswrapper[4779]: I0320 15:23:17.900512 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:17 crc kubenswrapper[4779]: I0320 15:23:17.901358 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:17 crc kubenswrapper[4779]: I0320 15:23:17.901413 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:17 crc kubenswrapper[4779]: I0320 15:23:17.901430 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:17 crc kubenswrapper[4779]: I0320 15:23:17.902395 4779 scope.go:117] "RemoveContainer" containerID="1b138927e44c26de103d66c52320bf862d2550b83bc2faedbd42f2c963222027" Mar 20 15:23:17 crc kubenswrapper[4779]: W0320 15:23:17.903202 4779 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 20 15:23:17 crc kubenswrapper[4779]: I0320 15:23:17.903301 4779 trace.go:236] Trace[1105779281]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Mar-2026 15:23:07.901) (total time: 10002ms): Mar 20 15:23:17 crc kubenswrapper[4779]: Trace[1105779281]: 
---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (15:23:17.903) Mar 20 15:23:17 crc kubenswrapper[4779]: Trace[1105779281]: [10.002068766s] [10.002068766s] END Mar 20 15:23:17 crc kubenswrapper[4779]: E0320 15:23:17.903324 4779 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 20 15:23:18 crc kubenswrapper[4779]: I0320 15:23:18.099750 4779 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 15:23:18 crc kubenswrapper[4779]: I0320 15:23:18.099819 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 20 15:23:18 crc kubenswrapper[4779]: E0320 15:23:18.111380 4779 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:18Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e95f53c544b98 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:03.742237592 +0000 UTC m=+0.704753422,LastTimestamp:2026-03-20 15:23:03.742237592 +0000 UTC m=+0.704753422,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:18 crc kubenswrapper[4779]: I0320 15:23:18.112261 4779 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 15:23:18 crc kubenswrapper[4779]: I0320 15:23:18.112318 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 20 15:23:18 crc kubenswrapper[4779]: E0320 15:23:18.113519 4779 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:18Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 15:23:18 crc kubenswrapper[4779]: E0320 15:23:18.115587 4779 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:18Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 20 15:23:18 crc kubenswrapper[4779]: E0320 15:23:18.118100 4779 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:18Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 15:23:18 crc kubenswrapper[4779]: I0320 15:23:18.139460 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 20 15:23:18 crc kubenswrapper[4779]: I0320 15:23:18.139703 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:18 crc kubenswrapper[4779]: I0320 15:23:18.140868 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:18 crc kubenswrapper[4779]: I0320 15:23:18.140891 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:18 crc kubenswrapper[4779]: I0320 15:23:18.140901 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:18 crc kubenswrapper[4779]: I0320 15:23:18.151347 4779 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:23:18 crc kubenswrapper[4779]: I0320 15:23:18.180635 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 20 15:23:18 crc kubenswrapper[4779]: I0320 15:23:18.328492 4779 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver 
namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 20 15:23:18 crc kubenswrapper[4779]: [+]log ok Mar 20 15:23:18 crc kubenswrapper[4779]: [+]etcd ok Mar 20 15:23:18 crc kubenswrapper[4779]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 20 15:23:18 crc kubenswrapper[4779]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 20 15:23:18 crc kubenswrapper[4779]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 20 15:23:18 crc kubenswrapper[4779]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 20 15:23:18 crc kubenswrapper[4779]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 20 15:23:18 crc kubenswrapper[4779]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 20 15:23:18 crc kubenswrapper[4779]: [+]poststarthook/generic-apiserver-start-informers ok Mar 20 15:23:18 crc kubenswrapper[4779]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 20 15:23:18 crc kubenswrapper[4779]: [+]poststarthook/priority-and-fairness-filter ok Mar 20 15:23:18 crc kubenswrapper[4779]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 20 15:23:18 crc kubenswrapper[4779]: [+]poststarthook/start-apiextensions-informers ok Mar 20 15:23:18 crc kubenswrapper[4779]: [+]poststarthook/start-apiextensions-controllers ok Mar 20 15:23:18 crc kubenswrapper[4779]: [+]poststarthook/crd-informer-synced ok Mar 20 15:23:18 crc kubenswrapper[4779]: [+]poststarthook/start-system-namespaces-controller ok Mar 20 15:23:18 crc kubenswrapper[4779]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 20 15:23:18 crc kubenswrapper[4779]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 20 15:23:18 crc kubenswrapper[4779]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 20 15:23:18 crc kubenswrapper[4779]: 
[+]poststarthook/start-legacy-token-tracking-controller ok Mar 20 15:23:18 crc kubenswrapper[4779]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 20 15:23:18 crc kubenswrapper[4779]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 20 15:23:18 crc kubenswrapper[4779]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Mar 20 15:23:18 crc kubenswrapper[4779]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 20 15:23:18 crc kubenswrapper[4779]: [+]poststarthook/bootstrap-controller ok Mar 20 15:23:18 crc kubenswrapper[4779]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 20 15:23:18 crc kubenswrapper[4779]: [+]poststarthook/start-kube-aggregator-informers ok Mar 20 15:23:18 crc kubenswrapper[4779]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 20 15:23:18 crc kubenswrapper[4779]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 20 15:23:18 crc kubenswrapper[4779]: [+]poststarthook/apiservice-registration-controller ok Mar 20 15:23:18 crc kubenswrapper[4779]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 20 15:23:18 crc kubenswrapper[4779]: [+]poststarthook/apiservice-discovery-controller ok Mar 20 15:23:18 crc kubenswrapper[4779]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 20 15:23:18 crc kubenswrapper[4779]: [+]autoregister-completion ok Mar 20 15:23:18 crc kubenswrapper[4779]: [+]poststarthook/apiservice-openapi-controller ok Mar 20 15:23:18 crc kubenswrapper[4779]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 20 15:23:18 crc kubenswrapper[4779]: livez check failed Mar 20 15:23:18 crc kubenswrapper[4779]: I0320 15:23:18.328560 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 15:23:18 crc 
kubenswrapper[4779]: I0320 15:23:18.473708 4779 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 15:23:18 crc kubenswrapper[4779]: I0320 15:23:18.473787 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 15:23:18 crc kubenswrapper[4779]: I0320 15:23:18.755164 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:18Z is after 2026-02-23T05:33:13Z Mar 20 15:23:18 crc kubenswrapper[4779]: I0320 15:23:18.904077 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 15:23:18 crc kubenswrapper[4779]: I0320 15:23:18.905567 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7e7e831cc817d4cb7ba73b640cea5237b74fd22093c9d891ca4df67ccb893ef8"} Mar 20 15:23:18 crc kubenswrapper[4779]: I0320 15:23:18.905611 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:18 crc 
kubenswrapper[4779]: I0320 15:23:18.905630 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:18 crc kubenswrapper[4779]: I0320 15:23:18.906693 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:18 crc kubenswrapper[4779]: I0320 15:23:18.906739 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:18 crc kubenswrapper[4779]: I0320 15:23:18.906756 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:18 crc kubenswrapper[4779]: I0320 15:23:18.906908 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:18 crc kubenswrapper[4779]: I0320 15:23:18.906942 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:18 crc kubenswrapper[4779]: I0320 15:23:18.906957 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:18 crc kubenswrapper[4779]: I0320 15:23:18.924767 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 20 15:23:19 crc kubenswrapper[4779]: I0320 15:23:19.755014 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:19Z is after 2026-02-23T05:33:13Z Mar 20 15:23:19 crc kubenswrapper[4779]: I0320 15:23:19.846529 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:23:19 crc kubenswrapper[4779]: I0320 15:23:19.910100 4779 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 15:23:19 crc kubenswrapper[4779]: I0320 15:23:19.910778 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 15:23:19 crc kubenswrapper[4779]: I0320 15:23:19.912410 4779 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7e7e831cc817d4cb7ba73b640cea5237b74fd22093c9d891ca4df67ccb893ef8" exitCode=255 Mar 20 15:23:19 crc kubenswrapper[4779]: I0320 15:23:19.912488 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7e7e831cc817d4cb7ba73b640cea5237b74fd22093c9d891ca4df67ccb893ef8"} Mar 20 15:23:19 crc kubenswrapper[4779]: I0320 15:23:19.912546 4779 scope.go:117] "RemoveContainer" containerID="1b138927e44c26de103d66c52320bf862d2550b83bc2faedbd42f2c963222027" Mar 20 15:23:19 crc kubenswrapper[4779]: I0320 15:23:19.912649 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:19 crc kubenswrapper[4779]: I0320 15:23:19.912678 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:19 crc kubenswrapper[4779]: I0320 15:23:19.913958 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:19 crc kubenswrapper[4779]: I0320 15:23:19.914016 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:19 crc kubenswrapper[4779]: I0320 15:23:19.914041 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 20 15:23:19 crc kubenswrapper[4779]: I0320 15:23:19.914973 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:19 crc kubenswrapper[4779]: I0320 15:23:19.915002 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:19 crc kubenswrapper[4779]: I0320 15:23:19.915029 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:19 crc kubenswrapper[4779]: I0320 15:23:19.915175 4779 scope.go:117] "RemoveContainer" containerID="7e7e831cc817d4cb7ba73b640cea5237b74fd22093c9d891ca4df67ccb893ef8" Mar 20 15:23:19 crc kubenswrapper[4779]: E0320 15:23:19.915575 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 15:23:20 crc kubenswrapper[4779]: I0320 15:23:20.754287 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:20Z is after 2026-02-23T05:33:13Z Mar 20 15:23:20 crc kubenswrapper[4779]: I0320 15:23:20.917174 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 15:23:20 crc kubenswrapper[4779]: I0320 15:23:20.919592 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 
20 15:23:20 crc kubenswrapper[4779]: I0320 15:23:20.920807 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:20 crc kubenswrapper[4779]: I0320 15:23:20.920878 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:20 crc kubenswrapper[4779]: I0320 15:23:20.920903 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:20 crc kubenswrapper[4779]: I0320 15:23:20.921868 4779 scope.go:117] "RemoveContainer" containerID="7e7e831cc817d4cb7ba73b640cea5237b74fd22093c9d891ca4df67ccb893ef8" Mar 20 15:23:20 crc kubenswrapper[4779]: E0320 15:23:20.922210 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 15:23:21 crc kubenswrapper[4779]: W0320 15:23:21.297423 4779 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:21Z is after 2026-02-23T05:33:13Z Mar 20 15:23:21 crc kubenswrapper[4779]: E0320 15:23:21.297515 4779 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-20T15:23:21Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 15:23:21 crc kubenswrapper[4779]: W0320 15:23:21.327397 4779 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:21Z is after 2026-02-23T05:33:13Z Mar 20 15:23:21 crc kubenswrapper[4779]: E0320 15:23:21.327478 4779 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:21Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 15:23:21 crc kubenswrapper[4779]: W0320 15:23:21.360537 4779 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:21Z is after 2026-02-23T05:33:13Z Mar 20 15:23:21 crc kubenswrapper[4779]: E0320 15:23:21.360614 4779 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:21Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 15:23:21 crc kubenswrapper[4779]: I0320 15:23:21.754983 4779 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:21Z is after 2026-02-23T05:33:13Z Mar 20 15:23:22 crc kubenswrapper[4779]: I0320 15:23:22.755379 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:22Z is after 2026-02-23T05:33:13Z Mar 20 15:23:23 crc kubenswrapper[4779]: I0320 15:23:23.332173 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:23:23 crc kubenswrapper[4779]: I0320 15:23:23.332325 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:23 crc kubenswrapper[4779]: I0320 15:23:23.333559 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:23 crc kubenswrapper[4779]: I0320 15:23:23.333604 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:23 crc kubenswrapper[4779]: I0320 15:23:23.333612 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:23 crc kubenswrapper[4779]: I0320 15:23:23.334188 4779 scope.go:117] "RemoveContainer" containerID="7e7e831cc817d4cb7ba73b640cea5237b74fd22093c9d891ca4df67ccb893ef8" Mar 20 15:23:23 crc kubenswrapper[4779]: E0320 15:23:23.334349 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 
10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 15:23:23 crc kubenswrapper[4779]: I0320 15:23:23.339382 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:23:23 crc kubenswrapper[4779]: I0320 15:23:23.756456 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:23Z is after 2026-02-23T05:33:13Z Mar 20 15:23:23 crc kubenswrapper[4779]: W0320 15:23:23.864827 4779 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:23Z is after 2026-02-23T05:33:13Z Mar 20 15:23:23 crc kubenswrapper[4779]: E0320 15:23:23.864953 4779 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:23Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 15:23:23 crc kubenswrapper[4779]: E0320 15:23:23.917468 4779 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 15:23:23 crc kubenswrapper[4779]: I0320 15:23:23.925682 4779 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:23 crc kubenswrapper[4779]: I0320 15:23:23.929611 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:23 crc kubenswrapper[4779]: I0320 15:23:23.929720 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:23 crc kubenswrapper[4779]: I0320 15:23:23.929745 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:23 crc kubenswrapper[4779]: I0320 15:23:23.931444 4779 scope.go:117] "RemoveContainer" containerID="7e7e831cc817d4cb7ba73b640cea5237b74fd22093c9d891ca4df67ccb893ef8" Mar 20 15:23:23 crc kubenswrapper[4779]: E0320 15:23:23.932007 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 15:23:24 crc kubenswrapper[4779]: I0320 15:23:24.519225 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:24 crc kubenswrapper[4779]: E0320 15:23:24.519697 4779 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:24Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 15:23:24 crc kubenswrapper[4779]: I0320 15:23:24.520720 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 20 15:23:24 crc kubenswrapper[4779]: I0320 15:23:24.520789 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:24 crc kubenswrapper[4779]: I0320 15:23:24.520801 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:24 crc kubenswrapper[4779]: I0320 15:23:24.520824 4779 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 15:23:24 crc kubenswrapper[4779]: E0320 15:23:24.523309 4779 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:24Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 15:23:24 crc kubenswrapper[4779]: I0320 15:23:24.755393 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:24Z is after 2026-02-23T05:33:13Z Mar 20 15:23:25 crc kubenswrapper[4779]: I0320 15:23:25.754545 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:25Z is after 2026-02-23T05:33:13Z Mar 20 15:23:26 crc kubenswrapper[4779]: I0320 15:23:26.715660 4779 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 15:23:26 crc kubenswrapper[4779]: E0320 15:23:26.719024 4779 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed 
while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:26Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 15:23:26 crc kubenswrapper[4779]: I0320 15:23:26.757189 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:26Z is after 2026-02-23T05:33:13Z Mar 20 15:23:27 crc kubenswrapper[4779]: I0320 15:23:27.756143 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:27Z is after 2026-02-23T05:33:13Z Mar 20 15:23:28 crc kubenswrapper[4779]: E0320 15:23:28.115938 4779 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:28Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e95f53c544b98 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:03.742237592 +0000 UTC m=+0.704753422,LastTimestamp:2026-03-20 15:23:03.742237592 +0000 UTC 
m=+0.704753422,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:28 crc kubenswrapper[4779]: I0320 15:23:28.151408 4779 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:23:28 crc kubenswrapper[4779]: I0320 15:23:28.151734 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:28 crc kubenswrapper[4779]: I0320 15:23:28.153937 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:28 crc kubenswrapper[4779]: I0320 15:23:28.154026 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:28 crc kubenswrapper[4779]: I0320 15:23:28.154047 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:28 crc kubenswrapper[4779]: I0320 15:23:28.155404 4779 scope.go:117] "RemoveContainer" containerID="7e7e831cc817d4cb7ba73b640cea5237b74fd22093c9d891ca4df67ccb893ef8" Mar 20 15:23:28 crc kubenswrapper[4779]: E0320 15:23:28.155754 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 15:23:28 crc kubenswrapper[4779]: W0320 15:23:28.321949 4779 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T15:23:28Z is after 2026-02-23T05:33:13Z Mar 20 15:23:28 crc kubenswrapper[4779]: E0320 15:23:28.322163 4779 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:28Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 15:23:28 crc kubenswrapper[4779]: I0320 15:23:28.474309 4779 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 15:23:28 crc kubenswrapper[4779]: I0320 15:23:28.474456 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 15:23:28 crc kubenswrapper[4779]: I0320 15:23:28.758395 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:28Z is after 2026-02-23T05:33:13Z Mar 20 15:23:28 crc kubenswrapper[4779]: W0320 15:23:28.922779 4779 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to 
list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:28Z is after 2026-02-23T05:33:13Z Mar 20 15:23:28 crc kubenswrapper[4779]: E0320 15:23:28.922914 4779 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:28Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 15:23:29 crc kubenswrapper[4779]: I0320 15:23:29.755569 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:29Z is after 2026-02-23T05:33:13Z Mar 20 15:23:30 crc kubenswrapper[4779]: I0320 15:23:30.757907 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:30Z is after 2026-02-23T05:33:13Z Mar 20 15:23:31 crc kubenswrapper[4779]: I0320 15:23:31.523493 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:31 crc kubenswrapper[4779]: I0320 15:23:31.525273 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:31 crc kubenswrapper[4779]: I0320 15:23:31.525327 4779 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:31 crc kubenswrapper[4779]: I0320 15:23:31.525388 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:31 crc kubenswrapper[4779]: I0320 15:23:31.525426 4779 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 15:23:31 crc kubenswrapper[4779]: E0320 15:23:31.525770 4779 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:31Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 15:23:31 crc kubenswrapper[4779]: E0320 15:23:31.530684 4779 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:31Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 15:23:31 crc kubenswrapper[4779]: W0320 15:23:31.568251 4779 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:31Z is after 2026-02-23T05:33:13Z Mar 20 15:23:31 crc kubenswrapper[4779]: E0320 15:23:31.568310 4779 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-20T15:23:31Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 15:23:31 crc kubenswrapper[4779]: I0320 15:23:31.754613 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:31Z is after 2026-02-23T05:33:13Z Mar 20 15:23:32 crc kubenswrapper[4779]: I0320 15:23:32.758399 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:32Z is after 2026-02-23T05:33:13Z Mar 20 15:23:32 crc kubenswrapper[4779]: W0320 15:23:32.786170 4779 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:32Z is after 2026-02-23T05:33:13Z Mar 20 15:23:32 crc kubenswrapper[4779]: E0320 15:23:32.786293 4779 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:32Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 15:23:33 crc kubenswrapper[4779]: I0320 15:23:33.757848 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:33Z is after 2026-02-23T05:33:13Z Mar 20 15:23:33 crc kubenswrapper[4779]: E0320 15:23:33.917699 4779 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 15:23:34 crc kubenswrapper[4779]: I0320 15:23:34.755185 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:34Z is after 2026-02-23T05:33:13Z Mar 20 15:23:35 crc kubenswrapper[4779]: I0320 15:23:35.756695 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 15:23:36 crc kubenswrapper[4779]: I0320 15:23:36.095093 4779 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:44420->192.168.126.11:10357: read: connection reset by peer" start-of-body= Mar 20 15:23:36 crc kubenswrapper[4779]: I0320 15:23:36.095203 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:44420->192.168.126.11:10357: read: connection reset by peer" Mar 20 15:23:36 crc 
kubenswrapper[4779]: I0320 15:23:36.095265 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 15:23:36 crc kubenswrapper[4779]: I0320 15:23:36.095414 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:36 crc kubenswrapper[4779]: I0320 15:23:36.096552 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:36 crc kubenswrapper[4779]: I0320 15:23:36.096590 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:36 crc kubenswrapper[4779]: I0320 15:23:36.096600 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:36 crc kubenswrapper[4779]: I0320 15:23:36.097057 4779 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"f8af565427c7f7c9ba5b8239eaa0c17a3ef8dd5ff718c932a40556ab6b7fafbb"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 20 15:23:36 crc kubenswrapper[4779]: I0320 15:23:36.097214 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://f8af565427c7f7c9ba5b8239eaa0c17a3ef8dd5ff718c932a40556ab6b7fafbb" gracePeriod=30 Mar 20 15:23:36 crc kubenswrapper[4779]: I0320 15:23:36.758890 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster 
scope Mar 20 15:23:36 crc kubenswrapper[4779]: I0320 15:23:36.961761 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 15:23:36 crc kubenswrapper[4779]: I0320 15:23:36.962356 4779 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="f8af565427c7f7c9ba5b8239eaa0c17a3ef8dd5ff718c932a40556ab6b7fafbb" exitCode=255 Mar 20 15:23:36 crc kubenswrapper[4779]: I0320 15:23:36.962405 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"f8af565427c7f7c9ba5b8239eaa0c17a3ef8dd5ff718c932a40556ab6b7fafbb"} Mar 20 15:23:36 crc kubenswrapper[4779]: I0320 15:23:36.962433 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6d5bd3d76eb962b45cb136b7d534542ed0119ce0fe39f12d8099a850fefdf012"} Mar 20 15:23:36 crc kubenswrapper[4779]: I0320 15:23:36.962537 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:36 crc kubenswrapper[4779]: I0320 15:23:36.963741 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:36 crc kubenswrapper[4779]: I0320 15:23:36.963801 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:36 crc kubenswrapper[4779]: I0320 15:23:36.963817 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:37 crc kubenswrapper[4779]: I0320 15:23:37.759916 4779 csi_plugin.go:884] Failed to contact API server when waiting for 
CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.124629 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e95f53c544b98 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:03.742237592 +0000 UTC m=+0.704753422,LastTimestamp:2026-03-20 15:23:03.742237592 +0000 UTC m=+0.704753422,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.129059 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e95f540911524 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:03.813330212 +0000 UTC m=+0.775846012,LastTimestamp:2026-03-20 15:23:03.813330212 +0000 UTC m=+0.775846012,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.133281 4779 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e95f54091474a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:03.81334305 +0000 UTC m=+0.775858850,LastTimestamp:2026-03-20 15:23:03.81334305 +0000 UTC m=+0.775858850,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.137377 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e95f5409165e9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:03.813350889 +0000 UTC m=+0.775866689,LastTimestamp:2026-03-20 15:23:03.813350889 +0000 UTC m=+0.775866689,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.143377 4779 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e95f540911524\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{crc.189e95f540911524 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:03.813330212 +0000 UTC m=+0.775846012,LastTimestamp:2026-03-20 15:23:03.908921019 +0000 UTC m=+0.871436819,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.149697 4779 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e95f54091474a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e95f54091474a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:03.81334305 +0000 UTC m=+0.775858850,LastTimestamp:2026-03-20 15:23:03.908945475 +0000 UTC m=+0.871461275,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.155485 4779 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e95f5409165e9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e95f5409165e9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:03.813350889 +0000 UTC m=+0.775866689,LastTimestamp:2026-03-20 15:23:03.908955264 +0000 UTC m=+0.871471064,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.159459 4779 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e95f540911524\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e95f540911524 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:03.813330212 +0000 UTC m=+0.775846012,LastTimestamp:2026-03-20 15:23:03.910278097 +0000 UTC m=+0.872793897,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.165754 4779 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e95f54091474a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e95f54091474a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:03.81334305 +0000 UTC m=+0.775858850,LastTimestamp:2026-03-20 15:23:03.91029302 +0000 UTC m=+0.872808820,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.170054 4779 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e95f5409165e9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e95f5409165e9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:03.813350889 +0000 UTC m=+0.775866689,LastTimestamp:2026-03-20 15:23:03.910300742 +0000 UTC m=+0.872816542,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.174100 4779 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e95f540911524\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e95f540911524 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:03.813330212 +0000 UTC 
m=+0.775846012,LastTimestamp:2026-03-20 15:23:03.915605237 +0000 UTC m=+0.878121037,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.178034 4779 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e95f54091474a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e95f54091474a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:03.81334305 +0000 UTC m=+0.775858850,LastTimestamp:2026-03-20 15:23:03.915627122 +0000 UTC m=+0.878142922,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.183639 4779 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e95f5409165e9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e95f5409165e9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:03.813350889 +0000 UTC m=+0.775866689,LastTimestamp:2026-03-20 15:23:03.915638705 +0000 UTC m=+0.878154505,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.189783 4779 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e95f540911524\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e95f540911524 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:03.813330212 +0000 UTC m=+0.775846012,LastTimestamp:2026-03-20 15:23:03.915790862 +0000 UTC m=+0.878306682,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.196515 4779 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e95f54091474a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e95f54091474a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:03.81334305 +0000 UTC m=+0.775858850,LastTimestamp:2026-03-20 15:23:03.915806336 +0000 UTC m=+0.878322146,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.204088 4779 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e95f5409165e9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e95f5409165e9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:03.813350889 +0000 UTC m=+0.775866689,LastTimestamp:2026-03-20 15:23:03.915841484 +0000 UTC m=+0.878357294,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.211984 4779 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e95f540911524\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e95f540911524 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:03.813330212 +0000 UTC m=+0.775846012,LastTimestamp:2026-03-20 15:23:03.915892787 +0000 UTC m=+0.878408587,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.217379 4779 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e95f54091474a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" 
in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e95f54091474a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:03.81334305 +0000 UTC m=+0.775858850,LastTimestamp:2026-03-20 15:23:03.915908111 +0000 UTC m=+0.878423911,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.223891 4779 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e95f5409165e9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e95f5409165e9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:03.813350889 +0000 UTC m=+0.775866689,LastTimestamp:2026-03-20 15:23:03.915917163 +0000 UTC m=+0.878432963,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.228990 4779 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e95f540911524\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e95f540911524 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:03.813330212 +0000 UTC m=+0.775846012,LastTimestamp:2026-03-20 15:23:03.916720229 +0000 UTC m=+0.879236029,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.234421 4779 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e95f54091474a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e95f54091474a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:03.81334305 +0000 UTC m=+0.775858850,LastTimestamp:2026-03-20 15:23:03.916736363 +0000 UTC m=+0.879252163,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.239245 4779 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e95f5409165e9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e95f5409165e9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc 
status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:03.813350889 +0000 UTC m=+0.775866689,LastTimestamp:2026-03-20 15:23:03.916744885 +0000 UTC m=+0.879260685,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.244228 4779 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e95f540911524\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e95f540911524 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:03.813330212 +0000 UTC m=+0.775846012,LastTimestamp:2026-03-20 15:23:03.916855362 +0000 UTC m=+0.879371172,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.248477 4779 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e95f54091474a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e95f54091474a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:03.81334305 +0000 UTC 
m=+0.775858850,LastTimestamp:2026-03-20 15:23:03.916875997 +0000 UTC m=+0.879391807,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.255160 4779 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e95f5409165e9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e95f5409165e9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:03.813350889 +0000 UTC m=+0.775866689,LastTimestamp:2026-03-20 15:23:03.916908425 +0000 UTC m=+0.879424235,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.261121 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e95f55ff9baae openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:04.34028203 +0000 UTC m=+1.302797850,LastTimestamp:2026-03-20 15:23:04.34028203 +0000 UTC m=+1.302797850,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.264856 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e95f55ffa0e37 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:04.340303415 +0000 UTC m=+1.302819215,LastTimestamp:2026-03-20 15:23:04.340303415 +0000 UTC m=+1.302819215,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.268316 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e95f5600585d1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:04.341054929 +0000 UTC m=+1.303570729,LastTimestamp:2026-03-20 15:23:04.341054929 +0000 UTC m=+1.303570729,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.271808 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e95f5602a0941 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:04.343447873 +0000 UTC m=+1.305963673,LastTimestamp:2026-03-20 15:23:04.343447873 +0000 UTC m=+1.305963673,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.275881 4779 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e95f56064a7a6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:04.34728951 +0000 UTC m=+1.309805310,LastTimestamp:2026-03-20 15:23:04.34728951 +0000 UTC m=+1.309805310,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.279615 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e95f57fdfbb6f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:04.875449199 +0000 UTC m=+1.837965009,LastTimestamp:2026-03-20 15:23:04.875449199 +0000 UTC m=+1.837965009,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.283763 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e95f57ffc3a0c openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:04.87731662 +0000 UTC m=+1.839832420,LastTimestamp:2026-03-20 15:23:04.87731662 +0000 UTC m=+1.839832420,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.287283 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e95f58009038d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:04.878154637 +0000 UTC 
m=+1.840670437,LastTimestamp:2026-03-20 15:23:04.878154637 +0000 UTC m=+1.840670437,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.291053 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e95f5801d9a2c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:04.879503916 +0000 UTC m=+1.842019716,LastTimestamp:2026-03-20 15:23:04.879503916 +0000 UTC m=+1.842019716,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.294813 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e95f5806550bc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:04.884203708 +0000 UTC m=+1.846719498,LastTimestamp:2026-03-20 
15:23:04.884203708 +0000 UTC m=+1.846719498,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.299316 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e95f58080b428 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:04.885998632 +0000 UTC m=+1.848514432,LastTimestamp:2026-03-20 15:23:04.885998632 +0000 UTC m=+1.848514432,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.303042 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e95f58097714c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:04.887488844 +0000 UTC m=+1.850004644,LastTimestamp:2026-03-20 15:23:04.887488844 +0000 UTC m=+1.850004644,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.306863 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e95f580ca8476 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:04.890836086 +0000 UTC m=+1.853351886,LastTimestamp:2026-03-20 15:23:04.890836086 +0000 UTC m=+1.853351886,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.311437 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e95f580e46905 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:04.892532997 +0000 UTC m=+1.855048797,LastTimestamp:2026-03-20 15:23:04.892532997 +0000 UTC m=+1.855048797,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.314854 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e95f580e82600 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:04.892777984 +0000 UTC m=+1.855293794,LastTimestamp:2026-03-20 15:23:04.892777984 +0000 UTC m=+1.855293794,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.316266 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e95f5816acc4e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:04.901340238 +0000 UTC m=+1.863856038,LastTimestamp:2026-03-20 15:23:04.901340238 +0000 UTC m=+1.863856038,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.318312 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e95f58fd68aed openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:05.143282413 +0000 UTC m=+2.105798213,LastTimestamp:2026-03-20 15:23:05.143282413 +0000 UTC m=+2.105798213,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.319527 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189e95f59047dd7e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:05.150709118 +0000 UTC m=+2.113224918,LastTimestamp:2026-03-20 15:23:05.150709118 +0000 UTC m=+2.113224918,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.322801 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e95f590585808 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:05.151789064 +0000 UTC m=+2.114304864,LastTimestamp:2026-03-20 15:23:05.151789064 +0000 UTC m=+2.114304864,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.326082 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e95f598acd879 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:05.291544697 +0000 UTC m=+2.254060497,LastTimestamp:2026-03-20 15:23:05.291544697 +0000 UTC m=+2.254060497,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.329630 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e95f59935d71a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container 
kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:05.300522778 +0000 UTC m=+2.263038578,LastTimestamp:2026-03-20 15:23:05.300522778 +0000 UTC m=+2.263038578,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.333876 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e95f5994bf3a9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:05.301971881 +0000 UTC m=+2.264487681,LastTimestamp:2026-03-20 15:23:05.301971881 +0000 UTC m=+2.264487681,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.337642 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e95f5a223f5f2 openshift-kube-controller-manager 
0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:05.45034597 +0000 UTC m=+2.412861780,LastTimestamp:2026-03-20 15:23:05.45034597 +0000 UTC m=+2.412861780,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.341363 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e95f5a2f21800 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:05.463855104 +0000 UTC m=+2.426370904,LastTimestamp:2026-03-20 15:23:05.463855104 +0000 UTC m=+2.426370904,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.345348 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e95f5b8a948a0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:05.828182176 +0000 UTC m=+2.790697976,LastTimestamp:2026-03-20 15:23:05.828182176 +0000 UTC m=+2.790697976,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.349295 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e95f5b8bed348 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:05.829593928 +0000 UTC m=+2.792109738,LastTimestamp:2026-03-20 15:23:05.829593928 +0000 UTC m=+2.792109738,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.353716 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e95f5b8d86a2b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:05.831270955 +0000 UTC m=+2.793786755,LastTimestamp:2026-03-20 15:23:05.831270955 +0000 UTC m=+2.793786755,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.356983 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e95f5b974b395 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:05.841513365 +0000 UTC m=+2.804029175,LastTimestamp:2026-03-20 15:23:05.841513365 +0000 UTC m=+2.804029175,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.360487 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e95f5c5298ec1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:06.037915329 +0000 UTC m=+3.000431129,LastTimestamp:2026-03-20 15:23:06.037915329 +0000 UTC m=+3.000431129,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.363772 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e95f5c54fc85c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:06.040420444 +0000 UTC m=+3.002936244,LastTimestamp:2026-03-20 15:23:06.040420444 +0000 UTC m=+3.002936244,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.367228 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e95f5c5535874 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:06.04065394 +0000 UTC m=+3.003169740,LastTimestamp:2026-03-20 15:23:06.04065394 +0000 UTC m=+3.003169740,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.370581 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e95f5c5543c46 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:06.040712262 +0000 UTC m=+3.003228062,LastTimestamp:2026-03-20 15:23:06.040712262 +0000 UTC m=+3.003228062,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.373695 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e95f5c5bc3fa1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:06.047528865 +0000 UTC m=+3.010044665,LastTimestamp:2026-03-20 15:23:06.047528865 +0000 UTC m=+3.010044665,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.377092 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group 
\"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e95f5c5cc421e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:06.048578078 +0000 UTC m=+3.011093878,LastTimestamp:2026-03-20 15:23:06.048578078 +0000 UTC m=+3.011093878,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.380137 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e95f5c6942ff9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:06.061680633 +0000 UTC m=+3.024196433,LastTimestamp:2026-03-20 15:23:06.061680633 +0000 UTC m=+3.024196433,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.383375 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e95f5c6c15b0b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:06.064640779 +0000 UTC m=+3.027156579,LastTimestamp:2026-03-20 15:23:06.064640779 +0000 UTC m=+3.027156579,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.386429 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e95f5c6c1998c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:06.06465678 +0000 UTC m=+3.027172580,LastTimestamp:2026-03-20 15:23:06.06465678 +0000 UTC m=+3.027172580,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.389540 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e95f5c6cfad60 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:06.06557936 +0000 UTC m=+3.028095160,LastTimestamp:2026-03-20 15:23:06.06557936 +0000 UTC m=+3.028095160,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.392705 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e95f5cf32e4ba openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container 
kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:06.206299322 +0000 UTC m=+3.168815122,LastTimestamp:2026-03-20 15:23:06.206299322 +0000 UTC m=+3.168815122,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.395923 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e95f5cf50da8c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:06.208262796 +0000 UTC m=+3.170778596,LastTimestamp:2026-03-20 15:23:06.208262796 +0000 UTC m=+3.170778596,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.399636 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e95f5cfe64644 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:06.218055236 +0000 UTC m=+3.180571036,LastTimestamp:2026-03-20 15:23:06.218055236 +0000 UTC m=+3.180571036,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.402888 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e95f5cff3c814 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:06.218940436 +0000 UTC m=+3.181456236,LastTimestamp:2026-03-20 15:23:06.218940436 +0000 UTC m=+3.181456236,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.406602 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e95f5d037314d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:06.223358285 +0000 UTC m=+3.185874085,LastTimestamp:2026-03-20 15:23:06.223358285 +0000 UTC m=+3.185874085,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.409765 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e95f5d06b5236 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:06.226774582 +0000 UTC m=+3.189290382,LastTimestamp:2026-03-20 15:23:06.226774582 +0000 UTC m=+3.189290382,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.413919 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e95f5d9361210 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:06.374279696 +0000 UTC m=+3.336795496,LastTimestamp:2026-03-20 15:23:06.374279696 +0000 UTC m=+3.336795496,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.417411 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e95f5d96b3916 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:06.377763094 
+0000 UTC m=+3.340278894,LastTimestamp:2026-03-20 15:23:06.377763094 +0000 UTC m=+3.340278894,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.420794 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e95f5d9f7c293 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:06.386973331 +0000 UTC m=+3.349489131,LastTimestamp:2026-03-20 15:23:06.386973331 +0000 UTC m=+3.349489131,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.424013 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e95f5da2ab900 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started 
container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:06.390313216 +0000 UTC m=+3.352829016,LastTimestamp:2026-03-20 15:23:06.390313216 +0000 UTC m=+3.352829016,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.427040 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e95f5da3974a6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:06.391278758 +0000 UTC m=+3.353794548,LastTimestamp:2026-03-20 15:23:06.391278758 +0000 UTC m=+3.353794548,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.430817 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e95f5e50f6f26 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:06.573074214 +0000 UTC m=+3.535590014,LastTimestamp:2026-03-20 15:23:06.573074214 +0000 UTC m=+3.535590014,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.434262 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e95f5e5b734e6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:06.58406935 +0000 UTC m=+3.546585150,LastTimestamp:2026-03-20 15:23:06.58406935 +0000 UTC m=+3.546585150,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.438973 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189e95f5e5c6ec42 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:06.58509933 +0000 UTC m=+3.547615130,LastTimestamp:2026-03-20 15:23:06.58509933 +0000 UTC m=+3.547615130,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.442309 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e95f5f1b27a58 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:06.78508604 +0000 UTC m=+3.747601840,LastTimestamp:2026-03-20 15:23:06.78508604 +0000 UTC m=+3.747601840,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 
15:23:38.446213 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e95f5f2311ecc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:06.793385676 +0000 UTC m=+3.755901466,LastTimestamp:2026-03-20 15:23:06.793385676 +0000 UTC m=+3.755901466,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.451185 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e95f5f528615e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:06.843144542 +0000 UTC m=+3.805660342,LastTimestamp:2026-03-20 15:23:06.843144542 +0000 UTC 
m=+3.805660342,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.456398 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e95f600aa3dbc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:07.036204476 +0000 UTC m=+3.998720276,LastTimestamp:2026-03-20 15:23:07.036204476 +0000 UTC m=+3.998720276,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.460880 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e95f60147435f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:07.046495071 +0000 UTC m=+4.009010871,LastTimestamp:2026-03-20 15:23:07.046495071 +0000 UTC 
m=+4.009010871,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.466086 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e95f6323edccd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:07.868028109 +0000 UTC m=+4.830543909,LastTimestamp:2026-03-20 15:23:07.868028109 +0000 UTC m=+4.830543909,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.469577 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e95f6406b71e0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:08.10583088 +0000 UTC 
m=+5.068346720,LastTimestamp:2026-03-20 15:23:08.10583088 +0000 UTC m=+5.068346720,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.473224 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e95f6411576eb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:08.116973291 +0000 UTC m=+5.079489091,LastTimestamp:2026-03-20 15:23:08.116973291 +0000 UTC m=+5.079489091,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.477895 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e95f641276f35 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:08.118150965 +0000 UTC m=+5.080666765,LastTimestamp:2026-03-20 15:23:08.118150965 +0000 UTC m=+5.080666765,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.482285 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e95f64f3d5442 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:08.354466882 +0000 UTC m=+5.316982682,LastTimestamp:2026-03-20 15:23:08.354466882 +0000 UTC m=+5.316982682,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.486386 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e95f65006ebb0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:08.367678384 +0000 UTC 
m=+5.330194184,LastTimestamp:2026-03-20 15:23:08.367678384 +0000 UTC m=+5.330194184,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.489821 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e95f65020018f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:08.369322383 +0000 UTC m=+5.331838193,LastTimestamp:2026-03-20 15:23:08.369322383 +0000 UTC m=+5.331838193,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.494058 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e95f65cdf10ce openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container 
etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:08.583170254 +0000 UTC m=+5.545686054,LastTimestamp:2026-03-20 15:23:08.583170254 +0000 UTC m=+5.545686054,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.497376 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e95f65dc91cc1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:08.598508737 +0000 UTC m=+5.561024557,LastTimestamp:2026-03-20 15:23:08.598508737 +0000 UTC m=+5.561024557,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.500799 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e95f65ddb3e6b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:08.599697003 +0000 UTC m=+5.562212813,LastTimestamp:2026-03-20 15:23:08.599697003 +0000 UTC m=+5.562212813,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.504659 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e95f66a9cb576 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:08.813702518 +0000 UTC m=+5.776218358,LastTimestamp:2026-03-20 15:23:08.813702518 +0000 UTC m=+5.776218358,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.508414 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e95f66b7de2ec openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:08.828459756 +0000 UTC m=+5.790975596,LastTimestamp:2026-03-20 15:23:08.828459756 +0000 UTC m=+5.790975596,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.511921 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e95f66b9441a8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:08.8299258 +0000 UTC m=+5.792441610,LastTimestamp:2026-03-20 15:23:08.8299258 +0000 UTC m=+5.792441610,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.515448 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e95f677ecef5f openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:09.037064031 +0000 UTC m=+5.999579861,LastTimestamp:2026-03-20 15:23:09.037064031 +0000 UTC m=+5.999579861,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.519726 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e95f67878a82a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:09.046220842 +0000 UTC m=+6.008736652,LastTimestamp:2026-03-20 15:23:09.046220842 +0000 UTC m=+6.008736652,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.522404 4779 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e95f5e5c6ec42\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e95f5e5c6ec42 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 
UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:06.58509933 +0000 UTC m=+3.547615130,LastTimestamp:2026-03-20 15:23:17.903603006 +0000 UTC m=+14.866118816,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.526778 4779 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.526822 4779 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e95f5f1b27a58\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e95f5f1b27a58 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:06.78508604 +0000 UTC 
m=+3.747601840,LastTimestamp:2026-03-20 15:23:18.074697254 +0000 UTC m=+15.037213074,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.530273 4779 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e95f5f2311ecc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e95f5f2311ecc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:06.793385676 +0000 UTC m=+3.755901466,LastTimestamp:2026-03-20 15:23:18.081858716 +0000 UTC m=+15.044374536,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: I0320 15:23:38.531673 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:38 crc kubenswrapper[4779]: I0320 15:23:38.532946 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:38 crc kubenswrapper[4779]: I0320 15:23:38.532972 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:38 crc kubenswrapper[4779]: I0320 15:23:38.532981 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 
20 15:23:38 crc kubenswrapper[4779]: I0320 15:23:38.533004 4779 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.534951 4779 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.535172 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 15:23:38 crc kubenswrapper[4779]: &Event{ObjectMeta:{kube-apiserver-crc.189e95f8941b5c1c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 15:23:38 crc kubenswrapper[4779]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 15:23:38 crc kubenswrapper[4779]: Mar 20 15:23:38 crc kubenswrapper[4779]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:18.099803164 +0000 UTC m=+15.062318974,LastTimestamp:2026-03-20 15:23:18.099803164 +0000 UTC m=+15.062318974,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 15:23:38 crc kubenswrapper[4779]: > Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.537166 4779 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e95f8941c0df7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:18.099848695 +0000 UTC m=+15.062364505,LastTimestamp:2026-03-20 15:23:18.099848695 +0000 UTC m=+15.062364505,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.538595 4779 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e95f8941b5c1c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 15:23:38 crc kubenswrapper[4779]: &Event{ObjectMeta:{kube-apiserver-crc.189e95f8941b5c1c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 15:23:38 crc kubenswrapper[4779]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 15:23:38 crc kubenswrapper[4779]: Mar 20 15:23:38 crc 
kubenswrapper[4779]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:18.099803164 +0000 UTC m=+15.062318974,LastTimestamp:2026-03-20 15:23:18.112302582 +0000 UTC m=+15.074818392,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 15:23:38 crc kubenswrapper[4779]: > Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.541915 4779 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e95f8941c0df7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e95f8941c0df7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:18.099848695 +0000 UTC m=+15.062364505,LastTimestamp:2026-03-20 15:23:18.112340903 +0000 UTC m=+15.074856713,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.546909 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 15:23:38 crc kubenswrapper[4779]: &Event{ObjectMeta:{kube-controller-manager-crc.189e95f8aa65a672 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 15:23:38 crc kubenswrapper[4779]: body: Mar 20 15:23:38 crc kubenswrapper[4779]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:18.47377061 +0000 UTC m=+15.436286410,LastTimestamp:2026-03-20 15:23:18.47377061 +0000 UTC m=+15.436286410,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 15:23:38 crc kubenswrapper[4779]: > Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.551154 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e95f8aa68180c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:18.473930764 +0000 UTC m=+15.436446564,LastTimestamp:2026-03-20 15:23:18.473930764 +0000 UTC 
m=+15.436446564,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.557872 4779 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e95f8aa65a672\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 15:23:38 crc kubenswrapper[4779]: &Event{ObjectMeta:{kube-controller-manager-crc.189e95f8aa65a672 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 15:23:38 crc kubenswrapper[4779]: body: Mar 20 15:23:38 crc kubenswrapper[4779]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:18.47377061 +0000 UTC m=+15.436286410,LastTimestamp:2026-03-20 15:23:28.474425483 +0000 UTC m=+25.436941303,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 15:23:38 crc kubenswrapper[4779]: > Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.561293 4779 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e95f8aa68180c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e95f8aa68180c 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:18.473930764 +0000 UTC m=+15.436446564,LastTimestamp:2026-03-20 15:23:28.474499365 +0000 UTC m=+25.437015165,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.565208 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 15:23:38 crc kubenswrapper[4779]: &Event{ObjectMeta:{kube-controller-manager-crc.189e95fcc4b70d01 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": read tcp 192.168.126.11:44420->192.168.126.11:10357: read: connection reset by peer Mar 20 15:23:38 crc kubenswrapper[4779]: body: Mar 20 15:23:38 crc kubenswrapper[4779]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:36.095182081 +0000 UTC m=+33.057697901,LastTimestamp:2026-03-20 
15:23:36.095182081 +0000 UTC m=+33.057697901,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 15:23:38 crc kubenswrapper[4779]: > Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.568458 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e95fcc4b7dd18 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:44420->192.168.126.11:10357: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:36.095235352 +0000 UTC m=+33.057751162,LastTimestamp:2026-03-20 15:23:36.095235352 +0000 UTC m=+33.057751162,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.576059 4779 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e95fcc4d5e2fb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:36.097202939 +0000 UTC m=+33.059718739,LastTimestamp:2026-03-20 15:23:36.097202939 +0000 UTC m=+33.059718739,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.580025 4779 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e95f58097714c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e95f58097714c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:04.887488844 +0000 UTC m=+1.850004644,LastTimestamp:2026-03-20 15:23:36.115094227 +0000 UTC m=+33.077610027,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.583761 4779 
event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e95f58fd68aed\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e95f58fd68aed openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:05.143282413 +0000 UTC m=+2.105798213,LastTimestamp:2026-03-20 15:23:36.290997 +0000 UTC m=+33.253512800,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: E0320 15:23:38.587660 4779 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e95f59047dd7e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e95f59047dd7e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:23:05.150709118 +0000 UTC m=+2.113224918,LastTimestamp:2026-03-20 
15:23:36.303695793 +0000 UTC m=+33.266211623,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:23:38 crc kubenswrapper[4779]: I0320 15:23:38.755881 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 15:23:39 crc kubenswrapper[4779]: I0320 15:23:39.759147 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 15:23:40 crc kubenswrapper[4779]: I0320 15:23:40.758883 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 15:23:40 crc kubenswrapper[4779]: I0320 15:23:40.909076 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 15:23:40 crc kubenswrapper[4779]: I0320 15:23:40.909263 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:40 crc kubenswrapper[4779]: I0320 15:23:40.910334 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:40 crc kubenswrapper[4779]: I0320 15:23:40.910396 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:40 crc kubenswrapper[4779]: I0320 15:23:40.910420 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:41 crc 
kubenswrapper[4779]: I0320 15:23:41.755172 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 15:23:41 crc kubenswrapper[4779]: I0320 15:23:41.808718 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:41 crc kubenswrapper[4779]: I0320 15:23:41.809930 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:41 crc kubenswrapper[4779]: I0320 15:23:41.809971 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:41 crc kubenswrapper[4779]: I0320 15:23:41.809981 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:41 crc kubenswrapper[4779]: I0320 15:23:41.810468 4779 scope.go:117] "RemoveContainer" containerID="7e7e831cc817d4cb7ba73b640cea5237b74fd22093c9d891ca4df67ccb893ef8" Mar 20 15:23:42 crc kubenswrapper[4779]: I0320 15:23:42.757491 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 15:23:42 crc kubenswrapper[4779]: I0320 15:23:42.976852 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 15:23:42 crc kubenswrapper[4779]: I0320 15:23:42.978648 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"04e6d6fdb9955a260e69e338e4a4bb9be59f7c958fad14af868848340ea1b039"} Mar 20 15:23:42 crc kubenswrapper[4779]: I0320 15:23:42.978802 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:42 crc kubenswrapper[4779]: I0320 15:23:42.979550 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:42 crc kubenswrapper[4779]: I0320 15:23:42.979583 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:42 crc kubenswrapper[4779]: I0320 15:23:42.979594 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:43 crc kubenswrapper[4779]: I0320 15:23:43.512377 4779 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 15:23:43 crc kubenswrapper[4779]: I0320 15:23:43.524977 4779 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 15:23:43 crc kubenswrapper[4779]: I0320 15:23:43.758654 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 15:23:43 crc kubenswrapper[4779]: E0320 15:23:43.918817 4779 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 15:23:43 crc kubenswrapper[4779]: I0320 15:23:43.985393 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 15:23:43 crc kubenswrapper[4779]: I0320 15:23:43.986353 4779 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 15:23:43 crc kubenswrapper[4779]: I0320 15:23:43.989471 4779 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="04e6d6fdb9955a260e69e338e4a4bb9be59f7c958fad14af868848340ea1b039" exitCode=255 Mar 20 15:23:43 crc kubenswrapper[4779]: I0320 15:23:43.989545 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"04e6d6fdb9955a260e69e338e4a4bb9be59f7c958fad14af868848340ea1b039"} Mar 20 15:23:43 crc kubenswrapper[4779]: I0320 15:23:43.989621 4779 scope.go:117] "RemoveContainer" containerID="7e7e831cc817d4cb7ba73b640cea5237b74fd22093c9d891ca4df67ccb893ef8" Mar 20 15:23:43 crc kubenswrapper[4779]: I0320 15:23:43.989816 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:43 crc kubenswrapper[4779]: I0320 15:23:43.991277 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:43 crc kubenswrapper[4779]: I0320 15:23:43.991331 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:43 crc kubenswrapper[4779]: I0320 15:23:43.991349 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:43 crc kubenswrapper[4779]: I0320 15:23:43.992203 4779 scope.go:117] "RemoveContainer" containerID="04e6d6fdb9955a260e69e338e4a4bb9be59f7c958fad14af868848340ea1b039" Mar 20 15:23:43 crc kubenswrapper[4779]: E0320 15:23:43.992591 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: 
\"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 15:23:44 crc kubenswrapper[4779]: I0320 15:23:44.757408 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 15:23:44 crc kubenswrapper[4779]: I0320 15:23:44.993254 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 15:23:45 crc kubenswrapper[4779]: W0320 15:23:45.260433 4779 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 20 15:23:45 crc kubenswrapper[4779]: E0320 15:23:45.260534 4779 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 15:23:45 crc kubenswrapper[4779]: I0320 15:23:45.474318 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 15:23:45 crc kubenswrapper[4779]: I0320 15:23:45.474679 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:45 crc kubenswrapper[4779]: I0320 15:23:45.476235 4779 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:45 crc kubenswrapper[4779]: I0320 15:23:45.476343 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:45 crc kubenswrapper[4779]: I0320 15:23:45.476424 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:45 crc kubenswrapper[4779]: E0320 15:23:45.531802 4779 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 15:23:45 crc kubenswrapper[4779]: I0320 15:23:45.535271 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:45 crc kubenswrapper[4779]: I0320 15:23:45.536347 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:45 crc kubenswrapper[4779]: I0320 15:23:45.536428 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:45 crc kubenswrapper[4779]: I0320 15:23:45.536456 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:45 crc kubenswrapper[4779]: I0320 15:23:45.536503 4779 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 15:23:45 crc kubenswrapper[4779]: E0320 15:23:45.540562 4779 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 15:23:45 crc kubenswrapper[4779]: I0320 15:23:45.757778 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is 
forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 15:23:45 crc kubenswrapper[4779]: I0320 15:23:45.806985 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 15:23:45 crc kubenswrapper[4779]: I0320 15:23:45.998083 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:45 crc kubenswrapper[4779]: I0320 15:23:45.998963 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:45 crc kubenswrapper[4779]: I0320 15:23:45.999035 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:45 crc kubenswrapper[4779]: I0320 15:23:45.999054 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:46 crc kubenswrapper[4779]: W0320 15:23:46.718150 4779 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 20 15:23:46 crc kubenswrapper[4779]: E0320 15:23:46.718213 4779 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 15:23:46 crc kubenswrapper[4779]: I0320 15:23:46.756809 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" 
at the cluster scope Mar 20 15:23:47 crc kubenswrapper[4779]: W0320 15:23:47.052194 4779 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 20 15:23:47 crc kubenswrapper[4779]: E0320 15:23:47.052247 4779 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 15:23:47 crc kubenswrapper[4779]: I0320 15:23:47.759799 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 15:23:48 crc kubenswrapper[4779]: I0320 15:23:48.151095 4779 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:23:48 crc kubenswrapper[4779]: I0320 15:23:48.151886 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:48 crc kubenswrapper[4779]: I0320 15:23:48.153331 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:48 crc kubenswrapper[4779]: I0320 15:23:48.153438 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:48 crc kubenswrapper[4779]: I0320 15:23:48.153462 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:48 crc kubenswrapper[4779]: I0320 15:23:48.154854 4779 scope.go:117] "RemoveContainer" containerID="04e6d6fdb9955a260e69e338e4a4bb9be59f7c958fad14af868848340ea1b039" Mar 
20 15:23:48 crc kubenswrapper[4779]: E0320 15:23:48.155287 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 15:23:48 crc kubenswrapper[4779]: I0320 15:23:48.755711 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 15:23:49 crc kubenswrapper[4779]: W0320 15:23:49.428731 4779 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 20 15:23:49 crc kubenswrapper[4779]: E0320 15:23:49.429371 4779 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 15:23:49 crc kubenswrapper[4779]: I0320 15:23:49.756421 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 15:23:49 crc kubenswrapper[4779]: I0320 15:23:49.847048 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:23:49 crc kubenswrapper[4779]: I0320 15:23:49.847273 4779 kubelet_node_status.go:401] "Setting node annotation 
to enable volume controller attach/detach" Mar 20 15:23:49 crc kubenswrapper[4779]: I0320 15:23:49.848529 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:49 crc kubenswrapper[4779]: I0320 15:23:49.848555 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:49 crc kubenswrapper[4779]: I0320 15:23:49.848563 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:49 crc kubenswrapper[4779]: I0320 15:23:49.849153 4779 scope.go:117] "RemoveContainer" containerID="04e6d6fdb9955a260e69e338e4a4bb9be59f7c958fad14af868848340ea1b039" Mar 20 15:23:49 crc kubenswrapper[4779]: E0320 15:23:49.849340 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 15:23:50 crc kubenswrapper[4779]: I0320 15:23:50.755086 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 15:23:50 crc kubenswrapper[4779]: I0320 15:23:50.911991 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 15:23:50 crc kubenswrapper[4779]: I0320 15:23:50.912136 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:50 crc kubenswrapper[4779]: I0320 15:23:50.912997 4779 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:50 crc kubenswrapper[4779]: I0320 15:23:50.913030 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:50 crc kubenswrapper[4779]: I0320 15:23:50.913039 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:51 crc kubenswrapper[4779]: I0320 15:23:51.755275 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 15:23:52 crc kubenswrapper[4779]: E0320 15:23:52.535475 4779 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 15:23:52 crc kubenswrapper[4779]: I0320 15:23:52.541565 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:52 crc kubenswrapper[4779]: I0320 15:23:52.542754 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:52 crc kubenswrapper[4779]: I0320 15:23:52.542794 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:52 crc kubenswrapper[4779]: I0320 15:23:52.542806 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:52 crc kubenswrapper[4779]: I0320 15:23:52.542832 4779 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 15:23:52 crc kubenswrapper[4779]: E0320 15:23:52.546554 4779 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is 
forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 15:23:52 crc kubenswrapper[4779]: I0320 15:23:52.755390 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 15:23:53 crc kubenswrapper[4779]: I0320 15:23:53.002798 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 15:23:53 crc kubenswrapper[4779]: I0320 15:23:53.002930 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:53 crc kubenswrapper[4779]: I0320 15:23:53.003835 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:23:53 crc kubenswrapper[4779]: I0320 15:23:53.003860 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:53 crc kubenswrapper[4779]: I0320 15:23:53.003868 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:53 crc kubenswrapper[4779]: I0320 15:23:53.755287 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 15:23:53 crc kubenswrapper[4779]: E0320 15:23:53.918904 4779 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 15:23:54 crc kubenswrapper[4779]: I0320 15:23:54.756506 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: 
User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 15:23:55 crc kubenswrapper[4779]: I0320 15:23:55.756092 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 15:23:56 crc kubenswrapper[4779]: I0320 15:23:56.758797 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 15:23:57 crc kubenswrapper[4779]: I0320 15:23:57.755142 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 15:23:58 crc kubenswrapper[4779]: I0320 15:23:58.755802 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 15:23:59 crc kubenswrapper[4779]: E0320 15:23:59.540060 4779 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 15:23:59 crc kubenswrapper[4779]: I0320 15:23:59.547438 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:23:59 crc kubenswrapper[4779]: I0320 15:23:59.548459 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 
15:23:59 crc kubenswrapper[4779]: I0320 15:23:59.548492 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:23:59 crc kubenswrapper[4779]: I0320 15:23:59.548500 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:23:59 crc kubenswrapper[4779]: I0320 15:23:59.548520 4779 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 15:23:59 crc kubenswrapper[4779]: E0320 15:23:59.554001 4779 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 15:23:59 crc kubenswrapper[4779]: I0320 15:23:59.756557 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 15:24:00 crc kubenswrapper[4779]: I0320 15:24:00.759185 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 15:24:01 crc kubenswrapper[4779]: I0320 15:24:01.756291 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 15:24:02 crc kubenswrapper[4779]: I0320 15:24:02.758126 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 15:24:03 crc kubenswrapper[4779]: 
I0320 15:24:03.756758 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 15:24:03 crc kubenswrapper[4779]: I0320 15:24:03.807835 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:24:03 crc kubenswrapper[4779]: I0320 15:24:03.809758 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:03 crc kubenswrapper[4779]: I0320 15:24:03.809815 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:03 crc kubenswrapper[4779]: I0320 15:24:03.809831 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:03 crc kubenswrapper[4779]: I0320 15:24:03.810957 4779 scope.go:117] "RemoveContainer" containerID="04e6d6fdb9955a260e69e338e4a4bb9be59f7c958fad14af868848340ea1b039" Mar 20 15:24:03 crc kubenswrapper[4779]: E0320 15:24:03.919636 4779 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 15:24:04 crc kubenswrapper[4779]: I0320 15:24:04.039972 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 15:24:04 crc kubenswrapper[4779]: I0320 15:24:04.041344 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9"} Mar 20 15:24:04 crc kubenswrapper[4779]: I0320 15:24:04.041457 4779 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Mar 20 15:24:04 crc kubenswrapper[4779]: I0320 15:24:04.042156 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:04 crc kubenswrapper[4779]: I0320 15:24:04.042183 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:04 crc kubenswrapper[4779]: I0320 15:24:04.042193 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:04 crc kubenswrapper[4779]: I0320 15:24:04.756832 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 15:24:05 crc kubenswrapper[4779]: I0320 15:24:05.758208 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 15:24:06 crc kubenswrapper[4779]: I0320 15:24:06.046529 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 15:24:06 crc kubenswrapper[4779]: I0320 15:24:06.046896 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 15:24:06 crc kubenswrapper[4779]: I0320 15:24:06.048587 4779 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9" exitCode=255 Mar 20 15:24:06 crc kubenswrapper[4779]: I0320 15:24:06.048619 4779 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9"} Mar 20 15:24:06 crc kubenswrapper[4779]: I0320 15:24:06.048651 4779 scope.go:117] "RemoveContainer" containerID="04e6d6fdb9955a260e69e338e4a4bb9be59f7c958fad14af868848340ea1b039" Mar 20 15:24:06 crc kubenswrapper[4779]: I0320 15:24:06.048782 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:24:06 crc kubenswrapper[4779]: I0320 15:24:06.049608 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:06 crc kubenswrapper[4779]: I0320 15:24:06.049633 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:06 crc kubenswrapper[4779]: I0320 15:24:06.049642 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:06 crc kubenswrapper[4779]: I0320 15:24:06.050039 4779 scope.go:117] "RemoveContainer" containerID="8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9" Mar 20 15:24:06 crc kubenswrapper[4779]: E0320 15:24:06.050190 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 15:24:06 crc kubenswrapper[4779]: E0320 15:24:06.544167 4779 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group 
\"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 15:24:06 crc kubenswrapper[4779]: I0320 15:24:06.554489 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:24:06 crc kubenswrapper[4779]: I0320 15:24:06.555481 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:06 crc kubenswrapper[4779]: I0320 15:24:06.555511 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:06 crc kubenswrapper[4779]: I0320 15:24:06.555520 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:06 crc kubenswrapper[4779]: I0320 15:24:06.555541 4779 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 15:24:06 crc kubenswrapper[4779]: E0320 15:24:06.558505 4779 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 15:24:06 crc kubenswrapper[4779]: I0320 15:24:06.755850 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 15:24:07 crc kubenswrapper[4779]: I0320 15:24:07.052732 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 15:24:07 crc kubenswrapper[4779]: I0320 15:24:07.756386 4779 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the 
cluster scope Mar 20 15:24:08 crc kubenswrapper[4779]: I0320 15:24:08.124157 4779 csr.go:261] certificate signing request csr-l285s is approved, waiting to be issued Mar 20 15:24:08 crc kubenswrapper[4779]: I0320 15:24:08.132790 4779 csr.go:257] certificate signing request csr-l285s is issued Mar 20 15:24:08 crc kubenswrapper[4779]: I0320 15:24:08.151184 4779 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:24:08 crc kubenswrapper[4779]: I0320 15:24:08.151351 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:24:08 crc kubenswrapper[4779]: I0320 15:24:08.152620 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:08 crc kubenswrapper[4779]: I0320 15:24:08.152679 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:08 crc kubenswrapper[4779]: I0320 15:24:08.152700 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:08 crc kubenswrapper[4779]: I0320 15:24:08.153688 4779 scope.go:117] "RemoveContainer" containerID="8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9" Mar 20 15:24:08 crc kubenswrapper[4779]: E0320 15:24:08.154036 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 15:24:08 crc kubenswrapper[4779]: I0320 15:24:08.209342 4779 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 20 15:24:08 crc kubenswrapper[4779]: I0320 
15:24:08.582903 4779 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 20 15:24:09 crc kubenswrapper[4779]: I0320 15:24:09.134460 4779 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-03 12:22:55.300712002 +0000 UTC Mar 20 15:24:09 crc kubenswrapper[4779]: I0320 15:24:09.134527 4779 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6188h58m46.166191093s for next certificate rotation Mar 20 15:24:09 crc kubenswrapper[4779]: I0320 15:24:09.846434 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:24:09 crc kubenswrapper[4779]: I0320 15:24:09.846617 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:24:09 crc kubenswrapper[4779]: I0320 15:24:09.848078 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:09 crc kubenswrapper[4779]: I0320 15:24:09.848230 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:09 crc kubenswrapper[4779]: I0320 15:24:09.848722 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:09 crc kubenswrapper[4779]: I0320 15:24:09.850860 4779 scope.go:117] "RemoveContainer" containerID="8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9" Mar 20 15:24:09 crc kubenswrapper[4779]: E0320 15:24:09.851558 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 15:24:13 crc kubenswrapper[4779]: I0320 15:24:13.559584 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:24:13 crc kubenswrapper[4779]: I0320 15:24:13.561294 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:13 crc kubenswrapper[4779]: I0320 15:24:13.561330 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:13 crc kubenswrapper[4779]: I0320 15:24:13.561342 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:13 crc kubenswrapper[4779]: I0320 15:24:13.561440 4779 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 15:24:13 crc kubenswrapper[4779]: I0320 15:24:13.573240 4779 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 20 15:24:13 crc kubenswrapper[4779]: I0320 15:24:13.573518 4779 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 20 15:24:13 crc kubenswrapper[4779]: E0320 15:24:13.573541 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 20 15:24:13 crc kubenswrapper[4779]: I0320 15:24:13.577534 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:13 crc kubenswrapper[4779]: I0320 15:24:13.577562 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:13 crc kubenswrapper[4779]: I0320 15:24:13.577572 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:13 
crc kubenswrapper[4779]: I0320 15:24:13.577585 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:13 crc kubenswrapper[4779]: I0320 15:24:13.577610 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:13Z","lastTransitionTime":"2026-03-20T15:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:13 crc kubenswrapper[4779]: E0320 15:24:13.589805 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:13Z\\\",\\\"message\\\":\\\"kubelet has 
sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff54
7002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\
"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cd
fc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"nam
es\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:13 crc kubenswrapper[4779]: I0320 15:24:13.596825 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:13 crc kubenswrapper[4779]: I0320 15:24:13.596858 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:13 crc kubenswrapper[4779]: I0320 15:24:13.596866 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:13 crc kubenswrapper[4779]: I0320 15:24:13.596880 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:13 crc kubenswrapper[4779]: I0320 15:24:13.596888 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:13Z","lastTransitionTime":"2026-03-20T15:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:13 crc kubenswrapper[4779]: E0320 15:24:13.608673 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:13 crc kubenswrapper[4779]: I0320 15:24:13.617000 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:13 crc kubenswrapper[4779]: I0320 15:24:13.617031 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:13 crc kubenswrapper[4779]: I0320 15:24:13.617041 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:13 crc kubenswrapper[4779]: I0320 15:24:13.617055 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:13 crc kubenswrapper[4779]: I0320 15:24:13.617066 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:13Z","lastTransitionTime":"2026-03-20T15:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:13 crc kubenswrapper[4779]: E0320 15:24:13.627407 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:13 crc kubenswrapper[4779]: I0320 15:24:13.634086 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:13 crc kubenswrapper[4779]: I0320 15:24:13.634176 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:13 crc kubenswrapper[4779]: I0320 15:24:13.634189 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:13 crc kubenswrapper[4779]: I0320 15:24:13.634205 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:13 crc kubenswrapper[4779]: I0320 15:24:13.634217 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:13Z","lastTransitionTime":"2026-03-20T15:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:13 crc kubenswrapper[4779]: E0320 15:24:13.644577 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:13 crc kubenswrapper[4779]: E0320 15:24:13.644793 4779 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 15:24:13 crc kubenswrapper[4779]: E0320 15:24:13.644823 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:13 crc kubenswrapper[4779]: E0320 15:24:13.745159 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:13 crc kubenswrapper[4779]: I0320 15:24:13.808727 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:24:13 crc kubenswrapper[4779]: I0320 15:24:13.810097 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:13 crc kubenswrapper[4779]: I0320 15:24:13.810287 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:13 crc kubenswrapper[4779]: I0320 15:24:13.810399 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:13 crc kubenswrapper[4779]: E0320 15:24:13.846166 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:13 crc kubenswrapper[4779]: E0320 15:24:13.919728 4779 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 15:24:13 crc kubenswrapper[4779]: E0320 15:24:13.946989 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:14 crc kubenswrapper[4779]: 
E0320 15:24:14.047926 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:14 crc kubenswrapper[4779]: E0320 15:24:14.148274 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:14 crc kubenswrapper[4779]: E0320 15:24:14.249053 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:14 crc kubenswrapper[4779]: E0320 15:24:14.349839 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:14 crc kubenswrapper[4779]: E0320 15:24:14.450832 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:14 crc kubenswrapper[4779]: I0320 15:24:14.503094 4779 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 15:24:14 crc kubenswrapper[4779]: E0320 15:24:14.551515 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:14 crc kubenswrapper[4779]: E0320 15:24:14.652219 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:14 crc kubenswrapper[4779]: E0320 15:24:14.753467 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:14 crc kubenswrapper[4779]: E0320 15:24:14.854359 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:14 crc kubenswrapper[4779]: E0320 15:24:14.954610 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:15 crc kubenswrapper[4779]: E0320 15:24:15.055040 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Mar 20 15:24:15 crc kubenswrapper[4779]: E0320 15:24:15.155264 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:15 crc kubenswrapper[4779]: E0320 15:24:15.256153 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:15 crc kubenswrapper[4779]: E0320 15:24:15.356863 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:15 crc kubenswrapper[4779]: E0320 15:24:15.457599 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:15 crc kubenswrapper[4779]: E0320 15:24:15.558133 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:15 crc kubenswrapper[4779]: E0320 15:24:15.658295 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:15 crc kubenswrapper[4779]: E0320 15:24:15.758642 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:15 crc kubenswrapper[4779]: E0320 15:24:15.859608 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:15 crc kubenswrapper[4779]: E0320 15:24:15.960678 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:16 crc kubenswrapper[4779]: E0320 15:24:16.061740 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:16 crc kubenswrapper[4779]: E0320 15:24:16.162802 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:16 crc kubenswrapper[4779]: E0320 15:24:16.263849 4779 kubelet_node_status.go:503] "Error getting the current node 
from lister" err="node \"crc\" not found" Mar 20 15:24:16 crc kubenswrapper[4779]: E0320 15:24:16.364893 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:16 crc kubenswrapper[4779]: E0320 15:24:16.466018 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:16 crc kubenswrapper[4779]: E0320 15:24:16.566890 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:16 crc kubenswrapper[4779]: E0320 15:24:16.667552 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:16 crc kubenswrapper[4779]: E0320 15:24:16.768175 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:16 crc kubenswrapper[4779]: E0320 15:24:16.868974 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:16 crc kubenswrapper[4779]: E0320 15:24:16.969392 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:17 crc kubenswrapper[4779]: E0320 15:24:17.070160 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:17 crc kubenswrapper[4779]: E0320 15:24:17.170847 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:17 crc kubenswrapper[4779]: E0320 15:24:17.271926 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:17 crc kubenswrapper[4779]: E0320 15:24:17.372947 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:17 crc kubenswrapper[4779]: E0320 15:24:17.474048 4779 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:17 crc kubenswrapper[4779]: E0320 15:24:17.574785 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:17 crc kubenswrapper[4779]: E0320 15:24:17.675570 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:17 crc kubenswrapper[4779]: E0320 15:24:17.776666 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:17 crc kubenswrapper[4779]: E0320 15:24:17.877359 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:17 crc kubenswrapper[4779]: E0320 15:24:17.978369 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:18 crc kubenswrapper[4779]: E0320 15:24:18.078763 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:18 crc kubenswrapper[4779]: E0320 15:24:18.179612 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:18 crc kubenswrapper[4779]: E0320 15:24:18.280370 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:18 crc kubenswrapper[4779]: E0320 15:24:18.380616 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:18 crc kubenswrapper[4779]: E0320 15:24:18.480842 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:18 crc kubenswrapper[4779]: E0320 15:24:18.581880 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:18 crc 
kubenswrapper[4779]: E0320 15:24:18.682560 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:18 crc kubenswrapper[4779]: E0320 15:24:18.783247 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:18 crc kubenswrapper[4779]: E0320 15:24:18.884028 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:18 crc kubenswrapper[4779]: E0320 15:24:18.984910 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:19 crc kubenswrapper[4779]: E0320 15:24:19.086036 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:19 crc kubenswrapper[4779]: E0320 15:24:19.186472 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:19 crc kubenswrapper[4779]: E0320 15:24:19.287059 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:19 crc kubenswrapper[4779]: E0320 15:24:19.388033 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:19 crc kubenswrapper[4779]: E0320 15:24:19.488410 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:19 crc kubenswrapper[4779]: E0320 15:24:19.589298 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:19 crc kubenswrapper[4779]: E0320 15:24:19.690402 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:19 crc kubenswrapper[4779]: E0320 15:24:19.791175 4779 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 20 15:24:19 crc kubenswrapper[4779]: E0320 15:24:19.891858 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:19 crc kubenswrapper[4779]: E0320 15:24:19.992795 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:20 crc kubenswrapper[4779]: E0320 15:24:20.093209 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:20 crc kubenswrapper[4779]: E0320 15:24:20.193299 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:20 crc kubenswrapper[4779]: E0320 15:24:20.294090 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:20 crc kubenswrapper[4779]: E0320 15:24:20.394347 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:20 crc kubenswrapper[4779]: E0320 15:24:20.495324 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:20 crc kubenswrapper[4779]: E0320 15:24:20.596317 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:20 crc kubenswrapper[4779]: E0320 15:24:20.696387 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:20 crc kubenswrapper[4779]: E0320 15:24:20.797813 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:20 crc kubenswrapper[4779]: I0320 15:24:20.808563 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:24:20 crc kubenswrapper[4779]: I0320 15:24:20.809872 4779 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:20 crc kubenswrapper[4779]: I0320 15:24:20.809914 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:20 crc kubenswrapper[4779]: I0320 15:24:20.809925 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:20 crc kubenswrapper[4779]: E0320 15:24:20.899008 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:20 crc kubenswrapper[4779]: E0320 15:24:20.999847 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:21 crc kubenswrapper[4779]: E0320 15:24:21.100635 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:21 crc kubenswrapper[4779]: E0320 15:24:21.201612 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:21 crc kubenswrapper[4779]: E0320 15:24:21.302516 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:21 crc kubenswrapper[4779]: E0320 15:24:21.403690 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:21 crc kubenswrapper[4779]: E0320 15:24:21.504729 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:21 crc kubenswrapper[4779]: E0320 15:24:21.606073 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:21 crc kubenswrapper[4779]: E0320 15:24:21.707350 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:21 crc kubenswrapper[4779]: E0320 
15:24:21.808232 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:21 crc kubenswrapper[4779]: E0320 15:24:21.908459 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:22 crc kubenswrapper[4779]: E0320 15:24:22.009015 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:22 crc kubenswrapper[4779]: E0320 15:24:22.109770 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:22 crc kubenswrapper[4779]: E0320 15:24:22.210468 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:22 crc kubenswrapper[4779]: E0320 15:24:22.311222 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:22 crc kubenswrapper[4779]: E0320 15:24:22.412073 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:22 crc kubenswrapper[4779]: E0320 15:24:22.513067 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:22 crc kubenswrapper[4779]: E0320 15:24:22.613590 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:22 crc kubenswrapper[4779]: E0320 15:24:22.714449 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:22 crc kubenswrapper[4779]: E0320 15:24:22.815388 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:22 crc kubenswrapper[4779]: E0320 15:24:22.934430 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 
15:24:23 crc kubenswrapper[4779]: E0320 15:24:23.034609 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:23 crc kubenswrapper[4779]: E0320 15:24:23.135988 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:23 crc kubenswrapper[4779]: E0320 15:24:23.236886 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:23 crc kubenswrapper[4779]: E0320 15:24:23.337920 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:23 crc kubenswrapper[4779]: E0320 15:24:23.438012 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:23 crc kubenswrapper[4779]: E0320 15:24:23.539234 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:23 crc kubenswrapper[4779]: E0320 15:24:23.639949 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:23 crc kubenswrapper[4779]: E0320 15:24:23.740269 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:23 crc kubenswrapper[4779]: I0320 15:24:23.808190 4779 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 15:24:23 crc kubenswrapper[4779]: I0320 15:24:23.809687 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:23 crc kubenswrapper[4779]: I0320 15:24:23.809829 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:23 crc kubenswrapper[4779]: I0320 15:24:23.809909 4779 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 20 15:24:23 crc kubenswrapper[4779]: I0320 15:24:23.810672 4779 scope.go:117] "RemoveContainer" containerID="8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9" Mar 20 15:24:23 crc kubenswrapper[4779]: E0320 15:24:23.810952 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 15:24:23 crc kubenswrapper[4779]: E0320 15:24:23.840938 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:23 crc kubenswrapper[4779]: E0320 15:24:23.866178 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 20 15:24:23 crc kubenswrapper[4779]: I0320 15:24:23.871741 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:23 crc kubenswrapper[4779]: I0320 15:24:23.871789 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:23 crc kubenswrapper[4779]: I0320 15:24:23.871809 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:23 crc kubenswrapper[4779]: I0320 15:24:23.871830 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:23 crc kubenswrapper[4779]: I0320 15:24:23.871845 4779 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:23Z","lastTransitionTime":"2026-03-20T15:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:23 crc kubenswrapper[4779]: E0320 15:24:23.884173 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:23 crc kubenswrapper[4779]: I0320 15:24:23.888563 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:23 crc kubenswrapper[4779]: I0320 15:24:23.888603 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:23 crc kubenswrapper[4779]: I0320 15:24:23.888614 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:23 crc kubenswrapper[4779]: I0320 15:24:23.888636 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:23 crc kubenswrapper[4779]: I0320 15:24:23.888647 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:23Z","lastTransitionTime":"2026-03-20T15:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:23 crc kubenswrapper[4779]: E0320 15:24:23.900049 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:23 crc kubenswrapper[4779]: I0320 15:24:23.904463 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:23 crc kubenswrapper[4779]: I0320 15:24:23.904632 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:23 crc kubenswrapper[4779]: I0320 15:24:23.904729 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:23 crc kubenswrapper[4779]: I0320 15:24:23.904815 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:23 crc kubenswrapper[4779]: I0320 15:24:23.904904 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:23Z","lastTransitionTime":"2026-03-20T15:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:23 crc kubenswrapper[4779]: E0320 15:24:23.915637 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:23 crc kubenswrapper[4779]: I0320 15:24:23.919428 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:23 crc kubenswrapper[4779]: I0320 15:24:23.919462 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:23 crc kubenswrapper[4779]: I0320 15:24:23.919474 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:23 crc kubenswrapper[4779]: I0320 15:24:23.919490 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:23 crc kubenswrapper[4779]: I0320 15:24:23.919502 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:23Z","lastTransitionTime":"2026-03-20T15:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:23 crc kubenswrapper[4779]: E0320 15:24:23.920223 4779 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 15:24:23 crc kubenswrapper[4779]: E0320 15:24:23.935430 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-m
arketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc
0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\
\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b7
2a2fc8bc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:23 crc kubenswrapper[4779]: E0320 15:24:23.935540 4779 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 15:24:23 crc kubenswrapper[4779]: E0320 15:24:23.941548 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:24 crc kubenswrapper[4779]: E0320 15:24:24.042310 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:24 crc kubenswrapper[4779]: E0320 15:24:24.143330 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:24 crc kubenswrapper[4779]: E0320 15:24:24.243440 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:24 crc kubenswrapper[4779]: E0320 15:24:24.344497 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:24 crc kubenswrapper[4779]: E0320 15:24:24.444903 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:24 crc kubenswrapper[4779]: E0320 15:24:24.545817 4779 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.559689 4779 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.647926 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:24 crc kubenswrapper[4779]: 
I0320 15:24:24.647956 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.647965 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.647978 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.647988 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:24Z","lastTransitionTime":"2026-03-20T15:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.749727 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.749785 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.749796 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.749811 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.749823 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:24Z","lastTransitionTime":"2026-03-20T15:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.770400 4779 apiserver.go:52] "Watching apiserver" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.778419 4779 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.778960 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-lfj25","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-ovn-kubernetes/ovnkube-node-bl2w2","openshift-dns/node-resolver-9dnjh","openshift-image-registry/node-ca-l9j6r","openshift-multus/network-metrics-daemon-l4gtx","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-machine-config-operator/machine-config-daemon-fs4qg","openshift-multus/multus-additional-cni-plugins-clzkt","openshift-network-node-identity/network-node-identity-vrzqb","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw"] Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.779439 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.779752 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.779822 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:24:24 crc kubenswrapper[4779]: E0320 15:24:24.779814 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.786056 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 15:24:24 crc kubenswrapper[4779]: E0320 15:24:24.779872 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.786161 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-lfj25" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.786240 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.786267 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 15:24:24 crc kubenswrapper[4779]: E0320 15:24:24.786309 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.786582 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.786652 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-l9j6r" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.786702 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.787052 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9dnjh" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.787364 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.787595 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:24:24 crc kubenswrapper[4779]: E0320 15:24:24.787663 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.788242 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.788321 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.788437 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-clzkt" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.790099 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.792919 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.793381 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.793594 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.793887 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.794085 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.795221 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.795425 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.795704 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.795887 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.796234 4779 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.796634 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.797051 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.798687 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.798797 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.799973 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.800074 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.800150 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.800205 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.800157 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.800266 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 
15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.800407 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.800425 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.800454 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.800494 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.800525 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.800566 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.800591 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.800611 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.800859 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.800872 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.801002 4779 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.801022 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.801200 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.801538 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.818634 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.829604 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers 
with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.839026 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.848571 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.851638 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.851682 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.851690 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.851702 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.851712 4779 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:24Z","lastTransitionTime":"2026-03-20T15:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.854196 4779 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.858080 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.866838 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.872474 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.881626 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.889410 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.896311 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.916682 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.931080 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.945941 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948250 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948303 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod 
\"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948327 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948353 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948382 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948403 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948422 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948445 4779 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948468 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948488 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948511 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948530 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948548 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948565 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948582 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948597 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948613 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948634 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948650 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948664 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948679 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948694 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948709 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948727 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948741 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948759 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948775 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948791 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948810 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948826 4779 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948841 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948856 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948871 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948885 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948901 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948919 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948935 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948949 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948964 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.948979 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949020 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949037 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949052 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949067 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949081 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949096 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949135 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949153 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949190 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949205 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949219 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949234 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949248 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949264 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949279 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949294 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949312 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 
15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949327 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949344 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949360 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949374 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949389 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949405 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949421 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949437 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949453 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949470 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949486 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 
15:24:24.949505 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949521 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949538 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949552 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949568 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949584 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949600 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949615 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949631 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949647 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949663 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949678 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949895 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.949984 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.950074 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.950178 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). 
InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.950173 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.950432 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.950489 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.950490 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.950504 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.950656 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.950938 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.950659 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.950657 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.950752 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.950766 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.950983 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.950806 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.950920 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.950986 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.950933 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.951148 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.951210 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.951231 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.951242 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.951336 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.951408 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.951511 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.951948 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.951961 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.951990 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952013 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952034 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952054 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952074 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952098 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952102 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952152 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952173 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952195 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952212 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952220 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952261 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952364 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952388 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952411 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952438 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952463 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952488 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952512 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952531 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952550 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952576 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952597 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" 
(UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952613 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952630 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952645 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952668 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952684 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 15:24:24 crc 
kubenswrapper[4779]: I0320 15:24:24.952700 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952716 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952732 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952749 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952766 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952782 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952797 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952811 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952828 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952847 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952868 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 
15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952890 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952891 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952920 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.952933 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.953030 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.953271 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.953328 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.953306 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.953388 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.953425 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.953439 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.953454 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). 
InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.953485 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.953463 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.953536 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.953555 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.953604 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: 
\"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.953627 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.953669 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.953779 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.953792 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.953821 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.953901 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.955467 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.955714 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.955773 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.955867 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.955931 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.954051 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.955937 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.953972 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.954092 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.954249 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.954341 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.954531 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.954676 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.954660 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.954784 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.954885 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.954926 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.954941 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: E0320 15:24:24.955067 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:24:25.455043513 +0000 UTC m=+82.417559523 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.956076 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.956129 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.956158 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 
15:24:24.956182 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.956204 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.956224 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.956244 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.956253 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.956267 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.956278 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:24Z","lastTransitionTime":"2026-03-20T15:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.956408 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.956462 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.956229 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.956516 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.956645 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). 
InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.956705 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.956735 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.956761 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.956787 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.956808 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.956834 4779 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.956863 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.956994 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957034 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957241 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957272 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod 
\"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957296 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957324 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957350 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957373 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957396 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957418 4779 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957441 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957464 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957489 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957515 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957638 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod 
\"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957679 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957704 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957728 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957750 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957773 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957797 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957823 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957846 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957878 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957903 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957925 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 15:24:24 crc 
kubenswrapper[4779]: I0320 15:24:24.957947 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957973 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957996 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.958021 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.958045 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.958071 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.958096 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.958136 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.958160 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.958557 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.958583 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 15:24:24 crc 
kubenswrapper[4779]: I0320 15:24:24.958605 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.958630 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.958652 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.958691 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.958714 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.958741 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.958764 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.958785 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.958809 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.958832 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.958858 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.958880 4779 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.958902 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.958922 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.958978 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.959009 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-host-run-netns\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.959163 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcxgv\" (UniqueName: 
\"kubernetes.io/projected/f27b5011-2d73-40e1-b508-a10e9c6f19a8-kube-api-access-kcxgv\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.959194 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c30ee189-9db1-41af-8a55-29955cbf6712-cni-binary-copy\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.959222 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-host-var-lib-cni-multus\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.959961 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/46120a91-b00c-4299-b552-a374d2a78726-host\") pod \"node-ca-l9j6r\" (UID: \"46120a91-b00c-4299-b552-a374d2a78726\") " pod="openshift-image-registry/node-ca-l9j6r" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.960293 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.960348 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" 
(UniqueName: \"kubernetes.io/host-path/e7a20ef3-86de-4db2-b500-63af002500b4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-clzkt\" (UID: \"e7a20ef3-86de-4db2-b500-63af002500b4\") " pod="openshift-multus/multus-additional-cni-plugins-clzkt" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.960377 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.960401 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n4l9\" (UniqueName: \"kubernetes.io/projected/c30ee189-9db1-41af-8a55-29955cbf6712-kube-api-access-9n4l9\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.960423 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-run-netns\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.960458 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.960482 4779 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-etc-kubernetes\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.960621 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-run-systemd\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.960656 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.960681 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.960706 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 
15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.960729 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-multus-cni-dir\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.960751 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-host-var-lib-kubelet\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.960775 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-run-ovn-kubernetes\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.960801 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e7a20ef3-86de-4db2-b500-63af002500b4-system-cni-dir\") pod \"multus-additional-cni-plugins-clzkt\" (UID: \"e7a20ef3-86de-4db2-b500-63af002500b4\") " pod="openshift-multus/multus-additional-cni-plugins-clzkt" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.960823 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9addf988-9b4c-4e5e-a5fe-793bab35a52f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-92wlw\" (UID: 
\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.960846 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-cni-netd\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.960865 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f27b5011-2d73-40e1-b508-a10e9c6f19a8-ovnkube-script-lib\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.960885 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-host-var-lib-cni-bin\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.960905 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-node-log\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.960925 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e7a20ef3-86de-4db2-b500-63af002500b4-os-release\") pod 
\"multus-additional-cni-plugins-clzkt\" (UID: \"e7a20ef3-86de-4db2-b500-63af002500b4\") " pod="openshift-multus/multus-additional-cni-plugins-clzkt" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.960949 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e7a20ef3-86de-4db2-b500-63af002500b4-cni-binary-copy\") pod \"multus-additional-cni-plugins-clzkt\" (UID: \"e7a20ef3-86de-4db2-b500-63af002500b4\") " pod="openshift-multus/multus-additional-cni-plugins-clzkt" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.960970 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f27b5011-2d73-40e1-b508-a10e9c6f19a8-ovn-node-metrics-cert\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.961022 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9addf988-9b4c-4e5e-a5fe-793bab35a52f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-92wlw\" (UID: \"9addf988-9b4c-4e5e-a5fe-793bab35a52f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.961050 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.961326 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-kubelet\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.961355 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-var-lib-openvswitch\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.961377 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-host-run-k8s-cni-cncf-io\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.961399 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-multus-conf-dir\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.961421 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-slash\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.961447 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.961468 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5293026d-bcf7-4270-8ad8-59a90e70ab1a-hosts-file\") pod \"node-resolver-9dnjh\" (UID: \"5293026d-bcf7-4270-8ad8-59a90e70ab1a\") " pod="openshift-dns/node-resolver-9dnjh" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.961491 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-etc-openvswitch\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.961512 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-hostroot\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.961535 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/451fc579-db57-4b36-a775-6d2986de3efc-mcd-auth-proxy-config\") pod \"machine-config-daemon-fs4qg\" (UID: \"451fc579-db57-4b36-a775-6d2986de3efc\") " pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.961554 4779 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c30ee189-9db1-41af-8a55-29955cbf6712-multus-daemon-config\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.962715 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/46120a91-b00c-4299-b552-a374d2a78726-serviceca\") pod \"node-ca-l9j6r\" (UID: \"46120a91-b00c-4299-b552-a374d2a78726\") " pod="openshift-image-registry/node-ca-l9j6r" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.962777 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9addf988-9b4c-4e5e-a5fe-793bab35a52f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-92wlw\" (UID: \"9addf988-9b4c-4e5e-a5fe-793bab35a52f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.962801 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxnrk\" (UniqueName: \"kubernetes.io/projected/9addf988-9b4c-4e5e-a5fe-793bab35a52f-kube-api-access-cxnrk\") pod \"ovnkube-control-plane-749d76644c-92wlw\" (UID: \"9addf988-9b4c-4e5e-a5fe-793bab35a52f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.962827 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 
15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.962854 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.962881 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-host-run-multus-certs\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.962905 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkfvw\" (UniqueName: \"kubernetes.io/projected/46120a91-b00c-4299-b552-a374d2a78726-kube-api-access-qkfvw\") pod \"node-ca-l9j6r\" (UID: \"46120a91-b00c-4299-b552-a374d2a78726\") " pod="openshift-image-registry/node-ca-l9j6r" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.962926 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-run-ovn\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.962922 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.962948 4779 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnnk6\" (UniqueName: \"kubernetes.io/projected/e7a20ef3-86de-4db2-b500-63af002500b4-kube-api-access-rnnk6\") pod \"multus-additional-cni-plugins-clzkt\" (UID: \"e7a20ef3-86de-4db2-b500-63af002500b4\") " pod="openshift-multus/multus-additional-cni-plugins-clzkt" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.963119 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/451fc579-db57-4b36-a775-6d2986de3efc-rootfs\") pod \"machine-config-daemon-fs4qg\" (UID: \"451fc579-db57-4b36-a775-6d2986de3efc\") " pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.963150 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/451fc579-db57-4b36-a775-6d2986de3efc-proxy-tls\") pod \"machine-config-daemon-fs4qg\" (UID: \"451fc579-db57-4b36-a775-6d2986de3efc\") " pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.956805 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.963207 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.956808 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.955151 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.955166 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.955316 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.955334 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.956821 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957122 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.955140 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957284 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957306 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957327 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957461 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957476 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957467 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957508 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957529 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957533 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957810 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957777 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957901 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.957970 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.958220 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.958493 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.958516 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.958573 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.958758 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.958658 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.958799 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.958967 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.959154 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.959526 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.959583 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.959599 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.959607 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.959727 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.959960 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.960020 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.960129 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.960189 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.960397 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.961207 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.961497 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.961610 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.961641 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.961958 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.962124 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.962144 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.962242 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.962276 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.962340 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.963085 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.963365 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.963531 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.963378 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.963608 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.963388 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.963511 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.963395 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.963511 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.963754 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l595l\" (UniqueName: \"kubernetes.io/projected/5293026d-bcf7-4270-8ad8-59a90e70ab1a-kube-api-access-l595l\") pod \"node-resolver-9dnjh\" (UID: \"5293026d-bcf7-4270-8ad8-59a90e70ab1a\") " pod="openshift-dns/node-resolver-9dnjh" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.963794 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blhdn\" (UniqueName: \"kubernetes.io/projected/44bdd151-2a1e-4f14-a095-81b541307138-kube-api-access-blhdn\") pod \"network-metrics-daemon-l4gtx\" (UID: \"44bdd151-2a1e-4f14-a095-81b541307138\") " pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.963819 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-systemd-units\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.964146 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f27b5011-2d73-40e1-b508-a10e9c6f19a8-ovnkube-config\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.964178 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkxk2\" (UniqueName: \"kubernetes.io/projected/451fc579-db57-4b36-a775-6d2986de3efc-kube-api-access-lkxk2\") pod 
\"machine-config-daemon-fs4qg\" (UID: \"451fc579-db57-4b36-a775-6d2986de3efc\") " pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.964203 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.964229 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44bdd151-2a1e-4f14-a095-81b541307138-metrics-certs\") pod \"network-metrics-daemon-l4gtx\" (UID: \"44bdd151-2a1e-4f14-a095-81b541307138\") " pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.964252 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-run-openvswitch\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.964276 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e7a20ef3-86de-4db2-b500-63af002500b4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-clzkt\" (UID: \"e7a20ef3-86de-4db2-b500-63af002500b4\") " pod="openshift-multus/multus-additional-cni-plugins-clzkt" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.964307 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" 
(UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.964330 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-cnibin\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.964356 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-os-release\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.964372 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f27b5011-2d73-40e1-b508-a10e9c6f19a8-env-overrides\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.964396 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e7a20ef3-86de-4db2-b500-63af002500b4-cnibin\") pod \"multus-additional-cni-plugins-clzkt\" (UID: \"e7a20ef3-86de-4db2-b500-63af002500b4\") " pod="openshift-multus/multus-additional-cni-plugins-clzkt" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.964416 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-log-socket\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.964455 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-cni-bin\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.964481 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.964508 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.964529 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-system-cni-dir\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.964554 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-multus-socket-dir-parent\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.964879 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.964986 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.965664 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.965689 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.965716 4779 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.965968 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.966741 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.967137 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.967202 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.967453 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.969318 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.969393 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.969431 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.970325 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.970810 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.970960 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.975323 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.975458 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.975645 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.977143 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.977671 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.977930 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.978766 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.979466 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.979509 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.979533 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.979548 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.979644 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.979771 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.979666 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.979966 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.980213 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.980252 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.980486 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: E0320 15:24:24.980545 4779 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 15:24:24 crc kubenswrapper[4779]: E0320 15:24:24.982260 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 15:24:25.48223864 +0000 UTC m=+82.444754440 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.982385 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.982628 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.982876 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.980961 4779 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.982916 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.982932 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.982950 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.982962 4779 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.982975 4779 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.982985 4779 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.982994 4779 
reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.983003 4779 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.983012 4779 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.983021 4779 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.983029 4779 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.983038 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.983037 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.980555 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.980698 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.980713 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.980730 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.981009 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.981012 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.981082 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.981138 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.981217 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.981972 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.983274 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.983351 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.983470 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.983047 4779 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.983507 4779 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.983554 4779 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.983569 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.983583 4779 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.983595 4779 reconciler_common.go:293] 
"Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.983610 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.983623 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.983635 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.983648 4779 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.983662 4779 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.983680 4779 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.983693 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: 
\"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.983705 4779 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.983718 4779 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.983730 4779 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.983742 4779 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.983760 4779 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.983773 4779 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.983797 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.983786 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984186 4779 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984207 4779 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984220 4779 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984234 4779 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984247 4779 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984259 4779 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984271 4779 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984283 4779 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984294 4779 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984305 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984322 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984334 4779 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984346 4779 reconciler_common.go:293] "Volume detached for volume 
\"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984357 4779 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984369 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984382 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984400 4779 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984412 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984425 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984441 4779 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984453 4779 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984464 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984475 4779 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984562 4779 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984598 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984611 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984623 4779 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 
15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984635 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984646 4779 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984658 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984669 4779 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984680 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984696 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984712 4779 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984724 4779 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984736 4779 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984748 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984760 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984772 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984784 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984795 4779 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984807 4779 reconciler_common.go:293] "Volume detached for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984822 4779 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984833 4779 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984846 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984857 4779 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984869 4779 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984881 4779 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984883 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" 
(UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984895 4779 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984907 4779 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984923 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:24 crc kubenswrapper[4779]: I0320 15:24:24.984935 4779 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.981049 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.985022 4779 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 
15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.985040 4779 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.985051 4779 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.985062 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.985074 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.985084 4779 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.985095 4779 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.985139 4779 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.985177 4779 
reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.985207 4779 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.985219 4779 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.985232 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.985243 4779 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.985255 4779 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.985273 4779 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.985285 4779 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.985296 4779 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.985307 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.985305 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.985319 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.985331 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.985346 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.985361 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.985379 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.985392 4779 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.985403 4779 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:25 crc 
kubenswrapper[4779]: I0320 15:24:24.985420 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.985471 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.985542 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.985568 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.985594 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.985732 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.985775 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.985829 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:24.985882 4779 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.985968 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.986245 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.986270 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:24.987190 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 15:24:25.487170567 +0000 UTC m=+82.449686367 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.987247 4779 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.987263 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.987274 4779 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.987296 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.988382 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:24.988534 4779 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:24.988555 4779 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:24.988568 4779 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:24.988617 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 15:24:25.488605721 +0000 UTC m=+82.451121521 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.989226 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.989474 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.990509 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.994350 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.997277 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.997641 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:24.999207 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.001330 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.005518 4779 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.005544 4779 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.005555 4779 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.005608 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 15:24:25.505590324 +0000 UTC m=+82.468106114 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.008945 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.014002 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.017296 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.025572 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.025650 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.058801 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.058963 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.059019 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.059129 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.059188 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:25Z","lastTransitionTime":"2026-03-20T15:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088161 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-multus-socket-dir-parent\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088196 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-log-socket\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088216 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-cni-bin\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088232 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-system-cni-dir\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088247 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-host-run-netns\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " 
pod="openshift-multus/multus-lfj25" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088260 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcxgv\" (UniqueName: \"kubernetes.io/projected/f27b5011-2d73-40e1-b508-a10e9c6f19a8-kube-api-access-kcxgv\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088275 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c30ee189-9db1-41af-8a55-29955cbf6712-cni-binary-copy\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088288 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-host-var-lib-cni-multus\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088281 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-multus-socket-dir-parent\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088330 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/46120a91-b00c-4299-b552-a374d2a78726-host\") pod \"node-ca-l9j6r\" (UID: \"46120a91-b00c-4299-b552-a374d2a78726\") " pod="openshift-image-registry/node-ca-l9j6r" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088301 4779 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/46120a91-b00c-4299-b552-a374d2a78726-host\") pod \"node-ca-l9j6r\" (UID: \"46120a91-b00c-4299-b552-a374d2a78726\") " pod="openshift-image-registry/node-ca-l9j6r" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088361 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-log-socket\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088382 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-cni-bin\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088390 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088418 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e7a20ef3-86de-4db2-b500-63af002500b4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-clzkt\" (UID: \"e7a20ef3-86de-4db2-b500-63af002500b4\") " pod="openshift-multus/multus-additional-cni-plugins-clzkt" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088447 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088464 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n4l9\" (UniqueName: \"kubernetes.io/projected/c30ee189-9db1-41af-8a55-29955cbf6712-kube-api-access-9n4l9\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088484 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-system-cni-dir\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088496 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-run-netns\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088505 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-host-run-netns\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088513 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-host-var-lib-kubelet\") pod 
\"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088532 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-etc-kubernetes\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088551 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-run-systemd\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088580 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-multus-cni-dir\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088597 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-run-ovn-kubernetes\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088613 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e7a20ef3-86de-4db2-b500-63af002500b4-system-cni-dir\") pod \"multus-additional-cni-plugins-clzkt\" (UID: \"e7a20ef3-86de-4db2-b500-63af002500b4\") " 
pod="openshift-multus/multus-additional-cni-plugins-clzkt" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088630 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9addf988-9b4c-4e5e-a5fe-793bab35a52f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-92wlw\" (UID: \"9addf988-9b4c-4e5e-a5fe-793bab35a52f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088646 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-cni-netd\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088663 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f27b5011-2d73-40e1-b508-a10e9c6f19a8-ovnkube-script-lib\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088682 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-host-var-lib-cni-bin\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088712 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-node-log\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088728 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e7a20ef3-86de-4db2-b500-63af002500b4-os-release\") pod \"multus-additional-cni-plugins-clzkt\" (UID: \"e7a20ef3-86de-4db2-b500-63af002500b4\") " pod="openshift-multus/multus-additional-cni-plugins-clzkt" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088744 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e7a20ef3-86de-4db2-b500-63af002500b4-cni-binary-copy\") pod \"multus-additional-cni-plugins-clzkt\" (UID: \"e7a20ef3-86de-4db2-b500-63af002500b4\") " pod="openshift-multus/multus-additional-cni-plugins-clzkt" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088761 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-var-lib-openvswitch\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088776 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f27b5011-2d73-40e1-b508-a10e9c6f19a8-ovn-node-metrics-cert\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088792 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9addf988-9b4c-4e5e-a5fe-793bab35a52f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-92wlw\" (UID: 
\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088810 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-kubelet\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088824 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-host-run-k8s-cni-cncf-io\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088841 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-multus-conf-dir\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088856 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-slash\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088880 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5293026d-bcf7-4270-8ad8-59a90e70ab1a-hosts-file\") pod \"node-resolver-9dnjh\" (UID: \"5293026d-bcf7-4270-8ad8-59a90e70ab1a\") " pod="openshift-dns/node-resolver-9dnjh" Mar 20 
15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088876 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-host-var-lib-cni-multus\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088895 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-etc-openvswitch\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088926 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-etc-openvswitch\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088944 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-hostroot\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088955 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-cni-netd\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.088976 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/451fc579-db57-4b36-a775-6d2986de3efc-mcd-auth-proxy-config\") pod \"machine-config-daemon-fs4qg\" (UID: \"451fc579-db57-4b36-a775-6d2986de3efc\") " pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.089002 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c30ee189-9db1-41af-8a55-29955cbf6712-multus-daemon-config\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.089025 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/46120a91-b00c-4299-b552-a374d2a78726-serviceca\") pod \"node-ca-l9j6r\" (UID: \"46120a91-b00c-4299-b552-a374d2a78726\") " pod="openshift-image-registry/node-ca-l9j6r" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.089048 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnnk6\" (UniqueName: \"kubernetes.io/projected/e7a20ef3-86de-4db2-b500-63af002500b4-kube-api-access-rnnk6\") pod \"multus-additional-cni-plugins-clzkt\" (UID: \"e7a20ef3-86de-4db2-b500-63af002500b4\") " pod="openshift-multus/multus-additional-cni-plugins-clzkt" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.089076 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9addf988-9b4c-4e5e-a5fe-793bab35a52f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-92wlw\" (UID: \"9addf988-9b4c-4e5e-a5fe-793bab35a52f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.089099 4779 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cxnrk\" (UniqueName: \"kubernetes.io/projected/9addf988-9b4c-4e5e-a5fe-793bab35a52f-kube-api-access-cxnrk\") pod \"ovnkube-control-plane-749d76644c-92wlw\" (UID: \"9addf988-9b4c-4e5e-a5fe-793bab35a52f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.089186 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-host-run-multus-certs\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.089208 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkfvw\" (UniqueName: \"kubernetes.io/projected/46120a91-b00c-4299-b552-a374d2a78726-kube-api-access-qkfvw\") pod \"node-ca-l9j6r\" (UID: \"46120a91-b00c-4299-b552-a374d2a78726\") " pod="openshift-image-registry/node-ca-l9j6r"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.089230 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-run-ovn\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.089337 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f27b5011-2d73-40e1-b508-a10e9c6f19a8-ovnkube-config\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.089366 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/451fc579-db57-4b36-a775-6d2986de3efc-rootfs\") pod \"machine-config-daemon-fs4qg\" (UID: \"451fc579-db57-4b36-a775-6d2986de3efc\") " pod="openshift-machine-config-operator/machine-config-daemon-fs4qg"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.089387 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/451fc579-db57-4b36-a775-6d2986de3efc-proxy-tls\") pod \"machine-config-daemon-fs4qg\" (UID: \"451fc579-db57-4b36-a775-6d2986de3efc\") " pod="openshift-machine-config-operator/machine-config-daemon-fs4qg"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.089409 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l595l\" (UniqueName: \"kubernetes.io/projected/5293026d-bcf7-4270-8ad8-59a90e70ab1a-kube-api-access-l595l\") pod \"node-resolver-9dnjh\" (UID: \"5293026d-bcf7-4270-8ad8-59a90e70ab1a\") " pod="openshift-dns/node-resolver-9dnjh"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.089427 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blhdn\" (UniqueName: \"kubernetes.io/projected/44bdd151-2a1e-4f14-a095-81b541307138-kube-api-access-blhdn\") pod \"network-metrics-daemon-l4gtx\" (UID: \"44bdd151-2a1e-4f14-a095-81b541307138\") " pod="openshift-multus/network-metrics-daemon-l4gtx"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.089446 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-systemd-units\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.089465 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e7a20ef3-86de-4db2-b500-63af002500b4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-clzkt\" (UID: \"e7a20ef3-86de-4db2-b500-63af002500b4\") " pod="openshift-multus/multus-additional-cni-plugins-clzkt"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.089485 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkxk2\" (UniqueName: \"kubernetes.io/projected/451fc579-db57-4b36-a775-6d2986de3efc-kube-api-access-lkxk2\") pod \"machine-config-daemon-fs4qg\" (UID: \"451fc579-db57-4b36-a775-6d2986de3efc\") " pod="openshift-machine-config-operator/machine-config-daemon-fs4qg"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.089504 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.089527 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44bdd151-2a1e-4f14-a095-81b541307138-metrics-certs\") pod \"network-metrics-daemon-l4gtx\" (UID: \"44bdd151-2a1e-4f14-a095-81b541307138\") " pod="openshift-multus/network-metrics-daemon-l4gtx"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.089550 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-run-openvswitch\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.089571 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e7a20ef3-86de-4db2-b500-63af002500b4-cnibin\") pod \"multus-additional-cni-plugins-clzkt\" (UID: \"e7a20ef3-86de-4db2-b500-63af002500b4\") " pod="openshift-multus/multus-additional-cni-plugins-clzkt"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.089602 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-cnibin\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.089623 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-os-release\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.089626 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c30ee189-9db1-41af-8a55-29955cbf6712-cni-binary-copy\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.089645 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f27b5011-2d73-40e1-b508-a10e9c6f19a8-env-overrides\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.089716 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-kubelet\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.089771 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e7a20ef3-86de-4db2-b500-63af002500b4-os-release\") pod \"multus-additional-cni-plugins-clzkt\" (UID: \"e7a20ef3-86de-4db2-b500-63af002500b4\") " pod="openshift-multus/multus-additional-cni-plugins-clzkt"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.089798 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-run-netns\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.089827 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c30ee189-9db1-41af-8a55-29955cbf6712-multus-daemon-config\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.089839 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f27b5011-2d73-40e1-b508-a10e9c6f19a8-ovnkube-script-lib\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.089883 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-multus-conf-dir\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.089913 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-slash\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.090005 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-host-var-lib-cni-bin\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.090023 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5293026d-bcf7-4270-8ad8-59a90e70ab1a-hosts-file\") pod \"node-resolver-9dnjh\" (UID: \"5293026d-bcf7-4270-8ad8-59a90e70ab1a\") " pod="openshift-dns/node-resolver-9dnjh"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.089046 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-hostroot\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.090183 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.090229 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-run-openvswitch\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.090170 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-run-systemd\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.090228 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-run-ovn\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2"
Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.090340 4779 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.090377 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44bdd151-2a1e-4f14-a095-81b541307138-metrics-certs podName:44bdd151-2a1e-4f14-a095-81b541307138 nodeName:}" failed. No retries permitted until 2026-03-20 15:24:25.590365891 +0000 UTC m=+82.552881681 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/44bdd151-2a1e-4f14-a095-81b541307138-metrics-certs") pod "network-metrics-daemon-l4gtx" (UID: "44bdd151-2a1e-4f14-a095-81b541307138") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.090449 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-etc-kubernetes\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.090468 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-os-release\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.090479 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-node-log\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.090499 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-host-var-lib-kubelet\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.090527 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e7a20ef3-86de-4db2-b500-63af002500b4-system-cni-dir\") pod \"multus-additional-cni-plugins-clzkt\" (UID: \"e7a20ef3-86de-4db2-b500-63af002500b4\") " pod="openshift-multus/multus-additional-cni-plugins-clzkt"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.090559 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.090578 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-systemd-units\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.090592 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-host-run-k8s-cni-cncf-io\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.090609 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e7a20ef3-86de-4db2-b500-63af002500b4-cnibin\") pod \"multus-additional-cni-plugins-clzkt\" (UID: \"e7a20ef3-86de-4db2-b500-63af002500b4\") " pod="openshift-multus/multus-additional-cni-plugins-clzkt"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.090632 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-cnibin\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.090753 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-multus-cni-dir\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.090783 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-var-lib-openvswitch\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.091155 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-run-ovn-kubernetes\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.091168 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/451fc579-db57-4b36-a775-6d2986de3efc-rootfs\") pod \"machine-config-daemon-fs4qg\" (UID: \"451fc579-db57-4b36-a775-6d2986de3efc\") " pod="openshift-machine-config-operator/machine-config-daemon-fs4qg"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.091196 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.091223 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c30ee189-9db1-41af-8a55-29955cbf6712-host-run-multus-certs\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.091477 4779 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.091508 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.091523 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.091538 4779 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.091551 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.091563 4779 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.091576 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.091486 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9addf988-9b4c-4e5e-a5fe-793bab35a52f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-92wlw\" (UID: \"9addf988-9b4c-4e5e-a5fe-793bab35a52f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.091590 4779 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.091603 4779 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.091615 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.091627 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.091640 4779 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.091653 4779 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.091665 4779 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.091676 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.091688 4779 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.091700 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.091712 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.091723 4779 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.091734 4779 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.091754 4779 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.091766 4779 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.091775 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e7a20ef3-86de-4db2-b500-63af002500b4-cni-binary-copy\") pod \"multus-additional-cni-plugins-clzkt\" (UID: \"e7a20ef3-86de-4db2-b500-63af002500b4\") " pod="openshift-multus/multus-additional-cni-plugins-clzkt"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.091835 4779 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.091851 4779 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.091864 4779 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.091875 4779 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.091887 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.091898 4779 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.091909 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.091920 4779 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.092014 4779 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.092026 4779 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.092030 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/451fc579-db57-4b36-a775-6d2986de3efc-mcd-auth-proxy-config\") pod \"machine-config-daemon-fs4qg\" (UID: \"451fc579-db57-4b36-a775-6d2986de3efc\") " pod="openshift-machine-config-operator/machine-config-daemon-fs4qg"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.092036 4779 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.092071 4779 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.092081 4779 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.092091 4779 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.092101 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.092133 4779 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.092144 4779 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.092153 4779 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.092162 4779 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.092171 4779 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.092180 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.092189 4779 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.092198 4779 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.092207 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.092216 4779 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.092170 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e7a20ef3-86de-4db2-b500-63af002500b4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-clzkt\" (UID: \"e7a20ef3-86de-4db2-b500-63af002500b4\") " pod="openshift-multus/multus-additional-cni-plugins-clzkt"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.092224 4779 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.092232 4779 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.092241 4779 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.092249 4779 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.092258 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.092266 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.092274 4779 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.092284 4779 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.092293 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.092302 4779 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.092310 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.092210 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e7a20ef3-86de-4db2-b500-63af002500b4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-clzkt\" (UID: \"e7a20ef3-86de-4db2-b500-63af002500b4\") " pod="openshift-multus/multus-additional-cni-plugins-clzkt"
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.092319 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.092355 4779 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.092996 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.093033 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.093059 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.093069 4779 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.093077 4779 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.093090 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.093188 4779 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.093208 4779 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.093299 4779 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.093317 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.093328 4779 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.094277 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName:
\"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.094303 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.094317 4779 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.094332 4779 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.094344 4779 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.094356 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.094369 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.094380 4779 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 
20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.094392 4779 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.094860 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f27b5011-2d73-40e1-b508-a10e9c6f19a8-env-overrides\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.095008 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/46120a91-b00c-4299-b552-a374d2a78726-serviceca\") pod \"node-ca-l9j6r\" (UID: \"46120a91-b00c-4299-b552-a374d2a78726\") " pod="openshift-image-registry/node-ca-l9j6r" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.095432 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f27b5011-2d73-40e1-b508-a10e9c6f19a8-ovnkube-config\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.095767 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9addf988-9b4c-4e5e-a5fe-793bab35a52f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-92wlw\" (UID: \"9addf988-9b4c-4e5e-a5fe-793bab35a52f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.096065 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/f27b5011-2d73-40e1-b508-a10e9c6f19a8-ovn-node-metrics-cert\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.096100 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/451fc579-db57-4b36-a775-6d2986de3efc-proxy-tls\") pod \"machine-config-daemon-fs4qg\" (UID: \"451fc579-db57-4b36-a775-6d2986de3efc\") " pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.101500 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9addf988-9b4c-4e5e-a5fe-793bab35a52f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-92wlw\" (UID: \"9addf988-9b4c-4e5e-a5fe-793bab35a52f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.107089 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n4l9\" (UniqueName: \"kubernetes.io/projected/c30ee189-9db1-41af-8a55-29955cbf6712-kube-api-access-9n4l9\") pod \"multus-lfj25\" (UID: \"c30ee189-9db1-41af-8a55-29955cbf6712\") " pod="openshift-multus/multus-lfj25" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.107467 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcxgv\" (UniqueName: \"kubernetes.io/projected/f27b5011-2d73-40e1-b508-a10e9c6f19a8-kube-api-access-kcxgv\") pod \"ovnkube-node-bl2w2\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.108409 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.108480 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l595l\" (UniqueName: \"kubernetes.io/projected/5293026d-bcf7-4270-8ad8-59a90e70ab1a-kube-api-access-l595l\") pod \"node-resolver-9dnjh\" (UID: \"5293026d-bcf7-4270-8ad8-59a90e70ab1a\") " pod="openshift-dns/node-resolver-9dnjh" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.108628 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnnk6\" (UniqueName: \"kubernetes.io/projected/e7a20ef3-86de-4db2-b500-63af002500b4-kube-api-access-rnnk6\") pod \"multus-additional-cni-plugins-clzkt\" (UID: \"e7a20ef3-86de-4db2-b500-63af002500b4\") " pod="openshift-multus/multus-additional-cni-plugins-clzkt" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.113275 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxnrk\" (UniqueName: \"kubernetes.io/projected/9addf988-9b4c-4e5e-a5fe-793bab35a52f-kube-api-access-cxnrk\") pod \"ovnkube-control-plane-749d76644c-92wlw\" (UID: \"9addf988-9b4c-4e5e-a5fe-793bab35a52f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.114664 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.115102 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blhdn\" (UniqueName: \"kubernetes.io/projected/44bdd151-2a1e-4f14-a095-81b541307138-kube-api-access-blhdn\") pod \"network-metrics-daemon-l4gtx\" (UID: \"44bdd151-2a1e-4f14-a095-81b541307138\") " pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.115473 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkxk2\" (UniqueName: \"kubernetes.io/projected/451fc579-db57-4b36-a775-6d2986de3efc-kube-api-access-lkxk2\") pod \"machine-config-daemon-fs4qg\" (UID: \"451fc579-db57-4b36-a775-6d2986de3efc\") " pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.116271 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkfvw\" (UniqueName: \"kubernetes.io/projected/46120a91-b00c-4299-b552-a374d2a78726-kube-api-access-qkfvw\") pod \"node-ca-l9j6r\" (UID: \"46120a91-b00c-4299-b552-a374d2a78726\") " pod="openshift-image-registry/node-ca-l9j6r" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.120909 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-lfj25" Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.123148 4779 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 15:24:25 crc kubenswrapper[4779]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 20 15:24:25 crc kubenswrapper[4779]: set -o allexport Mar 20 15:24:25 crc kubenswrapper[4779]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 20 15:24:25 crc kubenswrapper[4779]: source /etc/kubernetes/apiserver-url.env Mar 20 15:24:25 crc kubenswrapper[4779]: else Mar 20 15:24:25 crc kubenswrapper[4779]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 20 15:24:25 crc kubenswrapper[4779]: exit 1 Mar 20 15:24:25 crc kubenswrapper[4779]: fi Mar 20 15:24:25 crc kubenswrapper[4779]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 20 15:24:25 crc kubenswrapper[4779]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 15:24:25 crc kubenswrapper[4779]: > logger="UnhandledError" Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.125046 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.127801 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-l9j6r" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.135353 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.137484 4779 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 15:24:25 crc kubenswrapper[4779]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 15:24:25 crc kubenswrapper[4779]: if [[ -f "/env/_master" ]]; then Mar 20 15:24:25 crc kubenswrapper[4779]: set -o allexport Mar 20 15:24:25 crc kubenswrapper[4779]: source "/env/_master" Mar 20 15:24:25 crc kubenswrapper[4779]: set +o allexport Mar 20 15:24:25 crc kubenswrapper[4779]: fi Mar 20 15:24:25 crc kubenswrapper[4779]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 20 15:24:25 crc kubenswrapper[4779]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 20 15:24:25 crc kubenswrapper[4779]: ho_enable="--enable-hybrid-overlay" Mar 20 15:24:25 crc kubenswrapper[4779]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 20 15:24:25 crc kubenswrapper[4779]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 20 15:24:25 crc kubenswrapper[4779]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 20 15:24:25 crc kubenswrapper[4779]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 15:24:25 crc kubenswrapper[4779]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 20 15:24:25 crc kubenswrapper[4779]: --webhook-host=127.0.0.1 \ Mar 20 15:24:25 crc kubenswrapper[4779]: --webhook-port=9743 \ Mar 20 15:24:25 crc kubenswrapper[4779]: ${ho_enable} \ Mar 20 15:24:25 crc kubenswrapper[4779]: --enable-interconnect \ Mar 20 15:24:25 crc kubenswrapper[4779]: --disable-approver \ Mar 20 15:24:25 crc kubenswrapper[4779]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 20 15:24:25 crc kubenswrapper[4779]: --wait-for-kubernetes-api=200s \ Mar 20 15:24:25 crc kubenswrapper[4779]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 20 15:24:25 crc kubenswrapper[4779]: --loglevel="${LOGLEVEL}" Mar 20 15:24:25 crc kubenswrapper[4779]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 15:24:25 crc kubenswrapper[4779]: > logger="UnhandledError" Mar 20 15:24:25 crc kubenswrapper[4779]: W0320 15:24:25.137778 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc30ee189_9db1_41af_8a55_29955cbf6712.slice/crio-d579a8df0a98986cc50349bc0493a786ae5580e6adc314fb0a3539c99ebc9c07 WatchSource:0}: Error finding container 
d579a8df0a98986cc50349bc0493a786ae5580e6adc314fb0a3539c99ebc9c07: Status 404 returned error can't find the container with id d579a8df0a98986cc50349bc0493a786ae5580e6adc314fb0a3539c99ebc9c07 Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.140752 4779 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 15:24:25 crc kubenswrapper[4779]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 20 15:24:25 crc kubenswrapper[4779]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 20 15:24:25 crc kubenswrapper[4779]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{
Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9n4l9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-lfj25_openshift-multus(c30ee189-9db1-41af-8a55-29955cbf6712): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 15:24:25 crc kubenswrapper[4779]: > logger="UnhandledError" Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.142418 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-lfj25" podUID="c30ee189-9db1-41af-8a55-29955cbf6712" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.142508 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-9dnjh" Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.142522 4779 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 15:24:25 crc kubenswrapper[4779]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 15:24:25 crc kubenswrapper[4779]: if [[ -f "/env/_master" ]]; then Mar 20 15:24:25 crc kubenswrapper[4779]: set -o allexport Mar 20 15:24:25 crc kubenswrapper[4779]: source "/env/_master" Mar 20 15:24:25 crc kubenswrapper[4779]: set +o allexport Mar 20 15:24:25 crc kubenswrapper[4779]: fi Mar 20 15:24:25 crc kubenswrapper[4779]: Mar 20 15:24:25 crc kubenswrapper[4779]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 20 15:24:25 crc kubenswrapper[4779]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 15:24:25 crc kubenswrapper[4779]: --disable-webhook \ Mar 20 15:24:25 crc kubenswrapper[4779]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 20 15:24:25 crc kubenswrapper[4779]: --loglevel="${LOGLEVEL}" Mar 20 15:24:25 crc kubenswrapper[4779]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 15:24:25 crc kubenswrapper[4779]: > logger="UnhandledError" Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.143585 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 20 15:24:25 crc kubenswrapper[4779]: W0320 15:24:25.146267 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-37d6d5f595e43ae4f92dcd1e9a6303406def6bf9af1dc81184a131f58d49bc70 WatchSource:0}: Error finding container 37d6d5f595e43ae4f92dcd1e9a6303406def6bf9af1dc81184a131f58d49bc70: Status 404 returned error can't find the container with id 37d6d5f595e43ae4f92dcd1e9a6303406def6bf9af1dc81184a131f58d49bc70 Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.148722 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.153046 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.154179 4779 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 15:24:25 crc kubenswrapper[4779]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 20 15:24:25 crc kubenswrapper[4779]: while [ true ]; Mar 20 15:24:25 crc kubenswrapper[4779]: do Mar 20 15:24:25 crc kubenswrapper[4779]: for f in $(ls /tmp/serviceca); do Mar 20 15:24:25 crc kubenswrapper[4779]: echo $f Mar 20 15:24:25 crc kubenswrapper[4779]: ca_file_path="/tmp/serviceca/${f}" Mar 20 15:24:25 crc kubenswrapper[4779]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 20 15:24:25 crc kubenswrapper[4779]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 20 15:24:25 crc kubenswrapper[4779]: if [ -e "${reg_dir_path}" ]; then Mar 20 15:24:25 crc kubenswrapper[4779]: cp -u $ca_file_path 
$reg_dir_path/ca.crt Mar 20 15:24:25 crc kubenswrapper[4779]: else Mar 20 15:24:25 crc kubenswrapper[4779]: mkdir $reg_dir_path Mar 20 15:24:25 crc kubenswrapper[4779]: cp $ca_file_path $reg_dir_path/ca.crt Mar 20 15:24:25 crc kubenswrapper[4779]: fi Mar 20 15:24:25 crc kubenswrapper[4779]: done Mar 20 15:24:25 crc kubenswrapper[4779]: for d in $(ls /etc/docker/certs.d); do Mar 20 15:24:25 crc kubenswrapper[4779]: echo $d Mar 20 15:24:25 crc kubenswrapper[4779]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 20 15:24:25 crc kubenswrapper[4779]: reg_conf_path="/tmp/serviceca/${dp}" Mar 20 15:24:25 crc kubenswrapper[4779]: if [ ! -e "${reg_conf_path}" ]; then Mar 20 15:24:25 crc kubenswrapper[4779]: rm -rf /etc/docker/certs.d/$d Mar 20 15:24:25 crc kubenswrapper[4779]: fi Mar 20 15:24:25 crc kubenswrapper[4779]: done Mar 20 15:24:25 crc kubenswrapper[4779]: sleep 60 & wait ${!} Mar 20 15:24:25 crc kubenswrapper[4779]: done Mar 20 15:24:25 crc kubenswrapper[4779]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qkfvw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-l9j6r_openshift-image-registry(46120a91-b00c-4299-b552-a374d2a78726): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 15:24:25 crc kubenswrapper[4779]: > logger="UnhandledError" Mar 20 15:24:25 crc kubenswrapper[4779]: W0320 15:24:25.154889 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5293026d_bcf7_4270_8ad8_59a90e70ab1a.slice/crio-27fe22b7f2e63b8f004629562bd929e188eba50cf51b105100cb6efa6f6f67fc WatchSource:0}: Error finding container 27fe22b7f2e63b8f004629562bd929e188eba50cf51b105100cb6efa6f6f67fc: Status 404 returned error can't find the container with id 27fe22b7f2e63b8f004629562bd929e188eba50cf51b105100cb6efa6f6f67fc Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.155028 4779 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.155649 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-l9j6r" podUID="46120a91-b00c-4299-b552-a374d2a78726" Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.156315 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.158524 4779 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 15:24:25 crc kubenswrapper[4779]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 20 15:24:25 crc kubenswrapper[4779]: set -uo pipefail Mar 20 15:24:25 crc kubenswrapper[4779]: Mar 20 15:24:25 crc kubenswrapper[4779]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 20 15:24:25 crc kubenswrapper[4779]: Mar 20 15:24:25 crc kubenswrapper[4779]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 20 15:24:25 crc kubenswrapper[4779]: HOSTS_FILE="/etc/hosts" Mar 20 15:24:25 crc kubenswrapper[4779]: TEMP_FILE="/etc/hosts.tmp" Mar 20 15:24:25 crc kubenswrapper[4779]: Mar 20 15:24:25 crc kubenswrapper[4779]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 20 15:24:25 crc kubenswrapper[4779]: Mar 20 15:24:25 crc 
kubenswrapper[4779]: # Make a temporary file with the old hosts file's attributes. Mar 20 15:24:25 crc kubenswrapper[4779]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 20 15:24:25 crc kubenswrapper[4779]: echo "Failed to preserve hosts file. Exiting." Mar 20 15:24:25 crc kubenswrapper[4779]: exit 1 Mar 20 15:24:25 crc kubenswrapper[4779]: fi Mar 20 15:24:25 crc kubenswrapper[4779]: Mar 20 15:24:25 crc kubenswrapper[4779]: while true; do Mar 20 15:24:25 crc kubenswrapper[4779]: declare -A svc_ips Mar 20 15:24:25 crc kubenswrapper[4779]: for svc in "${services[@]}"; do Mar 20 15:24:25 crc kubenswrapper[4779]: # Fetch service IP from cluster dns if present. We make several tries Mar 20 15:24:25 crc kubenswrapper[4779]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 20 15:24:25 crc kubenswrapper[4779]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 20 15:24:25 crc kubenswrapper[4779]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 20 15:24:25 crc kubenswrapper[4779]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 15:24:25 crc kubenswrapper[4779]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 15:24:25 crc kubenswrapper[4779]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 15:24:25 crc kubenswrapper[4779]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 20 15:24:25 crc kubenswrapper[4779]: for i in ${!cmds[*]} Mar 20 15:24:25 crc kubenswrapper[4779]: do Mar 20 15:24:25 crc kubenswrapper[4779]: ips=($(eval "${cmds[i]}")) Mar 20 15:24:25 crc kubenswrapper[4779]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 20 15:24:25 crc kubenswrapper[4779]: svc_ips["${svc}"]="${ips[@]}" Mar 20 15:24:25 crc kubenswrapper[4779]: break Mar 20 15:24:25 crc kubenswrapper[4779]: fi Mar 20 15:24:25 crc kubenswrapper[4779]: done Mar 20 15:24:25 crc kubenswrapper[4779]: done Mar 20 15:24:25 crc kubenswrapper[4779]: Mar 20 15:24:25 crc kubenswrapper[4779]: # Update /etc/hosts only if we get valid service IPs Mar 20 15:24:25 crc kubenswrapper[4779]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 20 15:24:25 crc kubenswrapper[4779]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 20 15:24:25 crc kubenswrapper[4779]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 20 15:24:25 crc kubenswrapper[4779]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 20 15:24:25 crc kubenswrapper[4779]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 20 15:24:25 crc kubenswrapper[4779]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 20 15:24:25 crc kubenswrapper[4779]: sleep 60 & wait Mar 20 15:24:25 crc kubenswrapper[4779]: continue Mar 20 15:24:25 crc kubenswrapper[4779]: fi Mar 20 15:24:25 crc kubenswrapper[4779]: Mar 20 15:24:25 crc kubenswrapper[4779]: # Append resolver entries for services Mar 20 15:24:25 crc kubenswrapper[4779]: rc=0 Mar 20 15:24:25 crc kubenswrapper[4779]: for svc in "${!svc_ips[@]}"; do Mar 20 15:24:25 crc kubenswrapper[4779]: for ip in ${svc_ips[${svc}]}; do Mar 20 15:24:25 crc kubenswrapper[4779]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Mar 20 15:24:25 crc kubenswrapper[4779]: done Mar 20 15:24:25 crc kubenswrapper[4779]: done Mar 20 15:24:25 crc kubenswrapper[4779]: if [[ $rc -ne 0 ]]; then Mar 20 15:24:25 crc kubenswrapper[4779]: sleep 60 & wait Mar 20 15:24:25 crc kubenswrapper[4779]: continue Mar 20 15:24:25 crc kubenswrapper[4779]: fi Mar 20 15:24:25 crc kubenswrapper[4779]: Mar 20 15:24:25 crc kubenswrapper[4779]: Mar 20 15:24:25 crc kubenswrapper[4779]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 20 15:24:25 crc kubenswrapper[4779]: # Replace /etc/hosts with our modified version if needed Mar 20 15:24:25 crc kubenswrapper[4779]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 20 15:24:25 crc kubenswrapper[4779]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 20 15:24:25 crc kubenswrapper[4779]: fi Mar 20 15:24:25 crc kubenswrapper[4779]: sleep 60 & wait Mar 20 15:24:25 crc kubenswrapper[4779]: unset svc_ips Mar 20 15:24:25 crc kubenswrapper[4779]: done Mar 20 15:24:25 crc kubenswrapper[4779]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l595l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-9dnjh_openshift-dns(5293026d-bcf7-4270-8ad8-59a90e70ab1a): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 15:24:25 crc kubenswrapper[4779]: > logger="UnhandledError" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.159486 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-clzkt" Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.159746 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-9dnjh" podUID="5293026d-bcf7-4270-8ad8-59a90e70ab1a" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.160642 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.160669 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.160680 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.160696 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.160707 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:25Z","lastTransitionTime":"2026-03-20T15:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:25 crc kubenswrapper[4779]: W0320 15:24:25.163687 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod451fc579_db57_4b36_a775_6d2986de3efc.slice/crio-ea94b384664193d42db5252b285114925907da2bdee65fc77f8c177a8966a537 WatchSource:0}: Error finding container ea94b384664193d42db5252b285114925907da2bdee65fc77f8c177a8966a537: Status 404 returned error can't find the container with id ea94b384664193d42db5252b285114925907da2bdee65fc77f8c177a8966a537 Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.166659 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.169570 4779 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 15:24:25 crc kubenswrapper[4779]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 20 15:24:25 crc kubenswrapper[4779]: apiVersion: v1 Mar 20 15:24:25 crc kubenswrapper[4779]: clusters: Mar 20 15:24:25 crc kubenswrapper[4779]: - cluster: Mar 20 15:24:25 crc kubenswrapper[4779]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 20 15:24:25 crc kubenswrapper[4779]: server: https://api-int.crc.testing:6443 Mar 20 15:24:25 crc kubenswrapper[4779]: name: default-cluster Mar 20 15:24:25 crc kubenswrapper[4779]: contexts: Mar 20 15:24:25 crc kubenswrapper[4779]: - context: Mar 20 15:24:25 crc kubenswrapper[4779]: cluster: default-cluster Mar 20 15:24:25 crc kubenswrapper[4779]: namespace: default Mar 20 15:24:25 crc kubenswrapper[4779]: user: default-auth Mar 20 15:24:25 crc kubenswrapper[4779]: name: default-context Mar 20 15:24:25 crc kubenswrapper[4779]: 
current-context: default-context Mar 20 15:24:25 crc kubenswrapper[4779]: kind: Config Mar 20 15:24:25 crc kubenswrapper[4779]: preferences: {} Mar 20 15:24:25 crc kubenswrapper[4779]: users: Mar 20 15:24:25 crc kubenswrapper[4779]: - name: default-auth Mar 20 15:24:25 crc kubenswrapper[4779]: user: Mar 20 15:24:25 crc kubenswrapper[4779]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 20 15:24:25 crc kubenswrapper[4779]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 20 15:24:25 crc kubenswrapper[4779]: EOF Mar 20 15:24:25 crc kubenswrapper[4779]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kcxgv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-bl2w2_openshift-ovn-kubernetes(f27b5011-2d73-40e1-b508-a10e9c6f19a8): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 15:24:25 crc kubenswrapper[4779]: > logger="UnhandledError" Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.170770 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least 
once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.172486 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lkxk2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.174795 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt 
--tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lkxk2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 15:24:25 crc kubenswrapper[4779]: W0320 15:24:25.175386 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7a20ef3_86de_4db2_b500_63af002500b4.slice/crio-4355f2a998af3d5d89916df729e21b6d54fd4f0dadb81c8bac3bf477ce17bc8d WatchSource:0}: Error finding container 4355f2a998af3d5d89916df729e21b6d54fd4f0dadb81c8bac3bf477ce17bc8d: Status 404 returned error can't find the container with id 4355f2a998af3d5d89916df729e21b6d54fd4f0dadb81c8bac3bf477ce17bc8d 
Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.175901 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.179724 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rnnk6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessPro
be:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-clzkt_openshift-multus(e7a20ef3-86de-4db2-b500-63af002500b4): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.180866 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-clzkt" podUID="e7a20ef3-86de-4db2-b500-63af002500b4" Mar 20 15:24:25 crc kubenswrapper[4779]: W0320 15:24:25.181063 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9addf988_9b4c_4e5e_a5fe_793bab35a52f.slice/crio-e9c8d26f5a0b9d822cbaddb9571b9589ebe0d69788ca9facdb01b1f008f48816 WatchSource:0}: Error finding container e9c8d26f5a0b9d822cbaddb9571b9589ebe0d69788ca9facdb01b1f008f48816: Status 404 returned error can't find the container with id e9c8d26f5a0b9d822cbaddb9571b9589ebe0d69788ca9facdb01b1f008f48816 Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.184460 4779 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 15:24:25 crc kubenswrapper[4779]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 20 15:24:25 crc kubenswrapper[4779]: set -euo pipefail Mar 20 15:24:25 crc kubenswrapper[4779]: 
TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 20 15:24:25 crc kubenswrapper[4779]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 20 15:24:25 crc kubenswrapper[4779]: # As the secret mount is optional we must wait for the files to be present. Mar 20 15:24:25 crc kubenswrapper[4779]: # The service is created in monitor.yaml and this is created in sdn.yaml. Mar 20 15:24:25 crc kubenswrapper[4779]: TS=$(date +%s) Mar 20 15:24:25 crc kubenswrapper[4779]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 20 15:24:25 crc kubenswrapper[4779]: HAS_LOGGED_INFO=0 Mar 20 15:24:25 crc kubenswrapper[4779]: Mar 20 15:24:25 crc kubenswrapper[4779]: log_missing_certs(){ Mar 20 15:24:25 crc kubenswrapper[4779]: CUR_TS=$(date +%s) Mar 20 15:24:25 crc kubenswrapper[4779]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 20 15:24:25 crc kubenswrapper[4779]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 20 15:24:25 crc kubenswrapper[4779]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 20 15:24:25 crc kubenswrapper[4779]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 20 15:24:25 crc kubenswrapper[4779]: HAS_LOGGED_INFO=1 Mar 20 15:24:25 crc kubenswrapper[4779]: fi Mar 20 15:24:25 crc kubenswrapper[4779]: } Mar 20 15:24:25 crc kubenswrapper[4779]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 20 15:24:25 crc kubenswrapper[4779]: log_missing_certs Mar 20 15:24:25 crc kubenswrapper[4779]: sleep 5 Mar 20 15:24:25 crc kubenswrapper[4779]: done Mar 20 15:24:25 crc kubenswrapper[4779]: Mar 20 15:24:25 crc kubenswrapper[4779]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 20 15:24:25 crc kubenswrapper[4779]: exec /usr/bin/kube-rbac-proxy \ Mar 20 15:24:25 crc kubenswrapper[4779]: --logtostderr \ Mar 20 15:24:25 crc kubenswrapper[4779]: --secure-listen-address=:9108 \ Mar 20 15:24:25 crc kubenswrapper[4779]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 20 15:24:25 crc kubenswrapper[4779]: --upstream=http://127.0.0.1:29108/ \ Mar 20 15:24:25 crc kubenswrapper[4779]: --tls-private-key-file=${TLS_PK} \ Mar 20 15:24:25 crc kubenswrapper[4779]: --tls-cert-file=${TLS_CERT} Mar 20 15:24:25 crc kubenswrapper[4779]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cxnrk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-92wlw_openshift-ovn-kubernetes(9addf988-9b4c-4e5e-a5fe-793bab35a52f): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 15:24:25 crc kubenswrapper[4779]: > logger="UnhandledError" Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.186586 4779 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 15:24:25 crc kubenswrapper[4779]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 15:24:25 crc kubenswrapper[4779]: if [[ -f "/env/_master" ]]; then Mar 20 15:24:25 crc kubenswrapper[4779]: set -o allexport Mar 20 15:24:25 crc kubenswrapper[4779]: source "/env/_master" Mar 20 15:24:25 crc kubenswrapper[4779]: set +o allexport Mar 20 15:24:25 crc kubenswrapper[4779]: fi Mar 20 15:24:25 crc kubenswrapper[4779]: Mar 20 15:24:25 crc kubenswrapper[4779]: ovn_v4_join_subnet_opt= Mar 20 15:24:25 crc kubenswrapper[4779]: if [[ "" != "" ]]; then Mar 20 15:24:25 crc kubenswrapper[4779]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 20 
15:24:25 crc kubenswrapper[4779]: fi Mar 20 15:24:25 crc kubenswrapper[4779]: ovn_v6_join_subnet_opt= Mar 20 15:24:25 crc kubenswrapper[4779]: if [[ "" != "" ]]; then Mar 20 15:24:25 crc kubenswrapper[4779]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 20 15:24:25 crc kubenswrapper[4779]: fi Mar 20 15:24:25 crc kubenswrapper[4779]: Mar 20 15:24:25 crc kubenswrapper[4779]: ovn_v4_transit_switch_subnet_opt= Mar 20 15:24:25 crc kubenswrapper[4779]: if [[ "" != "" ]]; then Mar 20 15:24:25 crc kubenswrapper[4779]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 20 15:24:25 crc kubenswrapper[4779]: fi Mar 20 15:24:25 crc kubenswrapper[4779]: ovn_v6_transit_switch_subnet_opt= Mar 20 15:24:25 crc kubenswrapper[4779]: if [[ "" != "" ]]; then Mar 20 15:24:25 crc kubenswrapper[4779]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 20 15:24:25 crc kubenswrapper[4779]: fi Mar 20 15:24:25 crc kubenswrapper[4779]: Mar 20 15:24:25 crc kubenswrapper[4779]: dns_name_resolver_enabled_flag= Mar 20 15:24:25 crc kubenswrapper[4779]: if [[ "false" == "true" ]]; then Mar 20 15:24:25 crc kubenswrapper[4779]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 20 15:24:25 crc kubenswrapper[4779]: fi Mar 20 15:24:25 crc kubenswrapper[4779]: Mar 20 15:24:25 crc kubenswrapper[4779]: persistent_ips_enabled_flag= Mar 20 15:24:25 crc kubenswrapper[4779]: if [[ "true" == "true" ]]; then Mar 20 15:24:25 crc kubenswrapper[4779]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 20 15:24:25 crc kubenswrapper[4779]: fi Mar 20 15:24:25 crc kubenswrapper[4779]: Mar 20 15:24:25 crc kubenswrapper[4779]: # This is needed so that converting clusters from GA to TP Mar 20 15:24:25 crc kubenswrapper[4779]: # will rollout control plane pods as well Mar 20 15:24:25 crc kubenswrapper[4779]: network_segmentation_enabled_flag= Mar 20 15:24:25 crc kubenswrapper[4779]: multi_network_enabled_flag= Mar 20 15:24:25 crc 
kubenswrapper[4779]: if [[ "true" == "true" ]]; then Mar 20 15:24:25 crc kubenswrapper[4779]: multi_network_enabled_flag="--enable-multi-network" Mar 20 15:24:25 crc kubenswrapper[4779]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 20 15:24:25 crc kubenswrapper[4779]: fi Mar 20 15:24:25 crc kubenswrapper[4779]: Mar 20 15:24:25 crc kubenswrapper[4779]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 20 15:24:25 crc kubenswrapper[4779]: exec /usr/bin/ovnkube \ Mar 20 15:24:25 crc kubenswrapper[4779]: --enable-interconnect \ Mar 20 15:24:25 crc kubenswrapper[4779]: --init-cluster-manager "${K8S_NODE}" \ Mar 20 15:24:25 crc kubenswrapper[4779]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 20 15:24:25 crc kubenswrapper[4779]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 20 15:24:25 crc kubenswrapper[4779]: --metrics-bind-address "127.0.0.1:29108" \ Mar 20 15:24:25 crc kubenswrapper[4779]: --metrics-enable-pprof \ Mar 20 15:24:25 crc kubenswrapper[4779]: --metrics-enable-config-duration \ Mar 20 15:24:25 crc kubenswrapper[4779]: ${ovn_v4_join_subnet_opt} \ Mar 20 15:24:25 crc kubenswrapper[4779]: ${ovn_v6_join_subnet_opt} \ Mar 20 15:24:25 crc kubenswrapper[4779]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 20 15:24:25 crc kubenswrapper[4779]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 20 15:24:25 crc kubenswrapper[4779]: ${dns_name_resolver_enabled_flag} \ Mar 20 15:24:25 crc kubenswrapper[4779]: ${persistent_ips_enabled_flag} \ Mar 20 15:24:25 crc kubenswrapper[4779]: ${multi_network_enabled_flag} \ Mar 20 15:24:25 crc kubenswrapper[4779]: ${network_segmentation_enabled_flag} Mar 20 15:24:25 crc kubenswrapper[4779]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cxnrk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-92wlw_openshift-ovn-kubernetes(9addf988-9b4c-4e5e-a5fe-793bab35a52f): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 15:24:25 crc kubenswrapper[4779]: > logger="UnhandledError" Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.187778 4779 pod_workers.go:1301] "Error syncing pod, 
skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" podUID="9addf988-9b4c-4e5e-a5fe-793bab35a52f" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.262624 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.262660 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.262668 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.262682 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.262692 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:25Z","lastTransitionTime":"2026-03-20T15:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.364956 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.365024 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.365075 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.365098 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.365134 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:25Z","lastTransitionTime":"2026-03-20T15:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.467422 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.467454 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.467462 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.467474 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.467484 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:25Z","lastTransitionTime":"2026-03-20T15:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.498211 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.498315 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 15:24:26.498294541 +0000 UTC m=+83.460810351 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.498347 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.498393 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.498432 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.498486 4779 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.498510 4779 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.498528 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 15:24:26.498519257 +0000 UTC m=+83.461035057 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.498525 4779 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.498547 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 15:24:26.498537027 +0000 UTC m=+83.461052837 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.498551 4779 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.498566 4779 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.498606 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 15:24:26.498595489 +0000 UTC m=+83.461111289 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.570588 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.570649 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.570666 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.570690 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.570707 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:25Z","lastTransitionTime":"2026-03-20T15:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.598927 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.599011 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44bdd151-2a1e-4f14-a095-81b541307138-metrics-certs\") pod \"network-metrics-daemon-l4gtx\" (UID: \"44bdd151-2a1e-4f14-a095-81b541307138\") " pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.599088 4779 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.599170 4779 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.599192 4779 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.599133 4779 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.599260 
4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 15:24:26.599237921 +0000 UTC m=+83.561753751 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 15:24:25 crc kubenswrapper[4779]: E0320 15:24:25.599286 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44bdd151-2a1e-4f14-a095-81b541307138-metrics-certs podName:44bdd151-2a1e-4f14-a095-81b541307138 nodeName:}" failed. No retries permitted until 2026-03-20 15:24:26.599273113 +0000 UTC m=+83.561788943 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/44bdd151-2a1e-4f14-a095-81b541307138-metrics-certs") pod "network-metrics-daemon-l4gtx" (UID: "44bdd151-2a1e-4f14-a095-81b541307138") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.673432 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.673483 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.673496 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.673511 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.673524 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:25Z","lastTransitionTime":"2026-03-20T15:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.776378 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.776440 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.776456 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.776474 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.776485 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:25Z","lastTransitionTime":"2026-03-20T15:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.812022 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.812748 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.814057 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.814814 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.815971 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.816465 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.817182 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.818238 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.818909 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.819944 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.820479 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.821688 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.822253 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.822794 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.823744 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.824303 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.825301 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.825723 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.826311 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.827421 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.827904 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.828924 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.829372 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.830697 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" 
path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.831175 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.831817 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.832750 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.833247 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.834130 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.834607 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.835423 4779 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.835517 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.837127 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.837932 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.838498 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.840100 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.840957 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.841871 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.842497 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.843468 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.843937 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.844908 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.845535 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.846500 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.846929 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.848063 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.849070 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.850960 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.851724 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.853011 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.853709 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.855160 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.856257 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.857002 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.879276 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.879317 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:25 crc 
kubenswrapper[4779]: I0320 15:24:25.879329 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.879348 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.879364 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:25Z","lastTransitionTime":"2026-03-20T15:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.982612 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.982646 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.982657 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.982673 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:25 crc kubenswrapper[4779]: I0320 15:24:25.982685 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:25Z","lastTransitionTime":"2026-03-20T15:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.084099 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.084156 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.084176 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.084193 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.084205 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:26Z","lastTransitionTime":"2026-03-20T15:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.106932 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-l9j6r" event={"ID":"46120a91-b00c-4299-b552-a374d2a78726","Type":"ContainerStarted","Data":"dfbca9b7f7dcf9b92862593e4e4050a68927b2da82f56a05ff2acb93665d1542"} Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.107627 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"fd7d3bfa90df2193c6b39864084794020ba2e988baeda3055ef92406b2606a70"} Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.108614 4779 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 15:24:26 crc kubenswrapper[4779]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 20 15:24:26 crc kubenswrapper[4779]: while [ true ]; Mar 20 15:24:26 crc kubenswrapper[4779]: do Mar 20 15:24:26 crc kubenswrapper[4779]: for f in $(ls /tmp/serviceca); do Mar 20 15:24:26 crc kubenswrapper[4779]: echo $f Mar 20 15:24:26 crc kubenswrapper[4779]: ca_file_path="/tmp/serviceca/${f}" Mar 20 15:24:26 crc kubenswrapper[4779]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 20 15:24:26 crc kubenswrapper[4779]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 20 15:24:26 crc kubenswrapper[4779]: if [ -e "${reg_dir_path}" ]; then Mar 20 15:24:26 crc kubenswrapper[4779]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 20 15:24:26 crc kubenswrapper[4779]: else Mar 20 15:24:26 crc kubenswrapper[4779]: mkdir $reg_dir_path Mar 20 15:24:26 crc kubenswrapper[4779]: cp $ca_file_path $reg_dir_path/ca.crt Mar 20 15:24:26 crc kubenswrapper[4779]: fi Mar 20 15:24:26 crc kubenswrapper[4779]: done 
Mar 20 15:24:26 crc kubenswrapper[4779]: for d in $(ls /etc/docker/certs.d); do Mar 20 15:24:26 crc kubenswrapper[4779]: echo $d Mar 20 15:24:26 crc kubenswrapper[4779]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 20 15:24:26 crc kubenswrapper[4779]: reg_conf_path="/tmp/serviceca/${dp}" Mar 20 15:24:26 crc kubenswrapper[4779]: if [ ! -e "${reg_conf_path}" ]; then Mar 20 15:24:26 crc kubenswrapper[4779]: rm -rf /etc/docker/certs.d/$d Mar 20 15:24:26 crc kubenswrapper[4779]: fi Mar 20 15:24:26 crc kubenswrapper[4779]: done Mar 20 15:24:26 crc kubenswrapper[4779]: sleep 60 & wait ${!} Mar 20 15:24:26 crc kubenswrapper[4779]: done Mar 20 15:24:26 crc kubenswrapper[4779]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qkfvw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start 
failed in pod node-ca-l9j6r_openshift-image-registry(46120a91-b00c-4299-b552-a374d2a78726): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 15:24:26 crc kubenswrapper[4779]: > logger="UnhandledError" Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.110064 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-l9j6r" podUID="46120a91-b00c-4299-b552-a374d2a78726" Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.111336 4779 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 15:24:26 crc kubenswrapper[4779]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 20 15:24:26 crc kubenswrapper[4779]: set -o allexport Mar 20 15:24:26 crc kubenswrapper[4779]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 20 15:24:26 crc kubenswrapper[4779]: source /etc/kubernetes/apiserver-url.env Mar 20 15:24:26 crc kubenswrapper[4779]: else Mar 20 15:24:26 crc kubenswrapper[4779]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 20 15:24:26 crc kubenswrapper[4779]: exit 1 Mar 20 15:24:26 crc kubenswrapper[4779]: fi Mar 20 15:24:26 crc kubenswrapper[4779]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 20 15:24:26 crc kubenswrapper[4779]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 15:24:26 crc kubenswrapper[4779]: > logger="UnhandledError" Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.112742 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.114836 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"37d6d5f595e43ae4f92dcd1e9a6303406def6bf9af1dc81184a131f58d49bc70"} Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.117037 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePol
icy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.117206 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" event={"ID":"9addf988-9b4c-4e5e-a5fe-793bab35a52f","Type":"ContainerStarted","Data":"e9c8d26f5a0b9d822cbaddb9571b9589ebe0d69788ca9facdb01b1f008f48816"} Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.118860 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" event={"ID":"451fc579-db57-4b36-a775-6d2986de3efc","Type":"ContainerStarted","Data":"ea94b384664193d42db5252b285114925907da2bdee65fc77f8c177a8966a537"} Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.118995 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.119934 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.120289 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lkxk2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 },Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.120352 4779 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 15:24:26 crc kubenswrapper[4779]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 20 15:24:26 crc kubenswrapper[4779]: set -euo pipefail Mar 20 15:24:26 crc kubenswrapper[4779]: 
TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 20 15:24:26 crc kubenswrapper[4779]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 20 15:24:26 crc kubenswrapper[4779]: # As the secret mount is optional we must wait for the files to be present. Mar 20 15:24:26 crc kubenswrapper[4779]: # The service is created in monitor.yaml and this is created in sdn.yaml. Mar 20 15:24:26 crc kubenswrapper[4779]: TS=$(date +%s) Mar 20 15:24:26 crc kubenswrapper[4779]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 20 15:24:26 crc kubenswrapper[4779]: HAS_LOGGED_INFO=0 Mar 20 15:24:26 crc kubenswrapper[4779]: Mar 20 15:24:26 crc kubenswrapper[4779]: log_missing_certs(){ Mar 20 15:24:26 crc kubenswrapper[4779]: CUR_TS=$(date +%s) Mar 20 15:24:26 crc kubenswrapper[4779]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 20 15:24:26 crc kubenswrapper[4779]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 20 15:24:26 crc kubenswrapper[4779]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 20 15:24:26 crc kubenswrapper[4779]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 20 15:24:26 crc kubenswrapper[4779]: HAS_LOGGED_INFO=1 Mar 20 15:24:26 crc kubenswrapper[4779]: fi Mar 20 15:24:26 crc kubenswrapper[4779]: } Mar 20 15:24:26 crc kubenswrapper[4779]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 20 15:24:26 crc kubenswrapper[4779]: log_missing_certs Mar 20 15:24:26 crc kubenswrapper[4779]: sleep 5 Mar 20 15:24:26 crc kubenswrapper[4779]: done Mar 20 15:24:26 crc kubenswrapper[4779]: Mar 20 15:24:26 crc kubenswrapper[4779]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 20 15:24:26 crc kubenswrapper[4779]: exec /usr/bin/kube-rbac-proxy \ Mar 20 15:24:26 crc kubenswrapper[4779]: --logtostderr \ Mar 20 15:24:26 crc kubenswrapper[4779]: --secure-listen-address=:9108 \ Mar 20 15:24:26 crc kubenswrapper[4779]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 20 15:24:26 crc kubenswrapper[4779]: --upstream=http://127.0.0.1:29108/ \ Mar 20 15:24:26 crc kubenswrapper[4779]: --tls-private-key-file=${TLS_PK} \ Mar 20 15:24:26 crc kubenswrapper[4779]: --tls-cert-file=${TLS_CERT} Mar 20 15:24:26 crc kubenswrapper[4779]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cxnrk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-92wlw_openshift-ovn-kubernetes(9addf988-9b4c-4e5e-a5fe-793bab35a52f): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 15:24:26 crc kubenswrapper[4779]: > logger="UnhandledError" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.120566 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" event={"ID":"f27b5011-2d73-40e1-b508-a10e9c6f19a8","Type":"ContainerStarted","Data":"9ecf818626b26da45dfa904b2853098e14aa4610f03a9bbe9b5b49a1f31fbedf"} Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.121865 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1574750354eec6a8c9b6b60f405a386c009f5ba2e8727685999245866bee63a6"} Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.122237 4779 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 15:24:26 crc kubenswrapper[4779]: init container 
&Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 20 15:24:26 crc kubenswrapper[4779]: apiVersion: v1 Mar 20 15:24:26 crc kubenswrapper[4779]: clusters: Mar 20 15:24:26 crc kubenswrapper[4779]: - cluster: Mar 20 15:24:26 crc kubenswrapper[4779]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 20 15:24:26 crc kubenswrapper[4779]: server: https://api-int.crc.testing:6443 Mar 20 15:24:26 crc kubenswrapper[4779]: name: default-cluster Mar 20 15:24:26 crc kubenswrapper[4779]: contexts: Mar 20 15:24:26 crc kubenswrapper[4779]: - context: Mar 20 15:24:26 crc kubenswrapper[4779]: cluster: default-cluster Mar 20 15:24:26 crc kubenswrapper[4779]: namespace: default Mar 20 15:24:26 crc kubenswrapper[4779]: user: default-auth Mar 20 15:24:26 crc kubenswrapper[4779]: name: default-context Mar 20 15:24:26 crc kubenswrapper[4779]: current-context: default-context Mar 20 15:24:26 crc kubenswrapper[4779]: kind: Config Mar 20 15:24:26 crc kubenswrapper[4779]: preferences: {} Mar 20 15:24:26 crc kubenswrapper[4779]: users: Mar 20 15:24:26 crc kubenswrapper[4779]: - name: default-auth Mar 20 15:24:26 crc kubenswrapper[4779]: user: Mar 20 15:24:26 crc kubenswrapper[4779]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 20 15:24:26 crc kubenswrapper[4779]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 20 15:24:26 crc kubenswrapper[4779]: EOF Mar 20 15:24:26 crc kubenswrapper[4779]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kcxgv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-bl2w2_openshift-ovn-kubernetes(f27b5011-2d73-40e1-b508-a10e9c6f19a8): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 15:24:26 crc kubenswrapper[4779]: > logger="UnhandledError" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.122739 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lfj25" event={"ID":"c30ee189-9db1-41af-8a55-29955cbf6712","Type":"ContainerStarted","Data":"d579a8df0a98986cc50349bc0493a786ae5580e6adc314fb0a3539c99ebc9c07"} Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.122824 4779 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 15:24:26 crc kubenswrapper[4779]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 15:24:26 crc kubenswrapper[4779]: if [[ -f "/env/_master" ]]; then Mar 20 15:24:26 crc kubenswrapper[4779]: set -o allexport Mar 20 15:24:26 crc kubenswrapper[4779]: source 
"/env/_master" Mar 20 15:24:26 crc kubenswrapper[4779]: set +o allexport Mar 20 15:24:26 crc kubenswrapper[4779]: fi Mar 20 15:24:26 crc kubenswrapper[4779]: Mar 20 15:24:26 crc kubenswrapper[4779]: ovn_v4_join_subnet_opt= Mar 20 15:24:26 crc kubenswrapper[4779]: if [[ "" != "" ]]; then Mar 20 15:24:26 crc kubenswrapper[4779]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 20 15:24:26 crc kubenswrapper[4779]: fi Mar 20 15:24:26 crc kubenswrapper[4779]: ovn_v6_join_subnet_opt= Mar 20 15:24:26 crc kubenswrapper[4779]: if [[ "" != "" ]]; then Mar 20 15:24:26 crc kubenswrapper[4779]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 20 15:24:26 crc kubenswrapper[4779]: fi Mar 20 15:24:26 crc kubenswrapper[4779]: Mar 20 15:24:26 crc kubenswrapper[4779]: ovn_v4_transit_switch_subnet_opt= Mar 20 15:24:26 crc kubenswrapper[4779]: if [[ "" != "" ]]; then Mar 20 15:24:26 crc kubenswrapper[4779]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 20 15:24:26 crc kubenswrapper[4779]: fi Mar 20 15:24:26 crc kubenswrapper[4779]: ovn_v6_transit_switch_subnet_opt= Mar 20 15:24:26 crc kubenswrapper[4779]: if [[ "" != "" ]]; then Mar 20 15:24:26 crc kubenswrapper[4779]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 20 15:24:26 crc kubenswrapper[4779]: fi Mar 20 15:24:26 crc kubenswrapper[4779]: Mar 20 15:24:26 crc kubenswrapper[4779]: dns_name_resolver_enabled_flag= Mar 20 15:24:26 crc kubenswrapper[4779]: if [[ "false" == "true" ]]; then Mar 20 15:24:26 crc kubenswrapper[4779]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 20 15:24:26 crc kubenswrapper[4779]: fi Mar 20 15:24:26 crc kubenswrapper[4779]: Mar 20 15:24:26 crc kubenswrapper[4779]: persistent_ips_enabled_flag= Mar 20 15:24:26 crc kubenswrapper[4779]: if [[ "true" == "true" ]]; then Mar 20 15:24:26 crc kubenswrapper[4779]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 20 15:24:26 crc kubenswrapper[4779]: fi Mar 
20 15:24:26 crc kubenswrapper[4779]: Mar 20 15:24:26 crc kubenswrapper[4779]: # This is needed so that converting clusters from GA to TP Mar 20 15:24:26 crc kubenswrapper[4779]: # will rollout control plane pods as well Mar 20 15:24:26 crc kubenswrapper[4779]: network_segmentation_enabled_flag= Mar 20 15:24:26 crc kubenswrapper[4779]: multi_network_enabled_flag= Mar 20 15:24:26 crc kubenswrapper[4779]: if [[ "true" == "true" ]]; then Mar 20 15:24:26 crc kubenswrapper[4779]: multi_network_enabled_flag="--enable-multi-network" Mar 20 15:24:26 crc kubenswrapper[4779]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 20 15:24:26 crc kubenswrapper[4779]: fi Mar 20 15:24:26 crc kubenswrapper[4779]: Mar 20 15:24:26 crc kubenswrapper[4779]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 20 15:24:26 crc kubenswrapper[4779]: exec /usr/bin/ovnkube \ Mar 20 15:24:26 crc kubenswrapper[4779]: --enable-interconnect \ Mar 20 15:24:26 crc kubenswrapper[4779]: --init-cluster-manager "${K8S_NODE}" \ Mar 20 15:24:26 crc kubenswrapper[4779]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 20 15:24:26 crc kubenswrapper[4779]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 20 15:24:26 crc kubenswrapper[4779]: --metrics-bind-address "127.0.0.1:29108" \ Mar 20 15:24:26 crc kubenswrapper[4779]: --metrics-enable-pprof \ Mar 20 15:24:26 crc kubenswrapper[4779]: --metrics-enable-config-duration \ Mar 20 15:24:26 crc kubenswrapper[4779]: ${ovn_v4_join_subnet_opt} \ Mar 20 15:24:26 crc kubenswrapper[4779]: ${ovn_v6_join_subnet_opt} \ Mar 20 15:24:26 crc kubenswrapper[4779]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 20 15:24:26 crc kubenswrapper[4779]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 20 15:24:26 crc kubenswrapper[4779]: ${dns_name_resolver_enabled_flag} \ Mar 20 15:24:26 crc kubenswrapper[4779]: ${persistent_ips_enabled_flag} \ Mar 20 15:24:26 crc kubenswrapper[4779]: ${multi_network_enabled_flag} \ 
Mar 20 15:24:26 crc kubenswrapper[4779]: ${network_segmentation_enabled_flag} Mar 20 15:24:26 crc kubenswrapper[4779]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cxnrk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-92wlw_openshift-ovn-kubernetes(9addf988-9b4c-4e5e-a5fe-793bab35a52f): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 15:24:26 crc kubenswrapper[4779]: > 
logger="UnhandledError" Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.123213 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lkxk2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.123270 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.123436 4779 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 15:24:26 crc kubenswrapper[4779]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 15:24:26 crc kubenswrapper[4779]: if [[ -f "/env/_master" ]]; then Mar 20 15:24:26 crc kubenswrapper[4779]: set -o 
allexport Mar 20 15:24:26 crc kubenswrapper[4779]: source "/env/_master" Mar 20 15:24:26 crc kubenswrapper[4779]: set +o allexport Mar 20 15:24:26 crc kubenswrapper[4779]: fi Mar 20 15:24:26 crc kubenswrapper[4779]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 20 15:24:26 crc kubenswrapper[4779]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 20 15:24:26 crc kubenswrapper[4779]: ho_enable="--enable-hybrid-overlay" Mar 20 15:24:26 crc kubenswrapper[4779]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 20 15:24:26 crc kubenswrapper[4779]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 20 15:24:26 crc kubenswrapper[4779]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 20 15:24:26 crc kubenswrapper[4779]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 15:24:26 crc kubenswrapper[4779]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 20 15:24:26 crc kubenswrapper[4779]: --webhook-host=127.0.0.1 \ Mar 20 15:24:26 crc kubenswrapper[4779]: --webhook-port=9743 \ Mar 20 15:24:26 crc kubenswrapper[4779]: ${ho_enable} \ Mar 20 15:24:26 crc kubenswrapper[4779]: --enable-interconnect \ Mar 20 15:24:26 crc kubenswrapper[4779]: --disable-approver \ Mar 20 15:24:26 crc kubenswrapper[4779]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 20 15:24:26 crc kubenswrapper[4779]: --wait-for-kubernetes-api=200s \ Mar 20 15:24:26 crc kubenswrapper[4779]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 20 15:24:26 crc kubenswrapper[4779]: --loglevel="${LOGLEVEL}" Mar 20 15:24:26 crc kubenswrapper[4779]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 15:24:26 crc 
kubenswrapper[4779]: > logger="UnhandledError" Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.123901 4779 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 15:24:26 crc kubenswrapper[4779]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 20 15:24:26 crc kubenswrapper[4779]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 20 15:24:26 crc kubenswrapper[4779]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{
Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9n4l9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-lfj25_openshift-multus(c30ee189-9db1-41af-8a55-29955cbf6712): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 15:24:26 crc kubenswrapper[4779]: > logger="UnhandledError" Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.123930 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" podUID="9addf988-9b4c-4e5e-a5fe-793bab35a52f" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.124017 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9dnjh" event={"ID":"5293026d-bcf7-4270-8ad8-59a90e70ab1a","Type":"ContainerStarted","Data":"27fe22b7f2e63b8f004629562bd929e188eba50cf51b105100cb6efa6f6f67fc"} Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.124335 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.124988 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-lfj25" podUID="c30ee189-9db1-41af-8a55-29955cbf6712" Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.125066 4779 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 15:24:26 crc kubenswrapper[4779]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 20 15:24:26 crc kubenswrapper[4779]: set -uo pipefail Mar 20 15:24:26 crc kubenswrapper[4779]: Mar 20 15:24:26 crc kubenswrapper[4779]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 20 15:24:26 crc kubenswrapper[4779]: Mar 20 15:24:26 crc 
kubenswrapper[4779]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 20 15:24:26 crc kubenswrapper[4779]: HOSTS_FILE="/etc/hosts" Mar 20 15:24:26 crc kubenswrapper[4779]: TEMP_FILE="/etc/hosts.tmp" Mar 20 15:24:26 crc kubenswrapper[4779]: Mar 20 15:24:26 crc kubenswrapper[4779]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 20 15:24:26 crc kubenswrapper[4779]: Mar 20 15:24:26 crc kubenswrapper[4779]: # Make a temporary file with the old hosts file's attributes. Mar 20 15:24:26 crc kubenswrapper[4779]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 20 15:24:26 crc kubenswrapper[4779]: echo "Failed to preserve hosts file. Exiting." Mar 20 15:24:26 crc kubenswrapper[4779]: exit 1 Mar 20 15:24:26 crc kubenswrapper[4779]: fi Mar 20 15:24:26 crc kubenswrapper[4779]: Mar 20 15:24:26 crc kubenswrapper[4779]: while true; do Mar 20 15:24:26 crc kubenswrapper[4779]: declare -A svc_ips Mar 20 15:24:26 crc kubenswrapper[4779]: for svc in "${services[@]}"; do Mar 20 15:24:26 crc kubenswrapper[4779]: # Fetch service IP from cluster dns if present. We make several tries Mar 20 15:24:26 crc kubenswrapper[4779]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 20 15:24:26 crc kubenswrapper[4779]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 20 15:24:26 crc kubenswrapper[4779]: # support UDP loadbalancers and require reaching DNS through TCP. 
Mar 20 15:24:26 crc kubenswrapper[4779]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 15:24:26 crc kubenswrapper[4779]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 15:24:26 crc kubenswrapper[4779]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 15:24:26 crc kubenswrapper[4779]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 20 15:24:26 crc kubenswrapper[4779]: for i in ${!cmds[*]} Mar 20 15:24:26 crc kubenswrapper[4779]: do Mar 20 15:24:26 crc kubenswrapper[4779]: ips=($(eval "${cmds[i]}")) Mar 20 15:24:26 crc kubenswrapper[4779]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 20 15:24:26 crc kubenswrapper[4779]: svc_ips["${svc}"]="${ips[@]}" Mar 20 15:24:26 crc kubenswrapper[4779]: break Mar 20 15:24:26 crc kubenswrapper[4779]: fi Mar 20 15:24:26 crc kubenswrapper[4779]: done Mar 20 15:24:26 crc kubenswrapper[4779]: done Mar 20 15:24:26 crc kubenswrapper[4779]: Mar 20 15:24:26 crc kubenswrapper[4779]: # Update /etc/hosts only if we get valid service IPs Mar 20 15:24:26 crc kubenswrapper[4779]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 20 15:24:26 crc kubenswrapper[4779]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 20 15:24:26 crc kubenswrapper[4779]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 20 15:24:26 crc kubenswrapper[4779]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 20 15:24:26 crc kubenswrapper[4779]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 20 15:24:26 crc kubenswrapper[4779]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 20 15:24:26 crc kubenswrapper[4779]: sleep 60 & wait Mar 20 15:24:26 crc kubenswrapper[4779]: continue Mar 20 15:24:26 crc kubenswrapper[4779]: fi Mar 20 15:24:26 crc kubenswrapper[4779]: Mar 20 15:24:26 crc kubenswrapper[4779]: # Append resolver entries for services Mar 20 15:24:26 crc kubenswrapper[4779]: rc=0 Mar 20 15:24:26 crc kubenswrapper[4779]: for svc in "${!svc_ips[@]}"; do Mar 20 15:24:26 crc kubenswrapper[4779]: for ip in ${svc_ips[${svc}]}; do Mar 20 15:24:26 crc kubenswrapper[4779]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Mar 20 15:24:26 crc kubenswrapper[4779]: done Mar 20 15:24:26 crc kubenswrapper[4779]: done Mar 20 15:24:26 crc kubenswrapper[4779]: if [[ $rc -ne 0 ]]; then Mar 20 15:24:26 crc kubenswrapper[4779]: sleep 60 & wait Mar 20 15:24:26 crc kubenswrapper[4779]: continue Mar 20 15:24:26 crc kubenswrapper[4779]: fi Mar 20 15:24:26 crc kubenswrapper[4779]: Mar 20 15:24:26 crc kubenswrapper[4779]: Mar 20 15:24:26 crc kubenswrapper[4779]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 20 15:24:26 crc kubenswrapper[4779]: # Replace /etc/hosts with our modified version if needed Mar 20 15:24:26 crc kubenswrapper[4779]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 20 15:24:26 crc kubenswrapper[4779]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 20 15:24:26 crc kubenswrapper[4779]: fi Mar 20 15:24:26 crc kubenswrapper[4779]: sleep 60 & wait Mar 20 15:24:26 crc kubenswrapper[4779]: unset svc_ips Mar 20 15:24:26 crc kubenswrapper[4779]: done Mar 20 15:24:26 crc kubenswrapper[4779]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l595l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-9dnjh_openshift-dns(5293026d-bcf7-4270-8ad8-59a90e70ab1a): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 15:24:26 crc kubenswrapper[4779]: > logger="UnhandledError" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.125358 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" event={"ID":"e7a20ef3-86de-4db2-b500-63af002500b4","Type":"ContainerStarted","Data":"4355f2a998af3d5d89916df729e21b6d54fd4f0dadb81c8bac3bf477ce17bc8d"} Mar 20 15:24:26 crc 
kubenswrapper[4779]: E0320 15:24:26.125411 4779 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 15:24:26 crc kubenswrapper[4779]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 15:24:26 crc kubenswrapper[4779]: if [[ -f "/env/_master" ]]; then Mar 20 15:24:26 crc kubenswrapper[4779]: set -o allexport Mar 20 15:24:26 crc kubenswrapper[4779]: source "/env/_master" Mar 20 15:24:26 crc kubenswrapper[4779]: set +o allexport Mar 20 15:24:26 crc kubenswrapper[4779]: fi Mar 20 15:24:26 crc kubenswrapper[4779]: Mar 20 15:24:26 crc kubenswrapper[4779]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 20 15:24:26 crc kubenswrapper[4779]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 15:24:26 crc kubenswrapper[4779]: --disable-webhook \ Mar 20 15:24:26 crc kubenswrapper[4779]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 20 15:24:26 crc kubenswrapper[4779]: --loglevel="${LOGLEVEL}" Mar 20 15:24:26 crc kubenswrapper[4779]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 15:24:26 crc kubenswrapper[4779]: > logger="UnhandledError" Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.126179 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-9dnjh" podUID="5293026d-bcf7-4270-8ad8-59a90e70ab1a" Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.126364 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rnnk6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-clzkt_openshift-multus(e7a20ef3-86de-4db2-b500-63af002500b4): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.127529 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-clzkt" podUID="e7a20ef3-86de-4db2-b500-63af002500b4" Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.127542 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.128070 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.138382 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.150963 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.161167 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.166864 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.177722 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.185916 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.186315 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.186342 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.186355 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.186372 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.186382 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:26Z","lastTransitionTime":"2026-03-20T15:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.193167 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.200032 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with 
unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.208531 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.214435 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.219869 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.232287 4779 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.240668 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.247220 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.253063 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.265641 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.273493 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.280021 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.288171 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.288206 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.288216 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.288233 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.288244 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:26Z","lastTransitionTime":"2026-03-20T15:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.289898 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.298462 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.306811 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.313751 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.323185 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.331623 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.340644 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.348825 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.390054 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.390102 4779 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.390143 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.390159 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.390168 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:26Z","lastTransitionTime":"2026-03-20T15:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.492441 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.492484 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.492494 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.492512 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.492523 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:26Z","lastTransitionTime":"2026-03-20T15:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.509890 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.509959 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.509985 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.510019 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.510100 4779 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.510130 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:24:28.51008141 +0000 UTC m=+85.472597210 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.510166 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 15:24:28.510157211 +0000 UTC m=+85.472673011 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.510215 4779 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.510233 4779 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.510244 4779 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.510268 4779 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.510287 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 15:24:28.510275074 +0000 UTC m=+85.472790874 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.510382 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 15:24:28.510370946 +0000 UTC m=+85.472886746 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.594953 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.594992 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.595005 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.595018 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.595029 4779 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:26Z","lastTransitionTime":"2026-03-20T15:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.610639 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44bdd151-2a1e-4f14-a095-81b541307138-metrics-certs\") pod \"network-metrics-daemon-l4gtx\" (UID: \"44bdd151-2a1e-4f14-a095-81b541307138\") " pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.610694 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.610825 4779 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.610846 4779 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.610849 4779 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.610951 4779 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44bdd151-2a1e-4f14-a095-81b541307138-metrics-certs podName:44bdd151-2a1e-4f14-a095-81b541307138 nodeName:}" failed. No retries permitted until 2026-03-20 15:24:28.610925318 +0000 UTC m=+85.573441168 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/44bdd151-2a1e-4f14-a095-81b541307138-metrics-certs") pod "network-metrics-daemon-l4gtx" (UID: "44bdd151-2a1e-4f14-a095-81b541307138") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.610860 4779 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.611028 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 15:24:28.61101233 +0000 UTC m=+85.573528150 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.698208 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.698257 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.698268 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.698284 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.698296 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:26Z","lastTransitionTime":"2026-03-20T15:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.800913 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.800963 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.800978 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.800998 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.801015 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:26Z","lastTransitionTime":"2026-03-20T15:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.808564 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.808622 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.808653 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.808686 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.808579 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.808759 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.808798 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:24:26 crc kubenswrapper[4779]: E0320 15:24:26.808909 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.903696 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.903749 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.903761 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.903778 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:26 crc kubenswrapper[4779]: I0320 15:24:26.903885 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:26Z","lastTransitionTime":"2026-03-20T15:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.006496 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.006563 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.006586 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.006615 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.006636 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:27Z","lastTransitionTime":"2026-03-20T15:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.109830 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.109888 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.109904 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.109925 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.109942 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:27Z","lastTransitionTime":"2026-03-20T15:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.212417 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.212526 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.212548 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.212571 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.212588 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:27Z","lastTransitionTime":"2026-03-20T15:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.314583 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.314650 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.314672 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.314701 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.314722 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:27Z","lastTransitionTime":"2026-03-20T15:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.417015 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.417061 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.417073 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.417091 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.417104 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:27Z","lastTransitionTime":"2026-03-20T15:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.519601 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.519629 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.519636 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.519649 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.519658 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:27Z","lastTransitionTime":"2026-03-20T15:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.622190 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.622238 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.622250 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.622270 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.622282 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:27Z","lastTransitionTime":"2026-03-20T15:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.724848 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.724880 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.724889 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.724902 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.724912 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:27Z","lastTransitionTime":"2026-03-20T15:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.827581 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.827620 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.827628 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.827644 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.827653 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:27Z","lastTransitionTime":"2026-03-20T15:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.930399 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.930461 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.930477 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.930498 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:27 crc kubenswrapper[4779]: I0320 15:24:27.930513 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:27Z","lastTransitionTime":"2026-03-20T15:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.032898 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.033262 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.033397 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.033544 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.033668 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:28Z","lastTransitionTime":"2026-03-20T15:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.136037 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.136090 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.136126 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.136141 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.136150 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:28Z","lastTransitionTime":"2026-03-20T15:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.239056 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.239184 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.239211 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.239236 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.239253 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:28Z","lastTransitionTime":"2026-03-20T15:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.341620 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.341663 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.341674 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.341690 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.341702 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:28Z","lastTransitionTime":"2026-03-20T15:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.443611 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.443649 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.443664 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.443679 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.443689 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:28Z","lastTransitionTime":"2026-03-20T15:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.532313 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:24:28 crc kubenswrapper[4779]: E0320 15:24:28.532594 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 15:24:32.532550491 +0000 UTC m=+89.495066351 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.532970 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.533093 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.533292 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:24:28 crc kubenswrapper[4779]: E0320 15:24:28.533211 4779 secret.go:188] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 15:24:28 crc kubenswrapper[4779]: E0320 15:24:28.533575 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 15:24:32.533556586 +0000 UTC m=+89.496072396 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 15:24:28 crc kubenswrapper[4779]: E0320 15:24:28.533303 4779 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 15:24:28 crc kubenswrapper[4779]: E0320 15:24:28.533782 4779 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 15:24:28 crc kubenswrapper[4779]: E0320 15:24:28.533345 4779 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 15:24:28 crc kubenswrapper[4779]: E0320 15:24:28.533950 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 15:24:32.533927484 +0000 UTC m=+89.496443304 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 15:24:28 crc kubenswrapper[4779]: E0320 15:24:28.533862 4779 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 15:24:28 crc kubenswrapper[4779]: E0320 15:24:28.534000 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 15:24:32.533990947 +0000 UTC m=+89.496506767 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.545733 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.545771 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.545783 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.545800 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.545813 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:28Z","lastTransitionTime":"2026-03-20T15:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.634411 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44bdd151-2a1e-4f14-a095-81b541307138-metrics-certs\") pod \"network-metrics-daemon-l4gtx\" (UID: \"44bdd151-2a1e-4f14-a095-81b541307138\") " pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.634646 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:24:28 crc kubenswrapper[4779]: E0320 15:24:28.634504 4779 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 15:24:28 crc kubenswrapper[4779]: E0320 15:24:28.634890 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44bdd151-2a1e-4f14-a095-81b541307138-metrics-certs podName:44bdd151-2a1e-4f14-a095-81b541307138 nodeName:}" failed. No retries permitted until 2026-03-20 15:24:32.634874215 +0000 UTC m=+89.597390015 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/44bdd151-2a1e-4f14-a095-81b541307138-metrics-certs") pod "network-metrics-daemon-l4gtx" (UID: "44bdd151-2a1e-4f14-a095-81b541307138") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 15:24:28 crc kubenswrapper[4779]: E0320 15:24:28.634759 4779 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 15:24:28 crc kubenswrapper[4779]: E0320 15:24:28.635047 4779 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 15:24:28 crc kubenswrapper[4779]: E0320 15:24:28.635126 4779 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 15:24:28 crc kubenswrapper[4779]: E0320 15:24:28.635207 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 15:24:32.635199493 +0000 UTC m=+89.597715293 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.648853 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.649015 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.649130 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.649241 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.649405 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:28Z","lastTransitionTime":"2026-03-20T15:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.751510 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.752050 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.752213 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.752302 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.752381 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:28Z","lastTransitionTime":"2026-03-20T15:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.808079 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.808348 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.808277 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx"
Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.808248 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:24:28 crc kubenswrapper[4779]: E0320 15:24:28.808609 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:24:28 crc kubenswrapper[4779]: E0320 15:24:28.808685 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:24:28 crc kubenswrapper[4779]: E0320 15:24:28.808846 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138"
Mar 20 15:24:28 crc kubenswrapper[4779]: E0320 15:24:28.808909 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.854664 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.854696 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.854703 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.854715 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.854723 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:28Z","lastTransitionTime":"2026-03-20T15:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.956905 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.956936 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.956947 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.956961 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:24:28 crc kubenswrapper[4779]: I0320 15:24:28.956971 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:28Z","lastTransitionTime":"2026-03-20T15:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.058880 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.059145 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.059229 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.059297 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.059360 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:29Z","lastTransitionTime":"2026-03-20T15:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.162203 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.162252 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.162265 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.162285 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.162300 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:29Z","lastTransitionTime":"2026-03-20T15:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.264163 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.264196 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.264204 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.264219 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.264227 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:29Z","lastTransitionTime":"2026-03-20T15:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.366975 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.367048 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.367075 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.367150 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.367179 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:29Z","lastTransitionTime":"2026-03-20T15:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.469371 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.469416 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.469428 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.469444 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.469456 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:29Z","lastTransitionTime":"2026-03-20T15:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.571398 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.571663 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.571743 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.571932 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.572007 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:29Z","lastTransitionTime":"2026-03-20T15:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.674036 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.674065 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.674072 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.674085 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.674093 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:29Z","lastTransitionTime":"2026-03-20T15:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.776243 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.776294 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.776308 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.776328 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.776341 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:29Z","lastTransitionTime":"2026-03-20T15:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.878683 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.879248 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.879364 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.879457 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.879531 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:29Z","lastTransitionTime":"2026-03-20T15:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.982211 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.982245 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.982254 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.982268 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:24:29 crc kubenswrapper[4779]: I0320 15:24:29.982277 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:29Z","lastTransitionTime":"2026-03-20T15:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.084764 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.084994 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.085073 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.085161 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.085229 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:30Z","lastTransitionTime":"2026-03-20T15:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.187644 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.187949 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.188058 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.188323 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.188529 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:30Z","lastTransitionTime":"2026-03-20T15:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.290620 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.290883 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.290949 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.291015 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.291100 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:30Z","lastTransitionTime":"2026-03-20T15:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.394542 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.394587 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.394598 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.394611 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.394620 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:30Z","lastTransitionTime":"2026-03-20T15:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.497491 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.497534 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.497543 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.497559 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.497572 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:30Z","lastTransitionTime":"2026-03-20T15:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.599624 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.600074 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.600176 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.600264 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.600347 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:30Z","lastTransitionTime":"2026-03-20T15:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.702648 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.702889 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.702976 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.703058 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.703143 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:30Z","lastTransitionTime":"2026-03-20T15:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.805445 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.805696 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.805785 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.805868 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.805947 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:30Z","lastTransitionTime":"2026-03-20T15:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.807703 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:24:30 crc kubenswrapper[4779]: E0320 15:24:30.807946 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.808342 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx"
Mar 20 15:24:30 crc kubenswrapper[4779]: E0320 15:24:30.808488 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.808624 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:24:30 crc kubenswrapper[4779]: E0320 15:24:30.808749 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.808698 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:24:30 crc kubenswrapper[4779]: E0320 15:24:30.808961 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.908838 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.908880 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.908889 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.908906 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:24:30 crc kubenswrapper[4779]: I0320 15:24:30.908920 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:30Z","lastTransitionTime":"2026-03-20T15:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.011135 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.011324 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.011380 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.011440 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.011536 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:31Z","lastTransitionTime":"2026-03-20T15:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.114051 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.114320 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.114428 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.114513 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.114593 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:31Z","lastTransitionTime":"2026-03-20T15:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.217129 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.217183 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.217191 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.217203 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.217210 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:31Z","lastTransitionTime":"2026-03-20T15:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.320139 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.320416 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.320512 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.320638 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.320725 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:31Z","lastTransitionTime":"2026-03-20T15:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.423678 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.423725 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.423736 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.423749 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.423758 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:31Z","lastTransitionTime":"2026-03-20T15:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.527367 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.527461 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.527472 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.527494 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.527505 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:31Z","lastTransitionTime":"2026-03-20T15:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.631536 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.631866 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.632043 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.632246 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.632415 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:31Z","lastTransitionTime":"2026-03-20T15:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.734688 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.734746 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.734762 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.734784 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.734796 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:31Z","lastTransitionTime":"2026-03-20T15:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.836702 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.836761 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.836775 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.836794 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.836810 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:31Z","lastTransitionTime":"2026-03-20T15:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.939198 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.939238 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.939251 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.939266 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:31 crc kubenswrapper[4779]: I0320 15:24:31.939278 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:31Z","lastTransitionTime":"2026-03-20T15:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.041781 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.041818 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.041829 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.041844 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.041853 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:32Z","lastTransitionTime":"2026-03-20T15:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.144047 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.144079 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.144087 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.144099 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.144138 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:32Z","lastTransitionTime":"2026-03-20T15:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.246911 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.246980 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.247005 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.247037 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.247059 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:32Z","lastTransitionTime":"2026-03-20T15:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.349842 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.349896 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.349912 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.349933 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.349948 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:32Z","lastTransitionTime":"2026-03-20T15:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.452687 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.452733 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.452744 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.452763 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.452775 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:32Z","lastTransitionTime":"2026-03-20T15:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.555483 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.555528 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.555543 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.555562 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.555574 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:32Z","lastTransitionTime":"2026-03-20T15:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.570921 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.571026 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.571058 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.571092 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:24:32 crc kubenswrapper[4779]: E0320 15:24:32.571183 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:24:40.571166823 +0000 UTC m=+97.533682623 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:24:32 crc kubenswrapper[4779]: E0320 15:24:32.571237 4779 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 15:24:32 crc kubenswrapper[4779]: E0320 15:24:32.571243 4779 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 15:24:32 crc kubenswrapper[4779]: E0320 15:24:32.571254 4779 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 15:24:32 crc kubenswrapper[4779]: E0320 15:24:32.571276 4779 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 15:24:32 crc kubenswrapper[4779]: E0320 15:24:32.571288 4779 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 15:24:32 crc kubenswrapper[4779]: 
E0320 15:24:32.571265 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 15:24:40.571258715 +0000 UTC m=+97.533774515 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 15:24:32 crc kubenswrapper[4779]: E0320 15:24:32.571317 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 15:24:40.571299986 +0000 UTC m=+97.533815786 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 15:24:32 crc kubenswrapper[4779]: E0320 15:24:32.571329 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 15:24:40.571323727 +0000 UTC m=+97.533839527 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.657516 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.657551 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.657560 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.657576 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.657587 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:32Z","lastTransitionTime":"2026-03-20T15:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.672303 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44bdd151-2a1e-4f14-a095-81b541307138-metrics-certs\") pod \"network-metrics-daemon-l4gtx\" (UID: \"44bdd151-2a1e-4f14-a095-81b541307138\") " pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.672370 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:24:32 crc kubenswrapper[4779]: E0320 15:24:32.672475 4779 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 15:24:32 crc kubenswrapper[4779]: E0320 15:24:32.672525 4779 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 15:24:32 crc kubenswrapper[4779]: E0320 15:24:32.672541 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44bdd151-2a1e-4f14-a095-81b541307138-metrics-certs podName:44bdd151-2a1e-4f14-a095-81b541307138 nodeName:}" failed. No retries permitted until 2026-03-20 15:24:40.672521083 +0000 UTC m=+97.635036983 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/44bdd151-2a1e-4f14-a095-81b541307138-metrics-certs") pod "network-metrics-daemon-l4gtx" (UID: "44bdd151-2a1e-4f14-a095-81b541307138") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 15:24:32 crc kubenswrapper[4779]: E0320 15:24:32.672546 4779 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 15:24:32 crc kubenswrapper[4779]: E0320 15:24:32.672561 4779 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 15:24:32 crc kubenswrapper[4779]: E0320 15:24:32.672611 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 15:24:40.672595395 +0000 UTC m=+97.635111285 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.760446 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.760485 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.760501 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.760516 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.760526 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:32Z","lastTransitionTime":"2026-03-20T15:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.794703 4779 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.808786 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.808795 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.808840 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.808891 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:24:32 crc kubenswrapper[4779]: E0320 15:24:32.809042 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:24:32 crc kubenswrapper[4779]: E0320 15:24:32.809209 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:24:32 crc kubenswrapper[4779]: E0320 15:24:32.809330 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:24:32 crc kubenswrapper[4779]: E0320 15:24:32.809559 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.863179 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.863232 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.863273 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.863303 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.863312 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:32Z","lastTransitionTime":"2026-03-20T15:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.966053 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.966130 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.966142 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.966156 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:32 crc kubenswrapper[4779]: I0320 15:24:32.966182 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:32Z","lastTransitionTime":"2026-03-20T15:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.069192 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.069301 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.069332 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.069365 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.069386 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:33Z","lastTransitionTime":"2026-03-20T15:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.172703 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.172753 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.172766 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.172786 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.172798 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:33Z","lastTransitionTime":"2026-03-20T15:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.275301 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.275351 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.275362 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.275378 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.275388 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:33Z","lastTransitionTime":"2026-03-20T15:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.377739 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.377776 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.377785 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.377799 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.377808 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:33Z","lastTransitionTime":"2026-03-20T15:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.480625 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.480654 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.480662 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.480676 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.480685 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:33Z","lastTransitionTime":"2026-03-20T15:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.582513 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.582570 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.582584 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.582602 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.582614 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:33Z","lastTransitionTime":"2026-03-20T15:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.684017 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.684065 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.684076 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.684095 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.684134 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:33Z","lastTransitionTime":"2026-03-20T15:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.785822 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.785857 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.785865 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.785881 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.785890 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:33Z","lastTransitionTime":"2026-03-20T15:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.817194 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.828677 4779 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.838667 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.848336 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.857870 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.864852 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.876415 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.884666 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.889884 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.890019 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.890093 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.890205 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.890280 4779 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:33Z","lastTransitionTime":"2026-03-20T15:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.892563 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.902823 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.914005 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.927956 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.938064 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.954955 4779 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.991948 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.991999 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.992009 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.992025 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:33 crc kubenswrapper[4779]: I0320 15:24:33.992037 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:33Z","lastTransitionTime":"2026-03-20T15:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.037986 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.038043 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.038053 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.038069 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.038080 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:34Z","lastTransitionTime":"2026-03-20T15:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:34 crc kubenswrapper[4779]: E0320 15:24:34.049480 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.052970 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.052995 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.053003 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.053016 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.053025 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:34Z","lastTransitionTime":"2026-03-20T15:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:34 crc kubenswrapper[4779]: E0320 15:24:34.062420 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.067504 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.067537 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.067546 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.067561 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.067572 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:34Z","lastTransitionTime":"2026-03-20T15:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:34 crc kubenswrapper[4779]: E0320 15:24:34.078727 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.082970 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.083079 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.083178 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.083252 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.083316 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:34Z","lastTransitionTime":"2026-03-20T15:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:34 crc kubenswrapper[4779]: E0320 15:24:34.093500 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.097183 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.097392 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.097483 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.097573 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.097665 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:34Z","lastTransitionTime":"2026-03-20T15:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:34 crc kubenswrapper[4779]: E0320 15:24:34.107740 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:34 crc kubenswrapper[4779]: E0320 15:24:34.107858 4779 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.109166 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.109195 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.109204 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.109216 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.109225 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:34Z","lastTransitionTime":"2026-03-20T15:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.212060 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.212373 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.212449 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.212532 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.212610 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:34Z","lastTransitionTime":"2026-03-20T15:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.315074 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.315184 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.315211 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.315239 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.315262 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:34Z","lastTransitionTime":"2026-03-20T15:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.418714 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.419247 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.419430 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.419624 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.419825 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:34Z","lastTransitionTime":"2026-03-20T15:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.523214 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.523255 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.523264 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.523277 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.523286 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:34Z","lastTransitionTime":"2026-03-20T15:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.626041 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.626509 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.626622 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.626749 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.626820 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:34Z","lastTransitionTime":"2026-03-20T15:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.729631 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.730064 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.730156 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.730236 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.730302 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:34Z","lastTransitionTime":"2026-03-20T15:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.807879 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.808014 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.807879 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:24:34 crc kubenswrapper[4779]: E0320 15:24:34.808163 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:24:34 crc kubenswrapper[4779]: E0320 15:24:34.808105 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:24:34 crc kubenswrapper[4779]: E0320 15:24:34.808366 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.808990 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:24:34 crc kubenswrapper[4779]: E0320 15:24:34.809299 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.834740 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.834789 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.834809 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.834834 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.834853 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:34Z","lastTransitionTime":"2026-03-20T15:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.937570 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.937621 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.937634 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.937653 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:34 crc kubenswrapper[4779]: I0320 15:24:34.937670 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:34Z","lastTransitionTime":"2026-03-20T15:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.040376 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.040418 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.040432 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.040452 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.040466 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:35Z","lastTransitionTime":"2026-03-20T15:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.144217 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.144267 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.144280 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.144302 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.144317 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:35Z","lastTransitionTime":"2026-03-20T15:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.247582 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.247645 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.247654 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.247669 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.247682 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:35Z","lastTransitionTime":"2026-03-20T15:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.351095 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.351162 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.351178 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.351201 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.351214 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:35Z","lastTransitionTime":"2026-03-20T15:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.454336 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.454413 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.454438 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.454465 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.454482 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:35Z","lastTransitionTime":"2026-03-20T15:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.557485 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.557549 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.557561 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.557581 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.557594 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:35Z","lastTransitionTime":"2026-03-20T15:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.660720 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.660789 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.660805 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.660828 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.660842 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:35Z","lastTransitionTime":"2026-03-20T15:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.763652 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.763740 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.763767 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.763802 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.763828 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:35Z","lastTransitionTime":"2026-03-20T15:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.867319 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.867358 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.867367 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.867382 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.867393 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:35Z","lastTransitionTime":"2026-03-20T15:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.971006 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.971050 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.971059 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.971076 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:35 crc kubenswrapper[4779]: I0320 15:24:35.971086 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:35Z","lastTransitionTime":"2026-03-20T15:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.075190 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.075259 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.075290 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.075313 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.075335 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:36Z","lastTransitionTime":"2026-03-20T15:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.179538 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.179592 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.179603 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.179622 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.179634 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:36Z","lastTransitionTime":"2026-03-20T15:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.283041 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.283141 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.283158 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.283183 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.283201 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:36Z","lastTransitionTime":"2026-03-20T15:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.386147 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.386207 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.386219 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.386239 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.386252 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:36Z","lastTransitionTime":"2026-03-20T15:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.488999 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.489048 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.489057 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.489076 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.489090 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:36Z","lastTransitionTime":"2026-03-20T15:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.591719 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.591778 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.591795 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.591818 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.591834 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:36Z","lastTransitionTime":"2026-03-20T15:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.694407 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.694483 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.694504 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.694537 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.694561 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:36Z","lastTransitionTime":"2026-03-20T15:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.796948 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.796981 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.796988 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.797004 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.797013 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:36Z","lastTransitionTime":"2026-03-20T15:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.808258 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.808307 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:24:36 crc kubenswrapper[4779]: E0320 15:24:36.808370 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.808630 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.808710 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:24:36 crc kubenswrapper[4779]: E0320 15:24:36.808792 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:24:36 crc kubenswrapper[4779]: E0320 15:24:36.808987 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:24:36 crc kubenswrapper[4779]: E0320 15:24:36.809059 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:24:36 crc kubenswrapper[4779]: E0320 15:24:36.809694 4779 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 15:24:36 crc kubenswrapper[4779]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 20 15:24:36 crc kubenswrapper[4779]: while [ true ]; Mar 20 15:24:36 crc kubenswrapper[4779]: do Mar 20 15:24:36 crc kubenswrapper[4779]: for f in $(ls /tmp/serviceca); do Mar 20 15:24:36 crc kubenswrapper[4779]: echo $f Mar 20 15:24:36 crc kubenswrapper[4779]: ca_file_path="/tmp/serviceca/${f}" Mar 20 15:24:36 crc kubenswrapper[4779]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 20 15:24:36 crc kubenswrapper[4779]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 20 15:24:36 crc kubenswrapper[4779]: if [ -e "${reg_dir_path}" ]; then Mar 20 15:24:36 crc kubenswrapper[4779]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 20 15:24:36 crc kubenswrapper[4779]: else Mar 20 15:24:36 crc kubenswrapper[4779]: mkdir $reg_dir_path Mar 20 15:24:36 crc kubenswrapper[4779]: cp $ca_file_path $reg_dir_path/ca.crt Mar 20 15:24:36 crc kubenswrapper[4779]: fi Mar 20 15:24:36 crc kubenswrapper[4779]: done Mar 20 15:24:36 crc kubenswrapper[4779]: for d in $(ls /etc/docker/certs.d); do Mar 20 15:24:36 crc kubenswrapper[4779]: echo $d Mar 20 15:24:36 crc kubenswrapper[4779]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 20 15:24:36 crc kubenswrapper[4779]: reg_conf_path="/tmp/serviceca/${dp}" Mar 20 15:24:36 crc kubenswrapper[4779]: if [ ! 
-e "${reg_conf_path}" ]; then Mar 20 15:24:36 crc kubenswrapper[4779]: rm -rf /etc/docker/certs.d/$d Mar 20 15:24:36 crc kubenswrapper[4779]: fi Mar 20 15:24:36 crc kubenswrapper[4779]: done Mar 20 15:24:36 crc kubenswrapper[4779]: sleep 60 & wait ${!} Mar 20 15:24:36 crc kubenswrapper[4779]: done Mar 20 15:24:36 crc kubenswrapper[4779]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qkfvw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-l9j6r_openshift-image-registry(46120a91-b00c-4299-b552-a374d2a78726): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 15:24:36 crc kubenswrapper[4779]: > logger="UnhandledError" Mar 20 15:24:36 crc kubenswrapper[4779]: E0320 15:24:36.809685 4779 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lkxk2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 15:24:36 crc kubenswrapper[4779]: E0320 15:24:36.810937 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-l9j6r" podUID="46120a91-b00c-4299-b552-a374d2a78726" Mar 20 15:24:36 crc kubenswrapper[4779]: E0320 15:24:36.811418 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml 
--tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lkxk2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 15:24:36 crc kubenswrapper[4779]: E0320 15:24:36.812608 4779 
pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.899006 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.899057 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.899072 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.899089 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:36 crc kubenswrapper[4779]: I0320 15:24:36.899101 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:36Z","lastTransitionTime":"2026-03-20T15:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.001916 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.001952 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.001960 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.001977 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.001987 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:37Z","lastTransitionTime":"2026-03-20T15:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.104957 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.105520 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.105546 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.105560 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.105569 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:37Z","lastTransitionTime":"2026-03-20T15:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.207068 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.207128 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.207141 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.207155 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.207166 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:37Z","lastTransitionTime":"2026-03-20T15:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.309096 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.309151 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.309162 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.309177 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.309189 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:37Z","lastTransitionTime":"2026-03-20T15:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.412163 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.412206 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.412237 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.412254 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.412265 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:37Z","lastTransitionTime":"2026-03-20T15:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.469952 4779 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.514161 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.514251 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.514266 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.514290 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.514305 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:37Z","lastTransitionTime":"2026-03-20T15:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.616734 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.616763 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.616770 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.616783 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.616793 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:37Z","lastTransitionTime":"2026-03-20T15:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.719290 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.719334 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.719344 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.719359 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.719368 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:37Z","lastTransitionTime":"2026-03-20T15:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.821305 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.821338 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.821346 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.821360 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.821373 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:37Z","lastTransitionTime":"2026-03-20T15:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.924361 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.924629 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.924722 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.924942 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:37 crc kubenswrapper[4779]: I0320 15:24:37.925098 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:37Z","lastTransitionTime":"2026-03-20T15:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.028490 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.028572 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.028590 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.028614 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.028631 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:38Z","lastTransitionTime":"2026-03-20T15:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.132261 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.134213 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.134258 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.134283 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.134303 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:38Z","lastTransitionTime":"2026-03-20T15:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.236492 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.236531 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.236541 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.236556 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.236566 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:38Z","lastTransitionTime":"2026-03-20T15:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.339025 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.339063 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.339072 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.339089 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.339099 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:38Z","lastTransitionTime":"2026-03-20T15:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.441444 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.441481 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.441490 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.441506 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.441521 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:38Z","lastTransitionTime":"2026-03-20T15:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.544466 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.544538 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.544553 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.544591 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.544609 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:38Z","lastTransitionTime":"2026-03-20T15:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.647047 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.647082 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.647092 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.647105 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.647131 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:38Z","lastTransitionTime":"2026-03-20T15:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.749584 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.749621 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.749630 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.749644 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.749653 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:38Z","lastTransitionTime":"2026-03-20T15:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.808416 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.808459 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.808516 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.808628 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:24:38 crc kubenswrapper[4779]: E0320 15:24:38.808834 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:24:38 crc kubenswrapper[4779]: E0320 15:24:38.808951 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:24:38 crc kubenswrapper[4779]: E0320 15:24:38.809059 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:24:38 crc kubenswrapper[4779]: E0320 15:24:38.809214 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.827890 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.827989 4779 scope.go:117] "RemoveContainer" containerID="8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9" Mar 20 15:24:38 crc kubenswrapper[4779]: E0320 15:24:38.828229 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.852221 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.852270 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.852282 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.852300 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.852314 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:38Z","lastTransitionTime":"2026-03-20T15:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.954089 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.954146 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.954159 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.954176 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:38 crc kubenswrapper[4779]: I0320 15:24:38.954188 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:38Z","lastTransitionTime":"2026-03-20T15:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.056455 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.056488 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.056498 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.056513 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.056524 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:39Z","lastTransitionTime":"2026-03-20T15:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.158233 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.158282 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.158294 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.158311 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.158324 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:39Z","lastTransitionTime":"2026-03-20T15:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.159100 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" event={"ID":"9addf988-9b4c-4e5e-a5fe-793bab35a52f","Type":"ContainerStarted","Data":"0ae77685b6bc2d9aea06786fd6132dd0fc7dee57d6e9a139e4118d33a787cb01"} Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.159166 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" event={"ID":"9addf988-9b4c-4e5e-a5fe-793bab35a52f","Type":"ContainerStarted","Data":"54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503"} Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.159765 4779 scope.go:117] "RemoveContainer" containerID="8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9" Mar 20 15:24:39 crc kubenswrapper[4779]: E0320 15:24:39.159951 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.170599 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.183616 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7de
e57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.192239 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.200583 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.210916 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.220757 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.230560 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.239920 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.247354 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.255286 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.260708 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.260753 4779 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.260768 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.260783 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.260793 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:39Z","lastTransitionTime":"2026-03-20T15:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.262412 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.269231 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.283076 4779 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.292618 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.302775 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.363260 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.363302 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.363312 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.363326 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.363336 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:39Z","lastTransitionTime":"2026-03-20T15:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.465128 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.465157 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.465166 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.465180 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.465190 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:39Z","lastTransitionTime":"2026-03-20T15:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.567357 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.567419 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.567432 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.567448 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.567459 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:39Z","lastTransitionTime":"2026-03-20T15:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.669773 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.669809 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.669847 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.669861 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.669873 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:39Z","lastTransitionTime":"2026-03-20T15:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.771880 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.771918 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.771928 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.771942 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.771953 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:39Z","lastTransitionTime":"2026-03-20T15:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.874754 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.875321 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.875331 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.875346 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.875355 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:39Z","lastTransitionTime":"2026-03-20T15:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.977597 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.977645 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.977659 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.977678 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:39 crc kubenswrapper[4779]: I0320 15:24:39.977691 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:39Z","lastTransitionTime":"2026-03-20T15:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.080588 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.080630 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.080637 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.080652 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.080662 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:40Z","lastTransitionTime":"2026-03-20T15:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.162471 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" event={"ID":"e7a20ef3-86de-4db2-b500-63af002500b4","Type":"ContainerStarted","Data":"476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5"} Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.164075 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9dnjh" event={"ID":"5293026d-bcf7-4270-8ad8-59a90e70ab1a","Type":"ContainerStarted","Data":"f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d"} Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.166439 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5"} Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.175069 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.182692 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.182741 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:40 crc 
kubenswrapper[4779]: I0320 15:24:40.182752 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.182767 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.182776 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:40Z","lastTransitionTime":"2026-03-20T15:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.184618 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.193858 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.202067 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.212496 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\"
:\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPa
th\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.219551 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.226958 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.236943 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.246528 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.255843 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.262861 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.272361 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.284996 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.285030 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.285038 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.285051 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.285061 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:40Z","lastTransitionTime":"2026-03-20T15:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.286892 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.294822 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.302806 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7de
e57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.309460 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7de
e57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.318570 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.328360 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.338818 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.350272 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.357098 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.368503 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.376330 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.382863 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.386767 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.386795 4779 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.386803 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.386818 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.386830 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:40Z","lastTransitionTime":"2026-03-20T15:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.389661 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.397762 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.411261 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.416970 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.426829 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.450149 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.488594 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.488621 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.488629 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.488642 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.488650 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:40Z","lastTransitionTime":"2026-03-20T15:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.591055 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.591083 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.591090 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.591103 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.591140 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:40Z","lastTransitionTime":"2026-03-20T15:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.657404 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.657559 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.657623 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.657720 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:24:40 crc kubenswrapper[4779]: E0320 15:24:40.657918 4779 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 15:24:40 crc 
kubenswrapper[4779]: E0320 15:24:40.657964 4779 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 15:24:40 crc kubenswrapper[4779]: E0320 15:24:40.657989 4779 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 15:24:40 crc kubenswrapper[4779]: E0320 15:24:40.658074 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 15:24:56.658046456 +0000 UTC m=+113.620562296 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 15:24:40 crc kubenswrapper[4779]: E0320 15:24:40.658675 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:24:56.65865606 +0000 UTC m=+113.621171900 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:24:40 crc kubenswrapper[4779]: E0320 15:24:40.658761 4779 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 15:24:40 crc kubenswrapper[4779]: E0320 15:24:40.658846 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 15:24:56.658825194 +0000 UTC m=+113.621341034 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 15:24:40 crc kubenswrapper[4779]: E0320 15:24:40.658962 4779 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 15:24:40 crc kubenswrapper[4779]: E0320 15:24:40.659024 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-20 15:24:56.659007698 +0000 UTC m=+113.621523538 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.693149 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.693188 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.693198 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.693213 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.693224 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:40Z","lastTransitionTime":"2026-03-20T15:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.758407 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44bdd151-2a1e-4f14-a095-81b541307138-metrics-certs\") pod \"network-metrics-daemon-l4gtx\" (UID: \"44bdd151-2a1e-4f14-a095-81b541307138\") " pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.758460 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:24:40 crc kubenswrapper[4779]: E0320 15:24:40.758572 4779 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 15:24:40 crc kubenswrapper[4779]: E0320 15:24:40.758591 4779 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 15:24:40 crc kubenswrapper[4779]: E0320 15:24:40.758617 4779 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 15:24:40 crc kubenswrapper[4779]: E0320 15:24:40.758661 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" 
failed. No retries permitted until 2026-03-20 15:24:56.758647008 +0000 UTC m=+113.721162808 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 15:24:40 crc kubenswrapper[4779]: E0320 15:24:40.758957 4779 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 15:24:40 crc kubenswrapper[4779]: E0320 15:24:40.758988 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44bdd151-2a1e-4f14-a095-81b541307138-metrics-certs podName:44bdd151-2a1e-4f14-a095-81b541307138 nodeName:}" failed. No retries permitted until 2026-03-20 15:24:56.758980146 +0000 UTC m=+113.721495946 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/44bdd151-2a1e-4f14-a095-81b541307138-metrics-certs") pod "network-metrics-daemon-l4gtx" (UID: "44bdd151-2a1e-4f14-a095-81b541307138") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.795436 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.795464 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.795472 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.795485 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.795493 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:40Z","lastTransitionTime":"2026-03-20T15:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.808864 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:24:40 crc kubenswrapper[4779]: E0320 15:24:40.808940 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.809259 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:24:40 crc kubenswrapper[4779]: E0320 15:24:40.809324 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.809765 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:24:40 crc kubenswrapper[4779]: E0320 15:24:40.809834 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.810218 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:24:40 crc kubenswrapper[4779]: E0320 15:24:40.810294 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.898035 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.898068 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.898078 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.898092 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:40 crc kubenswrapper[4779]: I0320 15:24:40.898118 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:40Z","lastTransitionTime":"2026-03-20T15:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.000230 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.000267 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.000278 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.000293 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.000304 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:41Z","lastTransitionTime":"2026-03-20T15:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.102276 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.102306 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.102313 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.102325 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.102335 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:41Z","lastTransitionTime":"2026-03-20T15:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.170339 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d29c643f492e62e7afcf5f6c5e0617ac37b341cd15574c56c3a930d6ad0fcc88"} Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.170393 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"8ae75ac6bc68bfaafe8430d20a29cb0b21e0a62ad4fe4784ad593c0d7bd91352"} Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.172277 4779 generic.go:334] "Generic (PLEG): container finished" podID="e7a20ef3-86de-4db2-b500-63af002500b4" containerID="476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5" exitCode=0 Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.172325 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" event={"ID":"e7a20ef3-86de-4db2-b500-63af002500b4","Type":"ContainerDied","Data":"476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5"} Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.174390 4779 generic.go:334] "Generic (PLEG): container finished" podID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerID="ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874" exitCode=0 Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.174419 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" event={"ID":"f27b5011-2d73-40e1-b508-a10e9c6f19a8","Type":"ContainerDied","Data":"ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874"} Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.192884 4779 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.208407 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7de
e57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.209690 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.209748 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.209766 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:41 crc 
kubenswrapper[4779]: I0320 15:24:41.209790 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.209809 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:41Z","lastTransitionTime":"2026-03-20T15:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.224982 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.236247 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c643f492e62e7afcf5f6c5e0617ac37b341cd15574c56c3a930d6ad0fcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae75ac6bc68bfaafe8430d20a29cb0b21e0a62ad4fe4784ad593c0d7bd91352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.247678 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.258626 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.274984 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.284003 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.292698 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.303319 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.314322 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.314968 4779 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.314997 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.315032 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.315048 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:41Z","lastTransitionTime":"2026-03-20T15:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.314675 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.326794 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.336824 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.346019 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.359503 4779 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.367887 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.376020 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.382765 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.388827 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.394739 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.407489 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.418559 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.418588 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.418595 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.418608 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.418617 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:41Z","lastTransitionTime":"2026-03-20T15:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.418660 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.430747 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.439604 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.446660 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7de
e57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.455222 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.462003 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.470971 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.481275 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.489789 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c643f492e62e7afcf5f6c5e0617ac37b341cd15574c56c3a930d6ad0fcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae75ac6bc68bfaafe8430d20a29cb0b21e0a62ad4fe4784ad593c0d7bd91352\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.520842 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.520875 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.520883 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.520905 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.520914 4779 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:41Z","lastTransitionTime":"2026-03-20T15:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.622831 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.622861 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.622870 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.622882 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.622891 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:41Z","lastTransitionTime":"2026-03-20T15:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.726102 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.726377 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.726499 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.726600 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.726724 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:41Z","lastTransitionTime":"2026-03-20T15:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.829792 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.829846 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.829858 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.829876 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.829889 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:41Z","lastTransitionTime":"2026-03-20T15:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.933609 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.933643 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.933652 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.933667 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:41 crc kubenswrapper[4779]: I0320 15:24:41.933676 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:41Z","lastTransitionTime":"2026-03-20T15:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.036351 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.036599 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.036609 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.036623 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.036634 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:42Z","lastTransitionTime":"2026-03-20T15:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.138659 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.138694 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.138702 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.138717 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.138729 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:42Z","lastTransitionTime":"2026-03-20T15:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.180400 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lfj25" event={"ID":"c30ee189-9db1-41af-8a55-29955cbf6712","Type":"ContainerStarted","Data":"e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6"} Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.182373 4779 generic.go:334] "Generic (PLEG): container finished" podID="e7a20ef3-86de-4db2-b500-63af002500b4" containerID="20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73" exitCode=0 Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.182446 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" event={"ID":"e7a20ef3-86de-4db2-b500-63af002500b4","Type":"ContainerDied","Data":"20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73"} Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.190872 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" event={"ID":"f27b5011-2d73-40e1-b508-a10e9c6f19a8","Type":"ContainerStarted","Data":"df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047"} Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.190998 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" event={"ID":"f27b5011-2d73-40e1-b508-a10e9c6f19a8","Type":"ContainerStarted","Data":"e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a"} Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.191077 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" event={"ID":"f27b5011-2d73-40e1-b508-a10e9c6f19a8","Type":"ContainerStarted","Data":"e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5"} Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.191171 4779 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" event={"ID":"f27b5011-2d73-40e1-b508-a10e9c6f19a8","Type":"ContainerStarted","Data":"17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82"} Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.191233 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" event={"ID":"f27b5011-2d73-40e1-b508-a10e9c6f19a8","Type":"ContainerStarted","Data":"76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81"} Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.191288 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" event={"ID":"f27b5011-2d73-40e1-b508-a10e9c6f19a8","Type":"ContainerStarted","Data":"aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681"} Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.198189 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.209518 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7de
e57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.229842 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c643f492e62e7afcf5f6c5e0617ac37b341cd15574c56c3a930d6ad0fcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae75ac6bc68bfaafe8430d20a29cb0b21e0a62ad4fe4784ad593c0d7bd91352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.243652 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.245871 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 
15:24:42.245908 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.245917 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.245931 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.245941 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:42Z","lastTransitionTime":"2026-03-20T15:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.254362 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.274295 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.290959 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.303101 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.315220 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.325377 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.336703 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\
\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.346159 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.348891 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.348942 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.348951 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.348975 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.348990 4779 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:42Z","lastTransitionTime":"2026-03-20T15:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.356325 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:42 crc 
kubenswrapper[4779]: I0320 15:24:42.373991 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.389765 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.402629 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.416682 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c643f492e62e7afcf5f6c5e0617ac37b341cd15574c56c3a930d6ad0fcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae75ac6bc68bfaafe8430d20a29cb0b21e0a62ad4fe4784ad593c0d7bd91352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.432210 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.448567 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.451331 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.451360 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.451371 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.451383 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.451393 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:42Z","lastTransitionTime":"2026-03-20T15:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.461771 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.476711 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.486756 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.497298 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.508858 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kub
e-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha2
56:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0320 15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.519854 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.529133 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.537731 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.552203 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.553352 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.553414 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.553427 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.553445 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.553458 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:42Z","lastTransitionTime":"2026-03-20T15:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.563371 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.573027 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7de
e57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.655926 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.655958 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.655970 4779 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.655984 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.655995 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:42Z","lastTransitionTime":"2026-03-20T15:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.758166 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.758200 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.758211 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.758227 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.758239 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:42Z","lastTransitionTime":"2026-03-20T15:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.807895 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:24:42 crc kubenswrapper[4779]: E0320 15:24:42.808134 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.807922 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.807929 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.807896 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:24:42 crc kubenswrapper[4779]: E0320 15:24:42.808467 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:24:42 crc kubenswrapper[4779]: E0320 15:24:42.808575 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:24:42 crc kubenswrapper[4779]: E0320 15:24:42.808300 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.861041 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.861093 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.861139 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.861164 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.861183 4779 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:42Z","lastTransitionTime":"2026-03-20T15:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.964074 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.964140 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.964150 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.964170 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:42 crc kubenswrapper[4779]: I0320 15:24:42.964181 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:42Z","lastTransitionTime":"2026-03-20T15:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.066965 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.067035 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.067050 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.067070 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.067087 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:43Z","lastTransitionTime":"2026-03-20T15:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.169461 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.169505 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.169516 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.169532 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.169543 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:43Z","lastTransitionTime":"2026-03-20T15:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.196307 4779 generic.go:334] "Generic (PLEG): container finished" podID="e7a20ef3-86de-4db2-b500-63af002500b4" containerID="e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf" exitCode=0 Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.196373 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" event={"ID":"e7a20ef3-86de-4db2-b500-63af002500b4","Type":"ContainerDied","Data":"e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf"} Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.197711 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"d11cee1edeb3d8a182d0acdc690e3bc259fd87878273b36e3f188837c8b0c7c9"} Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.216471 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.242321 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.267894 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.271507 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 
15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.271538 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.271548 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.271562 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.271573 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:43Z","lastTransitionTime":"2026-03-20T15:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.281138 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8
e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.294715 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.304793 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.316588 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.337021 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.350381 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.360858 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7de
e57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.372301 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.373943 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.373973 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.373984 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.373997 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.374007 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:43Z","lastTransitionTime":"2026-03-20T15:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.383488 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c643f492e62e7afcf5f6c5e0617ac37b341cd15574c56c3a930d6ad0fcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae75ac6bc68bfaafe8430d20a29cb0b21e0a62ad4fe4784ad593c0d7bd91352\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.394600 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.404484 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.417394 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 
15:24:43.428010 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.439762 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c643f492e62e7afcf5f6c5e0617ac37b341cd15574c56c3a930d6ad0fcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae75ac6bc68bfaafe8430d20a29cb0b21e0a62ad4fe4784ad593c0d7bd91352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.450565 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.460857 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.474306 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 
15:24:43.475966 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.475998 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.476007 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.476021 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.476030 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:43Z","lastTransitionTime":"2026-03-20T15:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.485831 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11cee1edeb3d8a182d0acdc690e3bc259fd87878273b36e3f188837c8b0c7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.499450 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.512044 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.525519 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.539016 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[
{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.550566 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.561981 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc 
kubenswrapper[4779]: I0320 15:24:43.578068 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.578143 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.578153 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.578167 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.578176 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:43Z","lastTransitionTime":"2026-03-20T15:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.579847 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.591291 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7dee57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.602577 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.680510 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.680555 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.680566 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.680582 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.680594 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:43Z","lastTransitionTime":"2026-03-20T15:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.783056 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.783098 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.783134 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.783157 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.783171 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:43Z","lastTransitionTime":"2026-03-20T15:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.821364 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.834572 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11cee1edeb3d8a182d0acdc690e3bc259fd87878273b36e3f188837c8b0c7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.845937 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.862014 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\
\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.871543 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.887355 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.889337 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.889390 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.889411 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.889428 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.889437 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:43Z","lastTransitionTime":"2026-03-20T15:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.907254 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.920216 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.931044 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.942629 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7de
e57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.956065 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c643f492e62e7afcf5f6c5e0617ac37b341cd15574c56c3a930d6ad0fcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae75ac6bc68bfaafe8430d20a29cb0b21e0a62ad4fe4784ad593c0d7bd91352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.968319 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.979390 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.991485 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.991512 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.991520 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.991533 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.991541 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:43Z","lastTransitionTime":"2026-03-20T15:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:43 crc kubenswrapper[4779]: I0320 15:24:43.995576 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 
15:24:44.007853 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:44Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.093955 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.093989 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.093999 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.094012 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.094023 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:44Z","lastTransitionTime":"2026-03-20T15:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.196478 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.196517 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.196525 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.196540 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.196550 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:44Z","lastTransitionTime":"2026-03-20T15:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.202229 4779 generic.go:334] "Generic (PLEG): container finished" podID="e7a20ef3-86de-4db2-b500-63af002500b4" containerID="edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d" exitCode=0 Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.202325 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" event={"ID":"e7a20ef3-86de-4db2-b500-63af002500b4","Type":"ContainerDied","Data":"edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d"} Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.206797 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" event={"ID":"f27b5011-2d73-40e1-b508-a10e9c6f19a8","Type":"ContainerStarted","Data":"65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02"} Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.227468 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:44Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.242971 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[
{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:44Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.254317 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:44Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.256101 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.257214 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.257248 4779 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.257269 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.257281 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:44Z","lastTransitionTime":"2026-03-20T15:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.264977 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:44Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:44 crc kubenswrapper[4779]: E0320 15:24:44.272647 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\
\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"
sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":48599861
6},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\
\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:44Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.276272 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.276309 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.276318 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.276331 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.276341 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:44Z","lastTransitionTime":"2026-03-20T15:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.284740 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:44Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:44 crc kubenswrapper[4779]: E0320 15:24:44.289584 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:44Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.293197 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.293238 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.293250 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.293270 4779 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.293282 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:44Z","lastTransitionTime":"2026-03-20T15:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.299891 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:44Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:44 crc kubenswrapper[4779]: E0320 15:24:44.305222 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:44Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.308409 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.308433 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.308442 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.308454 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.308462 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:44Z","lastTransitionTime":"2026-03-20T15:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.311516 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7dee57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:44Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:44 crc kubenswrapper[4779]: E0320 15:24:44.320491 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:44Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.323923 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.323949 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.323958 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.323971 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.323980 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:44Z","lastTransitionTime":"2026-03-20T15:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.327388 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:44Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:44 crc kubenswrapper[4779]: E0320 15:24:44.334661 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:44Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:44 crc kubenswrapper[4779]: E0320 15:24:44.334770 4779 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.336873 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.336905 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.336913 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.336926 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:44 crc 
kubenswrapper[4779]: I0320 15:24:44.336935 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:44Z","lastTransitionTime":"2026-03-20T15:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.338010 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c643f492e62e7afcf5f6c5e0617ac37b341cd15574c56c3a930d6ad0fcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae75ac6bc68bfaafe8430d20a29cb0b21e0a62ad4fe4784ad593c0d7bd91352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:44Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.386362 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:44Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.399334 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:44Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.413434 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:44Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.424550 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:44Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.435129 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:44Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.439466 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.439514 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.439533 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.439550 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.439561 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:44Z","lastTransitionTime":"2026-03-20T15:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.446829 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11cee1edeb3d8a182d0acdc690e3bc259fd87878273b36e3f188837c8b0c7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:44Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.541982 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.542017 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.542027 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.542043 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.542056 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:44Z","lastTransitionTime":"2026-03-20T15:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.652066 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.652707 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.652719 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.652732 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.652742 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:44Z","lastTransitionTime":"2026-03-20T15:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.754918 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.754959 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.754969 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.754983 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.754993 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:44Z","lastTransitionTime":"2026-03-20T15:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.807874 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.807968 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:24:44 crc kubenswrapper[4779]: E0320 15:24:44.808053 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.808153 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:24:44 crc kubenswrapper[4779]: E0320 15:24:44.808169 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.808209 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:24:44 crc kubenswrapper[4779]: E0320 15:24:44.808456 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:24:44 crc kubenswrapper[4779]: E0320 15:24:44.808596 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.858718 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.858777 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.858794 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.858817 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.858834 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:44Z","lastTransitionTime":"2026-03-20T15:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.960936 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.960965 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.960973 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.960985 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:44 crc kubenswrapper[4779]: I0320 15:24:44.960993 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:44Z","lastTransitionTime":"2026-03-20T15:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.063525 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.063566 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.063578 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.063596 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.063607 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:45Z","lastTransitionTime":"2026-03-20T15:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.166248 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.166327 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.166350 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.166380 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.166405 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:45Z","lastTransitionTime":"2026-03-20T15:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.213243 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" event={"ID":"e7a20ef3-86de-4db2-b500-63af002500b4","Type":"ContainerStarted","Data":"d0dcadc034ca787aadd284cdf2ae9b3f25e0a5cb5ee13f04e81086fba40c5c55"} Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.232706 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c643f492e62e7afcf5f6c5e0617ac37b341cd15574c56c3a930d6ad0fcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\
\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae75ac6bc68bfaafe8430d20a29cb0b21e0a62ad4fe4784ad593c0d7bd91352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.245654 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.261457 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.270099 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.270145 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.270154 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.270169 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.270178 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:45Z","lastTransitionTime":"2026-03-20T15:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.279276 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnn
k6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.292959 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.305328 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.320980 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11cee1edeb3d8a182d0acdc690e3bc259fd87878273b36e3f188837c8b0c7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T15:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.333657 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.349710 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\
\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.361975 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.371677 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.371712 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.371721 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.371736 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.371748 4779 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:45Z","lastTransitionTime":"2026-03-20T15:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.375034 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:45 crc 
kubenswrapper[4779]: I0320 15:24:45.395500 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.410903 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.423908 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.435721 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7de
e57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.474370 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.474418 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.474431 4779 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.474445 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.474486 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:45Z","lastTransitionTime":"2026-03-20T15:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.576671 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.576706 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.576717 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.576734 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.576769 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:45Z","lastTransitionTime":"2026-03-20T15:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.678696 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.678733 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.678743 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.678757 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.678765 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:45Z","lastTransitionTime":"2026-03-20T15:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.781174 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.781208 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.781218 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.781236 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.781247 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:45Z","lastTransitionTime":"2026-03-20T15:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.883256 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.883306 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.883315 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.883329 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.883340 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:45Z","lastTransitionTime":"2026-03-20T15:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.986620 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.986650 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.986658 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.986671 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:45 crc kubenswrapper[4779]: I0320 15:24:45.986680 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:45Z","lastTransitionTime":"2026-03-20T15:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.088980 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.089018 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.089027 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.089040 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.089048 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:46Z","lastTransitionTime":"2026-03-20T15:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.191479 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.191513 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.191522 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.191536 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.191545 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:46Z","lastTransitionTime":"2026-03-20T15:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.220503 4779 generic.go:334] "Generic (PLEG): container finished" podID="e7a20ef3-86de-4db2-b500-63af002500b4" containerID="d0dcadc034ca787aadd284cdf2ae9b3f25e0a5cb5ee13f04e81086fba40c5c55" exitCode=0 Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.220583 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" event={"ID":"e7a20ef3-86de-4db2-b500-63af002500b4","Type":"ContainerDied","Data":"d0dcadc034ca787aadd284cdf2ae9b3f25e0a5cb5ee13f04e81086fba40c5c55"} Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.228736 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" event={"ID":"f27b5011-2d73-40e1-b508-a10e9c6f19a8","Type":"ContainerStarted","Data":"fa57d2293176d42be5d5a5a870389bb5f89f4b09df2f0bffa259d270542bc124"} Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.229023 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.229054 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.229066 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.233884 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.250468 4779 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.257450 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.257516 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.267143 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11cee1edeb3d8a182d0acdc690e3bc259fd87878273b36e3f188837c8b0c7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T15:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.279210 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.293457 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[
{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.297894 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.297931 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.297943 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.297958 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.297967 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:46Z","lastTransitionTime":"2026-03-20T15:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.302883 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.316195 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.334314 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.348838 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.359915 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7de
e57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.373303 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.391082 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c643f492e62e7afcf5f6c5e0617ac37b341cd15574c56c3a930d6ad0fcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://8ae75ac6bc68bfaafe8430d20a29cb0b21e0a62ad4fe4784ad593c0d7bd91352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.400277 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.400312 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.400324 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.400340 4779 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.400352 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:46Z","lastTransitionTime":"2026-03-20T15:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.404540 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.417345 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.435290 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.455877 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa57d2293176d42be5d5a5a870389bb5f89f4b09df2f0bffa259d270542bc124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.479221 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.502645 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.502681 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.502692 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.502706 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.502716 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:46Z","lastTransitionTime":"2026-03-20T15:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.512858 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:46Z 
is after 2025-08-24T17:21:41Z" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.527456 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.538196 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.551374 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.564146 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7de
e57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.577177 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.589165 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.602302 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c643f492e62e7afcf5f6c5e0617ac37b341cd15574c56c3a930d6ad0fcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://8ae75ac6bc68bfaafe8430d20a29cb0b21e0a62ad4fe4784ad593c0d7bd91352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.604814 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.604851 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.604860 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.604875 4779 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.604884 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:46Z","lastTransitionTime":"2026-03-20T15:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.614484 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.623560 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.635193 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.647956 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11cee1edeb3d8a182d0acdc690e3bc259fd87878273b36e3f188837c8b0c7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T15:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.657837 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.707760 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 
15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.707801 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.707812 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.707829 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.707841 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:46Z","lastTransitionTime":"2026-03-20T15:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.808521 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.808565 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.808531 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:24:46 crc kubenswrapper[4779]: E0320 15:24:46.808777 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.808851 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:24:46 crc kubenswrapper[4779]: E0320 15:24:46.809258 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:24:46 crc kubenswrapper[4779]: E0320 15:24:46.809623 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:24:46 crc kubenswrapper[4779]: E0320 15:24:46.809713 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.810626 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.810677 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.810689 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.810709 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.810723 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:46Z","lastTransitionTime":"2026-03-20T15:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.912784 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.912815 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.912823 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.912837 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:46 crc kubenswrapper[4779]: I0320 15:24:46.912847 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:46Z","lastTransitionTime":"2026-03-20T15:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.015647 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.015681 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.015688 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.015701 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.015713 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:47Z","lastTransitionTime":"2026-03-20T15:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.118604 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.118660 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.118675 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.118701 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.118718 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:47Z","lastTransitionTime":"2026-03-20T15:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.222200 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.222286 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.222311 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.222347 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.222368 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:47Z","lastTransitionTime":"2026-03-20T15:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.235601 4779 generic.go:334] "Generic (PLEG): container finished" podID="e7a20ef3-86de-4db2-b500-63af002500b4" containerID="97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f" exitCode=0 Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.235682 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" event={"ID":"e7a20ef3-86de-4db2-b500-63af002500b4","Type":"ContainerDied","Data":"97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f"} Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.260587 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:47Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.281562 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c643f492e62e7afcf5f6c5e0617ac37b341cd15574c56c3a930d6ad0fcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae75ac6bc68bfaafe8430d20a29cb0b21e0a62ad4fe4784ad593c0d7bd91352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:47Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.297970 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:47Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.315372 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:47Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.325369 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.325418 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.325436 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.325459 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.325478 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:47Z","lastTransitionTime":"2026-03-20T15:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.337049 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:47Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.352069 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:47Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.367333 4779 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:47Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.378751 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11cee1edeb3d8a182d0acdc690e3bc259fd87878273b36e3f188837c8b0c7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T15:24:47Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.394828 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:47Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.416182 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[
{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:47Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.428631 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:47Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.428890 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.428919 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.428935 4779 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.428959 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.428976 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:47Z","lastTransitionTime":"2026-03-20T15:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.438869 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:47Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.460066 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa57d2293176d42be5d5a5a870389bb5f89f4b09df2f0bffa259d270542bc124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:47Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.477701 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:47Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.489812 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7de
e57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:47Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.530824 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.530866 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.530878 4779 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.530893 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.530905 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:47Z","lastTransitionTime":"2026-03-20T15:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.633402 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.633437 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.633449 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.633465 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.633476 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:47Z","lastTransitionTime":"2026-03-20T15:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.735455 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.735496 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.735506 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.735521 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.735531 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:47Z","lastTransitionTime":"2026-03-20T15:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.837334 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.837369 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.837378 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.837392 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.837402 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:47Z","lastTransitionTime":"2026-03-20T15:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.939349 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.939387 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.939395 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.939409 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:47 crc kubenswrapper[4779]: I0320 15:24:47.939417 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:47Z","lastTransitionTime":"2026-03-20T15:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.041919 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.041962 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.041973 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.041988 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.041999 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:48Z","lastTransitionTime":"2026-03-20T15:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.144047 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.144164 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.144175 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.144189 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.144200 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:48Z","lastTransitionTime":"2026-03-20T15:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.242599 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" event={"ID":"e7a20ef3-86de-4db2-b500-63af002500b4","Type":"ContainerStarted","Data":"e42abb09bc49e78a03681f36955523d117d8bcd3428abfc87bf4e858018c8dd8"} Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.246270 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.246309 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.246329 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.246344 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.246354 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:48Z","lastTransitionTime":"2026-03-20T15:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.255462 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:48Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.266048 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11cee1edeb3d8a182d0acdc690e3bc259fd87878273b36e3f188837c8b0c7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T15:24:48Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.277178 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:48Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.287496 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kub
e-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha2
56:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0320 15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:48Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.298463 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:48Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.307713 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:48Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.316985 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:48Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.339075 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa57d2293176d42be5d5a5a870389bb5f89f4b09df2f0bffa259d270542bc124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:48Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.348242 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.348279 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.348290 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.348306 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.348317 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:48Z","lastTransitionTime":"2026-03-20T15:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.353806 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:48Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.364373 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7de
e57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:48Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.375371 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:48Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.388510 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c643f492e62e7afcf5f6c5e0617ac37b341cd15574c56c3a930d6ad0fcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://8ae75ac6bc68bfaafe8430d20a29cb0b21e0a62ad4fe4784ad593c0d7bd91352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:48Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.406024 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:48Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.424840 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:48Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.444406 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e42abb09bc49e78a03681f36955523d117d8bcd3428abfc87bf4e858018c8dd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f
25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-20T15:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:48Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.451252 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.451290 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.451302 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.451318 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.451329 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:48Z","lastTransitionTime":"2026-03-20T15:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.553831 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.553873 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.553887 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.553904 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.553915 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:48Z","lastTransitionTime":"2026-03-20T15:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.656505 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.656534 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.656541 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.656555 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.656564 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:48Z","lastTransitionTime":"2026-03-20T15:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.759030 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.759082 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.759098 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.759166 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.759183 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:48Z","lastTransitionTime":"2026-03-20T15:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.808102 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.808206 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.808212 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.808311 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:24:48 crc kubenswrapper[4779]: E0320 15:24:48.808497 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:24:48 crc kubenswrapper[4779]: E0320 15:24:48.808610 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:24:48 crc kubenswrapper[4779]: E0320 15:24:48.808682 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:24:48 crc kubenswrapper[4779]: E0320 15:24:48.808746 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.861876 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.861919 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.861929 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.861944 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.861955 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:48Z","lastTransitionTime":"2026-03-20T15:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.964573 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.964616 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.964625 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.964640 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:48 crc kubenswrapper[4779]: I0320 15:24:48.964650 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:48Z","lastTransitionTime":"2026-03-20T15:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.067710 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.067751 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.067776 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.067793 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.067805 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:49Z","lastTransitionTime":"2026-03-20T15:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.169947 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.170010 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.170023 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.170046 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.170062 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:49Z","lastTransitionTime":"2026-03-20T15:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.255900 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2w2_f27b5011-2d73-40e1-b508-a10e9c6f19a8/ovnkube-controller/0.log" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.260855 4779 generic.go:334] "Generic (PLEG): container finished" podID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerID="fa57d2293176d42be5d5a5a870389bb5f89f4b09df2f0bffa259d270542bc124" exitCode=1 Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.260979 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" event={"ID":"f27b5011-2d73-40e1-b508-a10e9c6f19a8","Type":"ContainerDied","Data":"fa57d2293176d42be5d5a5a870389bb5f89f4b09df2f0bffa259d270542bc124"} Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.262661 4779 scope.go:117] "RemoveContainer" containerID="fa57d2293176d42be5d5a5a870389bb5f89f4b09df2f0bffa259d270542bc124" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.271944 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.271982 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.271990 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.272004 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.272015 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:49Z","lastTransitionTime":"2026-03-20T15:24:49Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.279447 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:49Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.293679 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7de
e57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:49Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.304750 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:49Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.317842 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e42abb09bc49e78a03681f36955523d117d8bcd3428abfc87bf4e858018c8dd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f
25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-20T15:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:49Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.331753 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:49Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.342916 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c643f492e62e7afcf5f6c5e0617ac37b341cd15574c56c3a930d6ad0fcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://8ae75ac6bc68bfaafe8430d20a29cb0b21e0a62ad4fe4784ad593c0d7bd91352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:49Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.356009 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:49Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.367178 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:49Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.374500 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.374533 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.374541 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.374553 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.374562 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:49Z","lastTransitionTime":"2026-03-20T15:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.378293 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11cee1edeb3d8a182d0acdc690e3bc259fd87878273b36e3f188837c8b0c7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T15:24:49Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.389489 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:49Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.398154 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:49Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:49 crc 
kubenswrapper[4779]: I0320 15:24:49.414775 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa57d2293176d42be5d5a5a870389bb5f89f4b09df2f0bffa259d270542bc124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa57d2293176d42be5d5a5a870389bb5f89f4b09df2f0bffa259d270542bc124\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:24:49Z\\\",\\\"message\\\":\\\"rvice (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 15:24:49.077644 6441 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0320 15:24:49.077885 6441 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 15:24:49.077920 6441 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 15:24:49.077950 6441 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 15:24:49.078186 6441 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 15:24:49.078775 6441 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 15:24:49.078800 6441 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 15:24:49.078855 6441 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 15:24:49.078879 6441 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 15:24:49.078930 6441 factory.go:656] Stopping watch factory\\\\nI0320 15:24:49.078965 6441 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:49Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.426843 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"contain
erID\\\":\\\"cri-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:49Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.437878 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[
{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:49Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.448376 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:49Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.476653 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.476688 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.476698 4779 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.476711 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.476719 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:49Z","lastTransitionTime":"2026-03-20T15:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.578988 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.579024 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.579032 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.579046 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.579055 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:49Z","lastTransitionTime":"2026-03-20T15:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.681415 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.681460 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.681472 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.681489 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.681501 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:49Z","lastTransitionTime":"2026-03-20T15:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.783819 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.783864 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.783875 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.783894 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.783906 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:49Z","lastTransitionTime":"2026-03-20T15:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.886283 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.886312 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.886322 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.886334 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:49 crc kubenswrapper[4779]: I0320 15:24:49.886342 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:49Z","lastTransitionTime":"2026-03-20T15:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.013383 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.013452 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.013462 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.013476 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.013484 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:50Z","lastTransitionTime":"2026-03-20T15:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.115503 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.115537 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.115548 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.115562 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.115572 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:50Z","lastTransitionTime":"2026-03-20T15:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.218366 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.218411 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.218423 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.218440 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.218452 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:50Z","lastTransitionTime":"2026-03-20T15:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.265367 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2w2_f27b5011-2d73-40e1-b508-a10e9c6f19a8/ovnkube-controller/1.log" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.266350 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2w2_f27b5011-2d73-40e1-b508-a10e9c6f19a8/ovnkube-controller/0.log" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.269711 4779 generic.go:334] "Generic (PLEG): container finished" podID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerID="7f5873d0407b18cdb2c0b462ebadd12ef446c2a9024651a80f5c27176152bc6b" exitCode=1 Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.269760 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" event={"ID":"f27b5011-2d73-40e1-b508-a10e9c6f19a8","Type":"ContainerDied","Data":"7f5873d0407b18cdb2c0b462ebadd12ef446c2a9024651a80f5c27176152bc6b"} Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.269804 4779 scope.go:117] "RemoveContainer" containerID="fa57d2293176d42be5d5a5a870389bb5f89f4b09df2f0bffa259d270542bc124" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.270537 4779 scope.go:117] "RemoveContainer" containerID="7f5873d0407b18cdb2c0b462ebadd12ef446c2a9024651a80f5c27176152bc6b" Mar 20 15:24:50 crc kubenswrapper[4779]: E0320 15:24:50.270672 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-bl2w2_openshift-ovn-kubernetes(f27b5011-2d73-40e1-b508-a10e9c6f19a8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.283675 4779 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:50Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.293844 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7de
e57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:50Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.304423 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c643f492e62e7afcf5f6c5e0617ac37b341cd15574c56c3a930d6ad0fcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae75ac6bc68bfaafe8430d20a29cb0b21e0a62ad4fe4784ad593c0d7bd91352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:50Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.320893 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.320950 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.320965 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.320988 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.321001 4779 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:50Z","lastTransitionTime":"2026-03-20T15:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.321476 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:50Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.331379 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:50Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.346684 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e42abb09bc49e78a03681f36955523d117d8bcd3428abfc87bf4e858018c8dd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f
25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-20T15:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:50Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.360261 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:50Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.371998 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:50Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.385680 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11cee1edeb3d8a182d0acdc690e3bc259fd87878273b36e3f188837c8b0c7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T15:24:50Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.398718 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:50Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.410759 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\
\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:50Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.419167 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:50Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.422995 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.423031 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.423042 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.423060 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.423072 4779 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:50Z","lastTransitionTime":"2026-03-20T15:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.429319 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:50Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:50 crc 
kubenswrapper[4779]: I0320 15:24:50.448430 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f5873d0407b18cdb2c0b462ebadd12ef446c2a9024651a80f5c27176152bc6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa57d2293176d42be5d5a5a870389bb5f89f4b09df2f0bffa259d270542bc124\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:24:49Z\\\",\\\"message\\\":\\\"rvice (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 15:24:49.077644 6441 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 15:24:49.077885 6441 reflector.go:311] Stopping reflector *v1.EgressQoS 
(0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 15:24:49.077920 6441 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 15:24:49.077950 6441 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 15:24:49.078186 6441 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 15:24:49.078775 6441 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 15:24:49.078800 6441 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 15:24:49.078855 6441 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 15:24:49.078879 6441 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 15:24:49.078930 6441 factory.go:656] Stopping watch factory\\\\nI0320 15:24:49.078965 6441 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f5873d0407b18cdb2c0b462ebadd12ef446c2a9024651a80f5c27176152bc6b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:24:50Z\\\",\\\"message\\\":\\\"Service openshift-config-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-config-operator caae15e3-df99-43f2-b37b-d68e01dce8bd 4093 0 2025-02-23 05:12:35 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:openshift-config-operator] map[include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true 
service.alpha.openshift.io/serving-cert-secret-name:config-operator-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0002c976f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: openshift-config-operator,},ClusterIP:10.217.4.161,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.161],IPFamilies:[IPv4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\
\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.i
o/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:50Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.460599 4779 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\"
,\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf438
5a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:50Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.525719 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.525763 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.525771 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.525784 4779 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.525793 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:50Z","lastTransitionTime":"2026-03-20T15:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.628919 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.628980 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.628997 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.629021 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.629034 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:50Z","lastTransitionTime":"2026-03-20T15:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.732604 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.732649 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.732662 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.732682 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.732697 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:50Z","lastTransitionTime":"2026-03-20T15:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.808863 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.808937 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.808898 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.809137 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:24:50 crc kubenswrapper[4779]: E0320 15:24:50.809179 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:24:50 crc kubenswrapper[4779]: E0320 15:24:50.809227 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:24:50 crc kubenswrapper[4779]: E0320 15:24:50.809346 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:24:50 crc kubenswrapper[4779]: E0320 15:24:50.809431 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.836144 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.836201 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.836217 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.836243 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.836261 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:50Z","lastTransitionTime":"2026-03-20T15:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.938891 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.938959 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.938973 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.938995 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:50 crc kubenswrapper[4779]: I0320 15:24:50.939009 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:50Z","lastTransitionTime":"2026-03-20T15:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.041781 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.041843 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.041857 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.041877 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.041889 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:51Z","lastTransitionTime":"2026-03-20T15:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.145184 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.145289 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.145320 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.145355 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.145378 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:51Z","lastTransitionTime":"2026-03-20T15:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.248621 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.248721 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.248752 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.248787 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.248813 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:51Z","lastTransitionTime":"2026-03-20T15:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.275218 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2w2_f27b5011-2d73-40e1-b508-a10e9c6f19a8/ovnkube-controller/1.log" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.351277 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.351373 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.351396 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.351428 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.351449 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:51Z","lastTransitionTime":"2026-03-20T15:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.453705 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.453788 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.453810 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.453835 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.453854 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:51Z","lastTransitionTime":"2026-03-20T15:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.556037 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.556319 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.556359 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.556389 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.556408 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:51Z","lastTransitionTime":"2026-03-20T15:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.658579 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.658616 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.658624 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.658640 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.658651 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:51Z","lastTransitionTime":"2026-03-20T15:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.760241 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.760291 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.760303 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.760319 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.760330 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:51Z","lastTransitionTime":"2026-03-20T15:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.862575 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.862629 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.862644 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.862665 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.862683 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:51Z","lastTransitionTime":"2026-03-20T15:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.965335 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.965372 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.965379 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.965393 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:51 crc kubenswrapper[4779]: I0320 15:24:51.965402 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:51Z","lastTransitionTime":"2026-03-20T15:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.068403 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.068439 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.068449 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.068477 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.068491 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:52Z","lastTransitionTime":"2026-03-20T15:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.171433 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.171486 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.171513 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.171533 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.171545 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:52Z","lastTransitionTime":"2026-03-20T15:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.273556 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.273593 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.273602 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.273614 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.273622 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:52Z","lastTransitionTime":"2026-03-20T15:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.282940 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" event={"ID":"451fc579-db57-4b36-a775-6d2986de3efc","Type":"ContainerStarted","Data":"f47f7b00f7a63a5a5c8e3f202de52bb407000c1857930004b53c6402e24ed0cf"} Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.283049 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" event={"ID":"451fc579-db57-4b36-a775-6d2986de3efc","Type":"ContainerStarted","Data":"498dce321546a5d876986ec92f47d53655ef83f600c0498878bbe473c526f01f"} Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.289922 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-l9j6r" event={"ID":"46120a91-b00c-4299-b552-a374d2a78726","Type":"ContainerStarted","Data":"ec6b43f8768a56cbe8007e92d991e7037f0084f32c4ad8f19b59bf682e7ac3d6"} Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.301367 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.312314 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c643f492e62e7afcf5f6c5e0617ac37b341cd15574c56c3a930d6ad0fcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://8ae75ac6bc68bfaafe8430d20a29cb0b21e0a62ad4fe4784ad593c0d7bd91352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.325554 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.335429 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.348412 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e42abb09bc49e78a03681f36955523d117d8bcd3428abfc87bf4e858018c8dd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f
25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-20T15:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.358857 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47f7b00f7a63a5a5c8e3f202de52bb407000c1857930004b53c6402e24ed0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498dce321546a5d876986ec92f47d53655ef83f6
00c0498878bbe473c526f01f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.371133 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.375357 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.375390 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.375400 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.375416 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.375425 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:52Z","lastTransitionTime":"2026-03-20T15:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.383884 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11cee1edeb3d8a182d0acdc690e3bc259fd87878273b36e3f188837c8b0c7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.401007 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.416272 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[
{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.424472 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.433208 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:52 crc 
kubenswrapper[4779]: I0320 15:24:52.449628 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f5873d0407b18cdb2c0b462ebadd12ef446c2a9024651a80f5c27176152bc6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa57d2293176d42be5d5a5a870389bb5f89f4b09df2f0bffa259d270542bc124\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:24:49Z\\\",\\\"message\\\":\\\"rvice (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 15:24:49.077644 6441 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 15:24:49.077885 6441 reflector.go:311] Stopping reflector *v1.EgressQoS 
(0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 15:24:49.077920 6441 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 15:24:49.077950 6441 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 15:24:49.078186 6441 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 15:24:49.078775 6441 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 15:24:49.078800 6441 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 15:24:49.078855 6441 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 15:24:49.078879 6441 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 15:24:49.078930 6441 factory.go:656] Stopping watch factory\\\\nI0320 15:24:49.078965 6441 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f5873d0407b18cdb2c0b462ebadd12ef446c2a9024651a80f5c27176152bc6b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:24:50Z\\\",\\\"message\\\":\\\"Service openshift-config-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-config-operator caae15e3-df99-43f2-b37b-d68e01dce8bd 4093 0 2025-02-23 05:12:35 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:openshift-config-operator] map[include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true 
service.alpha.openshift.io/serving-cert-secret-name:config-operator-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0002c976f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: openshift-config-operator,},ClusterIP:10.217.4.161,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.161],IPFamilies:[IPv4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\
\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.i
o/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.461295 4779 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.472301 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7de
e57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.477257 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.477316 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.477349 4779 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.477375 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.477393 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:52Z","lastTransitionTime":"2026-03-20T15:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.483616 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.492890 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec6b43f8768a56cbe8007e92d991e7037f0084f32c4ad8f19b59bf682e7ac3d6\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.501925 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:52 crc 
kubenswrapper[4779]: I0320 15:24:52.517840 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f5873d0407b18cdb2c0b462ebadd12ef446c2a9024651a80f5c27176152bc6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa57d2293176d42be5d5a5a870389bb5f89f4b09df2f0bffa259d270542bc124\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:24:49Z\\\",\\\"message\\\":\\\"rvice (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 15:24:49.077644 6441 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 15:24:49.077885 6441 reflector.go:311] Stopping reflector *v1.EgressQoS 
(0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 15:24:49.077920 6441 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 15:24:49.077950 6441 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 15:24:49.078186 6441 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 15:24:49.078775 6441 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 15:24:49.078800 6441 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 15:24:49.078855 6441 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 15:24:49.078879 6441 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 15:24:49.078930 6441 factory.go:656] Stopping watch factory\\\\nI0320 15:24:49.078965 6441 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f5873d0407b18cdb2c0b462ebadd12ef446c2a9024651a80f5c27176152bc6b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:24:50Z\\\",\\\"message\\\":\\\"Service openshift-config-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-config-operator caae15e3-df99-43f2-b37b-d68e01dce8bd 4093 0 2025-02-23 05:12:35 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:openshift-config-operator] map[include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true 
service.alpha.openshift.io/serving-cert-secret-name:config-operator-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0002c976f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: openshift-config-operator,},ClusterIP:10.217.4.161,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.161],IPFamilies:[IPv4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\
\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.i
o/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.529232 4779 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\"
,\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf438
5a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.540253 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.549984 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7de
e57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.560523 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c643f492e62e7afcf5f6c5e0617ac37b341cd15574c56c3a930d6ad0fcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae75ac6bc68bfaafe8430d20a29cb0b21e0a62ad4fe4784ad593c0d7bd91352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.570363 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.578628 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.579515 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.579546 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.579555 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.579571 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.579579 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:52Z","lastTransitionTime":"2026-03-20T15:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.591304 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e42abb09bc49e78a03681f36955523d117d8bcd3428abfc87bf4e858018c8dd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.603494 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-
20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.616691 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.627288 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11cee1edeb3d8a182d0acdc690e3bc259fd87878273b36e3f188837c8b0c7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T15:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.636957 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47f7b00f7a63a5a5c8e3f202de52bb407000c1857930004b53c6402e24ed0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498dce321546a5d876986ec92f47d53655ef83f600c0498878bbe473c526f01f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.682389 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 
15:24:52.682453 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.682470 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.682493 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.682510 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:52Z","lastTransitionTime":"2026-03-20T15:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.784593 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.784632 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.784641 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.784656 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.784667 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:52Z","lastTransitionTime":"2026-03-20T15:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.808551 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.808670 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:24:52 crc kubenswrapper[4779]: E0320 15:24:52.808825 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.808866 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.808898 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:24:52 crc kubenswrapper[4779]: E0320 15:24:52.808975 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:24:52 crc kubenswrapper[4779]: E0320 15:24:52.809071 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:24:52 crc kubenswrapper[4779]: E0320 15:24:52.809338 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.809518 4779 scope.go:117] "RemoveContainer" containerID="8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.887128 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.887161 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.887169 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.887182 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.887190 4779 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:52Z","lastTransitionTime":"2026-03-20T15:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.989928 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.989973 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.989986 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.990002 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:52 crc kubenswrapper[4779]: I0320 15:24:52.990011 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:52Z","lastTransitionTime":"2026-03-20T15:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.092423 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.092471 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.092483 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.092500 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.092511 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:53Z","lastTransitionTime":"2026-03-20T15:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.195185 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.195244 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.195253 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.195288 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.195299 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:53Z","lastTransitionTime":"2026-03-20T15:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.293289 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.295232 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5a91684c6a1a65d270c4c57f78c24f285f72f56b52857058a1726329853f3eb5"} Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.295650 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.297159 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.297198 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.297210 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.297224 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.297233 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:53Z","lastTransitionTime":"2026-03-20T15:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.313552 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.328281 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c643f492e62e7afcf5f6c5e0617ac37b341cd15574c56c3a930d6ad0fcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae75ac6bc68bfaafe8430d20a29cb0b21e0a62ad4fe4784ad593c0d7bd91352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.339532 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.348548 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.361379 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e42abb09bc49e78a03681f36955523d117d8bcd3428abfc87bf4e858018c8dd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f
25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-20T15:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.372125 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47f7b00f7a63a5a5c8e3f202de52bb407000c1857930004b53c6402e24ed0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498dce321546a5d876986ec92f47d53655ef83f6
00c0498878bbe473c526f01f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.383239 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.394225 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11cee1edeb3d8a182d0acdc690e3bc259fd87878273b36e3f188837c8b0c7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T15:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.399668 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.399712 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.399721 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.399734 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.399746 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:53Z","lastTransitionTime":"2026-03-20T15:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.410328 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a91684c6a1a65d270c4c57f78c24f285f72f56b52857058a1726329853f3eb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.421833 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.431011 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec6b43f8768a56cbe8007e92d991e7037f0084f32c4ad8f19b59bf682e7ac3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.446856 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:53 crc 
kubenswrapper[4779]: I0320 15:24:53.463574 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f5873d0407b18cdb2c0b462ebadd12ef446c2a9024651a80f5c27176152bc6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa57d2293176d42be5d5a5a870389bb5f89f4b09df2f0bffa259d270542bc124\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:24:49Z\\\",\\\"message\\\":\\\"rvice (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 15:24:49.077644 6441 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 15:24:49.077885 6441 reflector.go:311] Stopping reflector *v1.EgressQoS 
(0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 15:24:49.077920 6441 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 15:24:49.077950 6441 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 15:24:49.078186 6441 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 15:24:49.078775 6441 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 15:24:49.078800 6441 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 15:24:49.078855 6441 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 15:24:49.078879 6441 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 15:24:49.078930 6441 factory.go:656] Stopping watch factory\\\\nI0320 15:24:49.078965 6441 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f5873d0407b18cdb2c0b462ebadd12ef446c2a9024651a80f5c27176152bc6b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:24:50Z\\\",\\\"message\\\":\\\"Service openshift-config-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-config-operator caae15e3-df99-43f2-b37b-d68e01dce8bd 4093 0 2025-02-23 05:12:35 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:openshift-config-operator] map[include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true 
service.alpha.openshift.io/serving-cert-secret-name:config-operator-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0002c976f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: openshift-config-operator,},ClusterIP:10.217.4.161,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.161],IPFamilies:[IPv4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\
\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.i
o/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.475160 4779 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.484852 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7de
e57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.501641 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.501688 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.501701 4779 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.501720 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.501729 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:53Z","lastTransitionTime":"2026-03-20T15:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.603180 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.603218 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.603228 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.603244 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.603254 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:53Z","lastTransitionTime":"2026-03-20T15:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.705607 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.705648 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.705656 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.705670 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.705678 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:53Z","lastTransitionTime":"2026-03-20T15:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.807845 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.807888 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.807901 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.807916 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.807932 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:53Z","lastTransitionTime":"2026-03-20T15:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.818617 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47f7b00f7a63a5a5c8e3f202de52bb407000c1857930004b53c6402e24ed0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498dce321546a5d876986ec92f47d53655ef83f600c0498878bbe473c526f01f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.830358 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.840803 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11cee1edeb3d8a182d0acdc690e3bc259fd87878273b36e3f188837c8b0c7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T15:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.852434 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a91684c6a1a65d270c4c57f78c24f285f72f56b52857058a1726329853f3eb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.864191 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.874581 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec6b43f8768a56cbe8007e92d991e7037f0084f32c4ad8f19b59bf682e7ac3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.884435 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:53 crc 
kubenswrapper[4779]: I0320 15:24:53.900049 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f5873d0407b18cdb2c0b462ebadd12ef446c2a9024651a80f5c27176152bc6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa57d2293176d42be5d5a5a870389bb5f89f4b09df2f0bffa259d270542bc124\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:24:49Z\\\",\\\"message\\\":\\\"rvice (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 15:24:49.077644 6441 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 15:24:49.077885 6441 reflector.go:311] Stopping reflector *v1.EgressQoS 
(0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 15:24:49.077920 6441 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 15:24:49.077950 6441 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 15:24:49.078186 6441 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 15:24:49.078775 6441 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 15:24:49.078800 6441 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 15:24:49.078855 6441 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 15:24:49.078879 6441 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 15:24:49.078930 6441 factory.go:656] Stopping watch factory\\\\nI0320 15:24:49.078965 6441 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f5873d0407b18cdb2c0b462ebadd12ef446c2a9024651a80f5c27176152bc6b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:24:50Z\\\",\\\"message\\\":\\\"Service openshift-config-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-config-operator caae15e3-df99-43f2-b37b-d68e01dce8bd 4093 0 2025-02-23 05:12:35 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:openshift-config-operator] map[include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true 
service.alpha.openshift.io/serving-cert-secret-name:config-operator-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0002c976f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: openshift-config-operator,},ClusterIP:10.217.4.161,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.161],IPFamilies:[IPv4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\
\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.i
o/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.910227 4779 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.910265 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.910276 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.910293 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.910305 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:53Z","lastTransitionTime":"2026-03-20T15:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.912318 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.922506 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7de
e57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.939761 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.955167 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c643f492e62e7afcf5f6c5e0617ac37b341cd15574c56c3a930d6ad0fcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://8ae75ac6bc68bfaafe8430d20a29cb0b21e0a62ad4fe4784ad593c0d7bd91352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.968176 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.979359 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:53 crc kubenswrapper[4779]: I0320 15:24:53.992354 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e42abb09bc49e78a03681f36955523d117d8bcd3428abfc87bf4e858018c8dd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f
25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-20T15:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.012926 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.012961 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.012971 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.012985 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.012996 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:54Z","lastTransitionTime":"2026-03-20T15:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.115958 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.116000 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.116012 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.116028 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.116043 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:54Z","lastTransitionTime":"2026-03-20T15:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.217964 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.218007 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.218015 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.218062 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.218080 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:54Z","lastTransitionTime":"2026-03-20T15:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.319694 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.319732 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.319742 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.319756 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.319767 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:54Z","lastTransitionTime":"2026-03-20T15:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.421797 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.421836 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.421846 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.421857 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.421866 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:54Z","lastTransitionTime":"2026-03-20T15:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.523623 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.523675 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.523689 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.523709 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.523723 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:54Z","lastTransitionTime":"2026-03-20T15:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.625640 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.625918 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.625930 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.625947 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.625959 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:54Z","lastTransitionTime":"2026-03-20T15:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.640703 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.640744 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.640753 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.640766 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.640776 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:54Z","lastTransitionTime":"2026-03-20T15:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:54 crc kubenswrapper[4779]: E0320 15:24:54.654553 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:54Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.657617 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.657639 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.657649 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.657662 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.657673 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:54Z","lastTransitionTime":"2026-03-20T15:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:54 crc kubenswrapper[4779]: E0320 15:24:54.675286 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:54Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.679039 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.679137 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.679156 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.679174 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.679187 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:54Z","lastTransitionTime":"2026-03-20T15:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:54 crc kubenswrapper[4779]: E0320 15:24:54.690677 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:54Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.694949 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.694991 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.695001 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.695015 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.695026 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:54Z","lastTransitionTime":"2026-03-20T15:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:54 crc kubenswrapper[4779]: E0320 15:24:54.713058 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:54Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.719802 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.719863 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.719879 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.719903 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.719918 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:54Z","lastTransitionTime":"2026-03-20T15:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:54 crc kubenswrapper[4779]: E0320 15:24:54.742890 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:54Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:54 crc kubenswrapper[4779]: E0320 15:24:54.743187 4779 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.744571 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.744617 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.744633 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.744652 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.744667 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:54Z","lastTransitionTime":"2026-03-20T15:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.807899 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.807941 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.807966 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.807911 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:24:54 crc kubenswrapper[4779]: E0320 15:24:54.808008 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:24:54 crc kubenswrapper[4779]: E0320 15:24:54.808072 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:24:54 crc kubenswrapper[4779]: E0320 15:24:54.808140 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:24:54 crc kubenswrapper[4779]: E0320 15:24:54.808186 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.846409 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.846435 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.846442 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.846454 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.846464 4779 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:54Z","lastTransitionTime":"2026-03-20T15:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.948954 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.948983 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.948991 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.949004 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:54 crc kubenswrapper[4779]: I0320 15:24:54.949013 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:54Z","lastTransitionTime":"2026-03-20T15:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.051977 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.052018 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.052029 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.052046 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.052058 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:55Z","lastTransitionTime":"2026-03-20T15:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.153496 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.154985 4779 scope.go:117] "RemoveContainer" containerID="7f5873d0407b18cdb2c0b462ebadd12ef446c2a9024651a80f5c27176152bc6b" Mar 20 15:24:55 crc kubenswrapper[4779]: E0320 15:24:55.155366 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-bl2w2_openshift-ovn-kubernetes(f27b5011-2d73-40e1-b508-a10e9c6f19a8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.156549 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.156617 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.156638 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.156665 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.156687 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:55Z","lastTransitionTime":"2026-03-20T15:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.180088 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a91684c6a1a65d270c4c57f78c24f285f72f56b52857058a1726329853f3eb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.199702 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.218720 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec6b43f8768a56cbe8007e92d991e7037f0084f32c4ad8f19b59bf682e7ac3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.235312 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:55 crc 
kubenswrapper[4779]: I0320 15:24:55.259660 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f5873d0407b18cdb2c0b462ebadd12ef446c2a9024651a80f5c27176152bc6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f5873d0407b18cdb2c0b462ebadd12ef446c2a9024651a80f5c27176152bc6b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:24:50Z\\\",\\\"message\\\":\\\"Service openshift-config-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-config-operator caae15e3-df99-43f2-b37b-d68e01dce8bd 4093 0 2025-02-23 05:12:35 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:openshift-config-operator] map[include.release.openshift.io/hypershift:true 
include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-secret-name:config-operator-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0002c976f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: openshift-config-operator,},ClusterIP:10.217.4.161,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.161],IPFamilies:[IPv4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bl2w2_openshift-ovn-kubernetes(f27b5011-2d73-40e1-b508-a10e9c6f19a8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41f
fe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.260312 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.260347 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.260361 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.260378 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.260390 4779 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:55Z","lastTransitionTime":"2026-03-20T15:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.279975 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.297275 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7de
e57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.319268 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.336256 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c643f492e62e7afcf5f6c5e0617ac37b341cd15574c56c3a930d6ad0fcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://8ae75ac6bc68bfaafe8430d20a29cb0b21e0a62ad4fe4784ad593c0d7bd91352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.349225 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.361934 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.363590 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.363621 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.363630 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.363643 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.363652 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:55Z","lastTransitionTime":"2026-03-20T15:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.382955 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e42abb09bc49e78a03681f36955523d117d8bcd3428abfc87bf4e858018c8dd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.395963 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47f7b00f7a63a5a5c8e3f202de52bb407000c1857930004b53c6402e24ed0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498dce321546a5d876986ec92f47d53655ef83f600c0498878bbe473c526f01f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.409680 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.430580 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11cee1edeb3d8a182d0acdc690e3bc259fd87878273b36e3f188837c8b0c7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T15:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.465915 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.466251 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.466322 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.466466 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.466552 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:55Z","lastTransitionTime":"2026-03-20T15:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.569187 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.569209 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.569216 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.569228 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.569237 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:55Z","lastTransitionTime":"2026-03-20T15:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.671535 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.672021 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.672135 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.672244 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.672322 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:55Z","lastTransitionTime":"2026-03-20T15:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.774594 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.775166 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.775281 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.775394 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.775521 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:55Z","lastTransitionTime":"2026-03-20T15:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.878777 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.878832 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.878843 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.878863 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.878875 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:55Z","lastTransitionTime":"2026-03-20T15:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.981478 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.981703 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.981796 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.981903 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:55 crc kubenswrapper[4779]: I0320 15:24:55.981982 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:55Z","lastTransitionTime":"2026-03-20T15:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.084774 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.084805 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.084812 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.084825 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.084833 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:56Z","lastTransitionTime":"2026-03-20T15:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.187553 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.187626 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.187687 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.187720 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.187747 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:56Z","lastTransitionTime":"2026-03-20T15:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.290222 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.290310 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.290323 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.290337 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.290346 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:56Z","lastTransitionTime":"2026-03-20T15:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.391851 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.391927 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.391936 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.391963 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.391972 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:56Z","lastTransitionTime":"2026-03-20T15:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.495042 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.495100 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.495139 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.495162 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.495179 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:56Z","lastTransitionTime":"2026-03-20T15:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.598130 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.598174 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.598186 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.598204 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.598216 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:56Z","lastTransitionTime":"2026-03-20T15:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.702031 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.702082 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.702094 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.702132 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.702151 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:56Z","lastTransitionTime":"2026-03-20T15:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.719901 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:24:56 crc kubenswrapper[4779]: E0320 15:24:56.720100 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 15:25:28.720066363 +0000 UTC m=+145.682582163 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.720193 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.720253 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.720317 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:24:56 crc kubenswrapper[4779]: E0320 15:24:56.720419 4779 secret.go:188] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 15:24:56 crc kubenswrapper[4779]: E0320 15:24:56.720460 4779 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 15:24:56 crc kubenswrapper[4779]: E0320 15:24:56.720486 4779 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 15:24:56 crc kubenswrapper[4779]: E0320 15:24:56.720510 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 15:25:28.720495894 +0000 UTC m=+145.683011714 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 15:24:56 crc kubenswrapper[4779]: E0320 15:24:56.720512 4779 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 15:24:56 crc kubenswrapper[4779]: E0320 15:24:56.720533 4779 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 15:24:56 crc kubenswrapper[4779]: E0320 15:24:56.720543 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 15:25:28.720519925 +0000 UTC m=+145.683035925 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 15:24:56 crc kubenswrapper[4779]: E0320 15:24:56.720570 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-20 15:25:28.720560126 +0000 UTC m=+145.683076156 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.806545 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.806618 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.806631 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.806651 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.806667 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:56Z","lastTransitionTime":"2026-03-20T15:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.807841 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.807913 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.807944 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:24:56 crc kubenswrapper[4779]: E0320 15:24:56.808220 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.808276 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:24:56 crc kubenswrapper[4779]: E0320 15:24:56.808359 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:24:56 crc kubenswrapper[4779]: E0320 15:24:56.808507 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:24:56 crc kubenswrapper[4779]: E0320 15:24:56.808690 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.821745 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44bdd151-2a1e-4f14-a095-81b541307138-metrics-certs\") pod \"network-metrics-daemon-l4gtx\" (UID: \"44bdd151-2a1e-4f14-a095-81b541307138\") " pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.821814 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:24:56 crc kubenswrapper[4779]: E0320 15:24:56.821972 4779 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 15:24:56 crc kubenswrapper[4779]: E0320 15:24:56.822027 4779 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 15:24:56 crc kubenswrapper[4779]: E0320 15:24:56.822047 
4779 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 15:24:56 crc kubenswrapper[4779]: E0320 15:24:56.822126 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 15:25:28.822087499 +0000 UTC m=+145.784603319 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 15:24:56 crc kubenswrapper[4779]: E0320 15:24:56.822200 4779 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 15:24:56 crc kubenswrapper[4779]: E0320 15:24:56.822230 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44bdd151-2a1e-4f14-a095-81b541307138-metrics-certs podName:44bdd151-2a1e-4f14-a095-81b541307138 nodeName:}" failed. No retries permitted until 2026-03-20 15:25:28.822221272 +0000 UTC m=+145.784737092 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/44bdd151-2a1e-4f14-a095-81b541307138-metrics-certs") pod "network-metrics-daemon-l4gtx" (UID: "44bdd151-2a1e-4f14-a095-81b541307138") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.909783 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.909823 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.909836 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.909851 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:56 crc kubenswrapper[4779]: I0320 15:24:56.909861 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:56Z","lastTransitionTime":"2026-03-20T15:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.012925 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.012974 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.012986 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.013004 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.013017 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:57Z","lastTransitionTime":"2026-03-20T15:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.115907 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.115953 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.115963 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.115979 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.115989 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:57Z","lastTransitionTime":"2026-03-20T15:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.218133 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.218171 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.218179 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.218194 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.218203 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:57Z","lastTransitionTime":"2026-03-20T15:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.320405 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.320475 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.320493 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.320520 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.320539 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:57Z","lastTransitionTime":"2026-03-20T15:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.422831 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.422877 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.422887 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.422901 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.422912 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:57Z","lastTransitionTime":"2026-03-20T15:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.525439 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.525484 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.525500 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.525520 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.525532 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:57Z","lastTransitionTime":"2026-03-20T15:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.629132 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.629192 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.629206 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.629225 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.629242 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:57Z","lastTransitionTime":"2026-03-20T15:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.731905 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.731983 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.732001 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.732024 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.732040 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:57Z","lastTransitionTime":"2026-03-20T15:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.824183 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.835053 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.835099 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.835133 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.835151 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.835165 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:57Z","lastTransitionTime":"2026-03-20T15:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.937959 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.938011 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.938028 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.938052 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:57 crc kubenswrapper[4779]: I0320 15:24:57.938069 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:57Z","lastTransitionTime":"2026-03-20T15:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.040973 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.041031 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.041047 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.041067 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.041083 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:58Z","lastTransitionTime":"2026-03-20T15:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.143328 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.143367 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.143378 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.143395 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.143405 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:58Z","lastTransitionTime":"2026-03-20T15:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.245736 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.245794 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.245811 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.245834 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.245923 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:58Z","lastTransitionTime":"2026-03-20T15:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.348226 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.348264 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.348275 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.348288 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.348298 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:58Z","lastTransitionTime":"2026-03-20T15:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.451229 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.451258 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.451265 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.451279 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.451288 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:58Z","lastTransitionTime":"2026-03-20T15:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.553124 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.553150 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.553158 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.553170 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.553178 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:58Z","lastTransitionTime":"2026-03-20T15:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.655822 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.655873 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.655892 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.655921 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.655943 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:58Z","lastTransitionTime":"2026-03-20T15:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.758010 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.758043 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.758054 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.758070 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.758084 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:58Z","lastTransitionTime":"2026-03-20T15:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.808027 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:24:58 crc kubenswrapper[4779]: E0320 15:24:58.808170 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.808424 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.808510 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:24:58 crc kubenswrapper[4779]: E0320 15:24:58.808629 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.808695 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:24:58 crc kubenswrapper[4779]: E0320 15:24:58.808776 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:24:58 crc kubenswrapper[4779]: E0320 15:24:58.808861 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.860820 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.860865 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.860881 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.860903 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.860919 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:58Z","lastTransitionTime":"2026-03-20T15:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.963535 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.963591 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.963611 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.963638 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:58 crc kubenswrapper[4779]: I0320 15:24:58.963663 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:58Z","lastTransitionTime":"2026-03-20T15:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.066851 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.066883 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.066891 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.066905 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.066913 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:59Z","lastTransitionTime":"2026-03-20T15:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.169715 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.169800 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.169829 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.169862 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.169880 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:59Z","lastTransitionTime":"2026-03-20T15:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.272820 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.272874 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.272887 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.272905 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.272920 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:59Z","lastTransitionTime":"2026-03-20T15:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.375274 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.375324 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.375335 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.375354 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.375366 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:59Z","lastTransitionTime":"2026-03-20T15:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.478588 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.478656 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.478678 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.478743 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.478774 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:59Z","lastTransitionTime":"2026-03-20T15:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.581378 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.581664 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.581743 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.581808 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.581864 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:59Z","lastTransitionTime":"2026-03-20T15:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.684911 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.684947 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.684958 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.684975 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.684986 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:59Z","lastTransitionTime":"2026-03-20T15:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.787585 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.787624 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.787632 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.787645 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.787656 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:59Z","lastTransitionTime":"2026-03-20T15:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.889782 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.889832 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.889847 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.889867 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.889883 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:59Z","lastTransitionTime":"2026-03-20T15:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.993062 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.993156 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.993175 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.993198 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:24:59 crc kubenswrapper[4779]: I0320 15:24:59.993215 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:24:59Z","lastTransitionTime":"2026-03-20T15:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.095910 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.096314 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.096565 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.096820 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.097055 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:00Z","lastTransitionTime":"2026-03-20T15:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.200047 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.200198 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.200218 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.200248 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.200271 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:00Z","lastTransitionTime":"2026-03-20T15:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.306211 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.306329 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.306376 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.306392 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.306407 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:00Z","lastTransitionTime":"2026-03-20T15:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.408955 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.408997 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.409008 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.409027 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.409040 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:00Z","lastTransitionTime":"2026-03-20T15:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.511060 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.511092 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.511099 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.511132 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.511141 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:00Z","lastTransitionTime":"2026-03-20T15:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.614179 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.614248 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.614264 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.614287 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.614306 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:00Z","lastTransitionTime":"2026-03-20T15:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.716955 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.716986 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.716994 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.717008 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.717016 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:00Z","lastTransitionTime":"2026-03-20T15:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.808278 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.808407 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.808485 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:25:00 crc kubenswrapper[4779]: E0320 15:25:00.808431 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.808396 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:25:00 crc kubenswrapper[4779]: E0320 15:25:00.808901 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:25:00 crc kubenswrapper[4779]: E0320 15:25:00.809035 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:25:00 crc kubenswrapper[4779]: E0320 15:25:00.809198 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.819138 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.819183 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.819196 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.819213 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.819226 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:00Z","lastTransitionTime":"2026-03-20T15:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.921322 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.921380 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.921392 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.921408 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:00 crc kubenswrapper[4779]: I0320 15:25:00.921419 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:00Z","lastTransitionTime":"2026-03-20T15:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.024479 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.024519 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.024531 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.024548 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.024559 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:01Z","lastTransitionTime":"2026-03-20T15:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.128090 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.128158 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.128170 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.128188 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.128199 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:01Z","lastTransitionTime":"2026-03-20T15:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.231471 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.231544 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.231558 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.231577 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.231588 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:01Z","lastTransitionTime":"2026-03-20T15:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.335346 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.335950 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.335972 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.335997 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.336016 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:01Z","lastTransitionTime":"2026-03-20T15:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.438863 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.438933 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.438953 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.438983 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.439004 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:01Z","lastTransitionTime":"2026-03-20T15:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.541646 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.541715 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.541733 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.541757 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.541775 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:01Z","lastTransitionTime":"2026-03-20T15:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.644602 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.644666 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.644693 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.644726 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.644756 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:01Z","lastTransitionTime":"2026-03-20T15:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.747963 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.748030 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.748048 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.748073 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.748090 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:01Z","lastTransitionTime":"2026-03-20T15:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.851502 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.851549 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.851607 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.851631 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.851646 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:01Z","lastTransitionTime":"2026-03-20T15:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.954730 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.954778 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.954788 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.954804 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:01 crc kubenswrapper[4779]: I0320 15:25:01.954833 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:01Z","lastTransitionTime":"2026-03-20T15:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.056922 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.056985 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.057003 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.057028 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.057046 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:02Z","lastTransitionTime":"2026-03-20T15:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.160430 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.160468 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.160476 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.160491 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.160536 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:02Z","lastTransitionTime":"2026-03-20T15:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.263968 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.264042 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.264064 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.264094 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.264147 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:02Z","lastTransitionTime":"2026-03-20T15:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.367009 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.367075 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.367092 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.367156 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.367175 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:02Z","lastTransitionTime":"2026-03-20T15:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.469694 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.469725 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.469735 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.469750 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.469762 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:02Z","lastTransitionTime":"2026-03-20T15:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.571641 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.571687 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.571698 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.571715 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.571729 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:02Z","lastTransitionTime":"2026-03-20T15:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.675254 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.675299 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.675310 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.675327 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.675338 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:02Z","lastTransitionTime":"2026-03-20T15:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.777644 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.777705 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.777727 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.777755 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.777775 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:02Z","lastTransitionTime":"2026-03-20T15:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.808416 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.808492 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.808555 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.808446 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:25:02 crc kubenswrapper[4779]: E0320 15:25:02.808650 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:25:02 crc kubenswrapper[4779]: E0320 15:25:02.808845 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:25:02 crc kubenswrapper[4779]: E0320 15:25:02.809011 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:25:02 crc kubenswrapper[4779]: E0320 15:25:02.809207 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.880597 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.880656 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.880676 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.880708 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:02 crc kubenswrapper[4779]: I0320 15:25:02.880725 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:02Z","lastTransitionTime":"2026-03-20T15:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:02.984075 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:02.984165 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:02.984182 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:02.984211 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:02.984232 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:02Z","lastTransitionTime":"2026-03-20T15:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.087416 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.087487 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.087505 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.087529 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.087548 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:03Z","lastTransitionTime":"2026-03-20T15:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.190027 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.190096 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.190152 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.190180 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.190197 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:03Z","lastTransitionTime":"2026-03-20T15:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.292576 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.292645 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.292656 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.292675 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.292686 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:03Z","lastTransitionTime":"2026-03-20T15:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.395321 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.395385 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.395404 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.395433 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.395452 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:03Z","lastTransitionTime":"2026-03-20T15:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.498377 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.498448 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.498473 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.498501 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.498522 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:03Z","lastTransitionTime":"2026-03-20T15:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.600900 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.600973 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.600996 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.601021 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.601040 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:03Z","lastTransitionTime":"2026-03-20T15:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.703685 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.703720 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.703729 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.703744 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.703753 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:03Z","lastTransitionTime":"2026-03-20T15:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:03 crc kubenswrapper[4779]: E0320 15:25:03.804443 4779 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.826539 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a91684c6a1a65d270c4c57f78c24f285f72f56b52857058a1726329853f3eb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:03Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.841724 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:03Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.852593 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec6b43f8768a56cbe8007e92d991e7037f0084f32c4ad8f19b59bf682e7ac3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:03Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.863687 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:03Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:03 crc 
kubenswrapper[4779]: I0320 15:25:03.883588 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f5873d0407b18cdb2c0b462ebadd12ef446c2a9024651a80f5c27176152bc6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f5873d0407b18cdb2c0b462ebadd12ef446c2a9024651a80f5c27176152bc6b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:24:50Z\\\",\\\"message\\\":\\\"Service openshift-config-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-config-operator caae15e3-df99-43f2-b37b-d68e01dce8bd 4093 0 2025-02-23 05:12:35 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:openshift-config-operator] map[include.release.openshift.io/hypershift:true 
include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-secret-name:config-operator-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0002c976f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: openshift-config-operator,},ClusterIP:10.217.4.161,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.161],IPFamilies:[IPv4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bl2w2_openshift-ovn-kubernetes(f27b5011-2d73-40e1-b508-a10e9c6f19a8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41f
fe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:03Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.897620 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7de
e57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:03Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.911378 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:03Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.926164 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b19bd13-f5a9-4c41-9590-8160db6db8e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d5bd3d76eb962b45cb136b7d534542ed0119ce0fe39f12d8099a850fefdf012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8af565427c7f7c9ba5b8239eaa0c17a3ef8dd5ff718c932a40556ab6b7fafbb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:23:36Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 15:23:05.665464 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 15:23:05.667452 1 observer_polling.go:159] Starting file observer\\\\nI0320 15:23:05.695556 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 15:23:05.700879 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 15:23:36.087909 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:35Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1693a9d291973037577c1ddbdce083c7aa03b895f7260b45231b1563e1a97d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f15a7c62edad15bdc2846e3876a24078dc7d1560104e7d6c65204c04a4b2cb09\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7940cf334fe66cc687c69dfff6e9f646c2f0a58c2d6a2e0ca68f19f7c85acfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:03Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:03 crc kubenswrapper[4779]: E0320 15:25:03.929906 4779 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.944418 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\"
 for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:03Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.967012 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c643f492e62e7afcf5f6c5e0617ac37b341cd15574c56c3a930d6ad0fcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/o
vnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae75ac6bc68bfaafe8430d20a29cb0b21e0a62ad4fe4784ad593c0d7bd91352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:03Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.980359 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:03Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:03 crc kubenswrapper[4779]: I0320 15:25:03.992407 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:03Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:04 crc kubenswrapper[4779]: I0320 15:25:04.009194 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e42abb09bc49e78a03681f36955523d117d8bcd3428abfc87bf4e858018c8dd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f
25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-20T15:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:04Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:04 crc kubenswrapper[4779]: I0320 15:25:04.021933 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11cee1edeb3d8a182d0acdc690e3bc259fd87878273b36e3f188837c8b0c7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T15:25:04Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:04 crc kubenswrapper[4779]: I0320 15:25:04.032723 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47f7b00f7a63a5a5c8e3f202de52bb407000c1857930004b53c6402e24ed0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498dce321546a5d876986ec92f47d53655ef83f600c0498878bbe473c526f01f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:04Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:04 crc kubenswrapper[4779]: I0320 15:25:04.045211 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:04Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:04 crc kubenswrapper[4779]: I0320 15:25:04.787742 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:04 crc kubenswrapper[4779]: I0320 15:25:04.787770 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:04 crc kubenswrapper[4779]: I0320 15:25:04.787779 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:04 crc kubenswrapper[4779]: I0320 15:25:04.787794 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:04 crc kubenswrapper[4779]: I0320 15:25:04.787803 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:04Z","lastTransitionTime":"2026-03-20T15:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:25:04 crc kubenswrapper[4779]: E0320 15:25:04.799892 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:04Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:04 crc kubenswrapper[4779]: I0320 15:25:04.804582 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:04 crc kubenswrapper[4779]: I0320 15:25:04.804645 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:04 crc kubenswrapper[4779]: I0320 15:25:04.804657 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:04 crc kubenswrapper[4779]: I0320 15:25:04.804674 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:04 crc kubenswrapper[4779]: I0320 15:25:04.804687 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:04Z","lastTransitionTime":"2026-03-20T15:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:25:04 crc kubenswrapper[4779]: I0320 15:25:04.807732 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:25:04 crc kubenswrapper[4779]: I0320 15:25:04.807763 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:25:04 crc kubenswrapper[4779]: E0320 15:25:04.807823 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:25:04 crc kubenswrapper[4779]: I0320 15:25:04.807733 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:25:04 crc kubenswrapper[4779]: E0320 15:25:04.807984 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:25:04 crc kubenswrapper[4779]: E0320 15:25:04.808021 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:25:04 crc kubenswrapper[4779]: I0320 15:25:04.807746 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:25:04 crc kubenswrapper[4779]: E0320 15:25:04.808576 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:25:04 crc kubenswrapper[4779]: E0320 15:25:04.817416 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:04Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:04 crc kubenswrapper[4779]: I0320 15:25:04.820649 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:04 crc kubenswrapper[4779]: I0320 15:25:04.820682 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:04 crc kubenswrapper[4779]: I0320 15:25:04.820707 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:04 crc kubenswrapper[4779]: I0320 15:25:04.820722 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:04 crc kubenswrapper[4779]: I0320 15:25:04.820730 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:04Z","lastTransitionTime":"2026-03-20T15:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:04 crc kubenswrapper[4779]: E0320 15:25:04.833783 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:04Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:04 crc kubenswrapper[4779]: I0320 15:25:04.836963 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:04 crc kubenswrapper[4779]: I0320 15:25:04.837005 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:04 crc kubenswrapper[4779]: I0320 15:25:04.837017 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:04 crc kubenswrapper[4779]: I0320 15:25:04.837030 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:04 crc kubenswrapper[4779]: I0320 15:25:04.837039 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:04Z","lastTransitionTime":"2026-03-20T15:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:04 crc kubenswrapper[4779]: E0320 15:25:04.848979 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:04Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:04 crc kubenswrapper[4779]: I0320 15:25:04.852475 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:04 crc kubenswrapper[4779]: I0320 15:25:04.852603 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:04 crc kubenswrapper[4779]: I0320 15:25:04.852712 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:04 crc kubenswrapper[4779]: I0320 15:25:04.852806 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:04 crc kubenswrapper[4779]: I0320 15:25:04.852892 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:04Z","lastTransitionTime":"2026-03-20T15:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:04 crc kubenswrapper[4779]: E0320 15:25:04.865289 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:04Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:04 crc kubenswrapper[4779]: E0320 15:25:04.865441 4779 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 15:25:05 crc kubenswrapper[4779]: I0320 15:25:05.819620 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 20 15:25:06 crc kubenswrapper[4779]: I0320 15:25:06.807691 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:25:06 crc kubenswrapper[4779]: I0320 15:25:06.807743 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:25:06 crc kubenswrapper[4779]: E0320 15:25:06.807804 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:25:06 crc kubenswrapper[4779]: E0320 15:25:06.807910 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:25:06 crc kubenswrapper[4779]: I0320 15:25:06.807964 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:25:06 crc kubenswrapper[4779]: E0320 15:25:06.808043 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:25:06 crc kubenswrapper[4779]: I0320 15:25:06.808191 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:25:06 crc kubenswrapper[4779]: E0320 15:25:06.808322 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:25:08 crc kubenswrapper[4779]: I0320 15:25:08.808608 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:25:08 crc kubenswrapper[4779]: I0320 15:25:08.809210 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:25:08 crc kubenswrapper[4779]: I0320 15:25:08.809212 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:25:08 crc kubenswrapper[4779]: I0320 15:25:08.809243 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:25:08 crc kubenswrapper[4779]: E0320 15:25:08.809333 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:25:08 crc kubenswrapper[4779]: E0320 15:25:08.809516 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:25:08 crc kubenswrapper[4779]: E0320 15:25:08.809508 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:25:08 crc kubenswrapper[4779]: E0320 15:25:08.809599 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:25:08 crc kubenswrapper[4779]: I0320 15:25:08.809761 4779 scope.go:117] "RemoveContainer" containerID="7f5873d0407b18cdb2c0b462ebadd12ef446c2a9024651a80f5c27176152bc6b" Mar 20 15:25:08 crc kubenswrapper[4779]: E0320 15:25:08.930689 4779 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 15:25:09 crc kubenswrapper[4779]: I0320 15:25:09.345950 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2w2_f27b5011-2d73-40e1-b508-a10e9c6f19a8/ovnkube-controller/1.log" Mar 20 15:25:09 crc kubenswrapper[4779]: I0320 15:25:09.348473 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" event={"ID":"f27b5011-2d73-40e1-b508-a10e9c6f19a8","Type":"ContainerStarted","Data":"d8a5c4ceb45eb88fe057b951ed6674c8fc077cb22b8543936361d80dc7307d9d"} Mar 20 15:25:09 crc kubenswrapper[4779]: I0320 15:25:09.348934 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:25:09 crc kubenswrapper[4779]: I0320 15:25:09.360375 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:09 crc kubenswrapper[4779]: I0320 15:25:09.370878 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7de
e57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:09 crc kubenswrapper[4779]: I0320 15:25:09.384571 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c643f492e62e7afcf5f6c5e0617ac37b341cd15574c56c3a930d6ad0fcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae75ac6bc68bfaafe8430d20a29cb0b21e0a62ad4fe4784ad593c0d7bd91352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:09 crc kubenswrapper[4779]: I0320 15:25:09.395752 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:09 crc kubenswrapper[4779]: I0320 15:25:09.405808 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:09 crc kubenswrapper[4779]: I0320 15:25:09.418043 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e42abb09bc49e78a03681f36955523d117d8bcd3428abfc87bf4e858018c8dd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f
25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-20T15:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:09 crc kubenswrapper[4779]: I0320 15:25:09.428233 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b19bd13-f5a9-4c41-9590-8160db6db8e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d5bd3d76eb962b45cb136b7d534542ed0119ce0fe39f12d8099a850fefdf012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8af565427c7f7c9ba5b8239eaa0c17a3ef8dd5ff718c932a40556ab6b7fafbb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:23:36Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 15:23:05.665464 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 15:23:05.667452 1 observer_polling.go:159] Starting file observer\\\\nI0320 15:23:05.695556 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 15:23:05.700879 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 15:23:36.087909 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:35Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1693a9d291973037577c1ddbdce083c7aa03b895f7260b45231b1563e1a97d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f15a7c62edad15bdc2846e3876a24078dc7d1560104e7d6c65204c04a4b2cb09\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7940cf334fe66cc687c69dfff6e9f646c2f0a58c2d6a2e0ca68f19f7c85acfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:09 crc kubenswrapper[4779]: I0320 15:25:09.439814 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:09 crc kubenswrapper[4779]: I0320 15:25:09.449166 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:09 crc kubenswrapper[4779]: I0320 15:25:09.461669 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11cee1edeb3d8a182d0acdc690e3bc259fd87878273b36e3f188837c8b0c7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:09 crc kubenswrapper[4779]: I0320 15:25:09.470722 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47f7b00f7a63a5a5c8e3f202de52bb407000c1857930004b53c6402e24ed0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498dce321546a5d876986ec92f47d53655ef83f600c0498878bbe473c526f01f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:09 crc kubenswrapper[4779]: I0320 15:25:09.481993 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:09 crc kubenswrapper[4779]: I0320 15:25:09.490463 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec6b43f8768a56cbe8007e92d991e7037f0084f32c4ad8f19b59bf682e7ac3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:09 crc kubenswrapper[4779]: I0320 15:25:09.498297 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:09 crc 
kubenswrapper[4779]: I0320 15:25:09.513228 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8a5c4ceb45eb88fe057b951ed6674c8fc077cb22b8543936361d80dc7307d9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f5873d0407b18cdb2c0b462ebadd12ef446c2a9024651a80f5c27176152bc6b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:24:50Z\\\",\\\"message\\\":\\\"Service openshift-config-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-config-operator caae15e3-df99-43f2-b37b-d68e01dce8bd 4093 0 2025-02-23 05:12:35 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:openshift-config-operator] map[include.release.openshift.io/hypershift:true 
include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-secret-name:config-operator-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0002c976f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: openshift-config-operator,},ClusterIP:10.217.4.161,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.161],IPFamilies:[IPv4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"n
ame\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\
"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:09 crc kubenswrapper[4779]: I0320 15:25:09.522203 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"704e9aed-e179-4f17-a678-cbd7f420b9c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f952002920726e5f66027fd3a51286ade7cc455a744898b25dd876ddee262838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df476d42816b44acd553ca6e22a6791a58d43799e6ff1299bd22b5ac37a45372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://709e2b897f3f6b02b9677d2324ca04d81ecb5bda684f05f2b3c56c3a0213456f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa138829db36ca4c889841913dbdb4440c6efcdc
78edf45ff1379bbbb9c34ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa138829db36ca4c889841913dbdb4440c6efcdc78edf45ff1379bbbb9c34ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:09 crc kubenswrapper[4779]: I0320 15:25:09.533852 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a91684c6a1a65d270c4c57f78c24f285f72f56b52857058a1726329853f3eb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:09 crc kubenswrapper[4779]: I0320 15:25:09.850677 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:25:09 crc kubenswrapper[4779]: I0320 15:25:09.862908 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:09 crc kubenswrapper[4779]: I0320 15:25:09.873532 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7de
e57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:09 crc kubenswrapper[4779]: I0320 15:25:09.884221 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:09 crc kubenswrapper[4779]: I0320 15:25:09.893868 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:09 crc kubenswrapper[4779]: I0320 15:25:09.907210 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e42abb09bc49e78a03681f36955523d117d8bcd3428abfc87bf4e858018c8dd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f
25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-20T15:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:09 crc kubenswrapper[4779]: I0320 15:25:09.921550 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b19bd13-f5a9-4c41-9590-8160db6db8e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d5bd3d76eb962b45cb136b7d534542ed0119ce0fe39f12d8099a850fefdf012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8af565427c7f7c9ba5b8239eaa0c17a3ef8dd5ff718c932a40556ab6b7fafbb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:23:36Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 15:23:05.665464 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 15:23:05.667452 1 observer_polling.go:159] Starting file observer\\\\nI0320 15:23:05.695556 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 15:23:05.700879 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 15:23:36.087909 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:35Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1693a9d291973037577c1ddbdce083c7aa03b895f7260b45231b1563e1a97d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f15a7c62edad15bdc2846e3876a24078dc7d1560104e7d6c65204c04a4b2cb09\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7940cf334fe66cc687c69dfff6e9f646c2f0a58c2d6a2e0ca68f19f7c85acfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:09 crc kubenswrapper[4779]: I0320 15:25:09.932591 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:09 crc kubenswrapper[4779]: I0320 15:25:09.943183 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c643f492e62e7afcf5f6c5e0617ac37b341cd15574c56c3a930d6ad0fcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://8ae75ac6bc68bfaafe8430d20a29cb0b21e0a62ad4fe4784ad593c0d7bd91352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:09 crc kubenswrapper[4779]: I0320 15:25:09.953872 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:09 crc kubenswrapper[4779]: I0320 15:25:09.964144 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11cee1edeb3d8a182d0acdc690e3bc259fd87878273b36e3f188837c8b0c7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:09 crc kubenswrapper[4779]: I0320 15:25:09.972708 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47f7b00f7a63a5a5c8e3f202de52bb407000c1857930004b53c6402e24ed0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498dce321546a5d876986ec92f47d53655ef83f600c0498878bbe473c526f01f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:09 crc kubenswrapper[4779]: I0320 15:25:09.981882 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec6b43f8768a56cbe8007e92d991e7037f0084f32c4ad8f19b59bf682e7ac3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:09 crc kubenswrapper[4779]: I0320 15:25:09.992836 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:10 crc 
kubenswrapper[4779]: I0320 15:25:10.009816 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8a5c4ceb45eb88fe057b951ed6674c8fc077cb22b8543936361d80dc7307d9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f5873d0407b18cdb2c0b462ebadd12ef446c2a9024651a80f5c27176152bc6b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:24:50Z\\\",\\\"message\\\":\\\"Service openshift-config-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-config-operator caae15e3-df99-43f2-b37b-d68e01dce8bd 4093 0 2025-02-23 05:12:35 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:openshift-config-operator] map[include.release.openshift.io/hypershift:true 
include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-secret-name:config-operator-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0002c976f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: openshift-config-operator,},ClusterIP:10.217.4.161,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.161],IPFamilies:[IPv4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"n
ame\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\
"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:10Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:10 crc kubenswrapper[4779]: I0320 15:25:10.019983 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"704e9aed-e179-4f17-a678-cbd7f420b9c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f952002920726e5f66027fd3a51286ade7cc455a744898b25dd876ddee262838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df476d42816b44acd553ca6e22a6791a58d43799e6ff1299bd22b5ac37a45372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://709e2b897f3f6b02b9677d2324ca04d81ecb5bda684f05f2b3c56c3a0213456f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa138829db36ca4c889841913dbdb4440c6efcdc
78edf45ff1379bbbb9c34ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa138829db36ca4c889841913dbdb4440c6efcdc78edf45ff1379bbbb9c34ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:10Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:10 crc kubenswrapper[4779]: I0320 15:25:10.031796 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a91684c6a1a65d270c4c57f78c24f285f72f56b52857058a1726329853f3eb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e371
16ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:10Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:10 crc kubenswrapper[4779]: I0320 15:25:10.041954 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:10Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:10 crc kubenswrapper[4779]: I0320 15:25:10.355272 4779 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2w2_f27b5011-2d73-40e1-b508-a10e9c6f19a8/ovnkube-controller/2.log" Mar 20 15:25:10 crc kubenswrapper[4779]: I0320 15:25:10.356129 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2w2_f27b5011-2d73-40e1-b508-a10e9c6f19a8/ovnkube-controller/1.log" Mar 20 15:25:10 crc kubenswrapper[4779]: I0320 15:25:10.359985 4779 generic.go:334] "Generic (PLEG): container finished" podID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerID="d8a5c4ceb45eb88fe057b951ed6674c8fc077cb22b8543936361d80dc7307d9d" exitCode=1 Mar 20 15:25:10 crc kubenswrapper[4779]: I0320 15:25:10.360028 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" event={"ID":"f27b5011-2d73-40e1-b508-a10e9c6f19a8","Type":"ContainerDied","Data":"d8a5c4ceb45eb88fe057b951ed6674c8fc077cb22b8543936361d80dc7307d9d"} Mar 20 15:25:10 crc kubenswrapper[4779]: I0320 15:25:10.360065 4779 scope.go:117] "RemoveContainer" containerID="7f5873d0407b18cdb2c0b462ebadd12ef446c2a9024651a80f5c27176152bc6b" Mar 20 15:25:10 crc kubenswrapper[4779]: I0320 15:25:10.365504 4779 scope.go:117] "RemoveContainer" containerID="d8a5c4ceb45eb88fe057b951ed6674c8fc077cb22b8543936361d80dc7307d9d" Mar 20 15:25:10 crc kubenswrapper[4779]: E0320 15:25:10.365980 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bl2w2_openshift-ovn-kubernetes(f27b5011-2d73-40e1-b508-a10e9c6f19a8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" Mar 20 15:25:10 crc kubenswrapper[4779]: I0320 15:25:10.373327 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:10Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:10 crc kubenswrapper[4779]: I0320 15:25:10.388330 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e42abb09bc49e78a03681f36955523d117d8bcd3428abfc87bf4e858018c8dd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f
25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-20T15:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:10Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:10 crc kubenswrapper[4779]: I0320 15:25:10.406344 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b19bd13-f5a9-4c41-9590-8160db6db8e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d5bd3d76eb962b45cb136b7d534542ed0119ce0fe39f12d8099a850fefdf012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8af565427c7f7c9ba5b8239eaa0c17a3ef8dd5ff718c932a40556ab6b7fafbb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:23:36Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 15:23:05.665464 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 15:23:05.667452 1 observer_polling.go:159] Starting file observer\\\\nI0320 15:23:05.695556 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 15:23:05.700879 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 15:23:36.087909 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:35Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1693a9d291973037577c1ddbdce083c7aa03b895f7260b45231b1563e1a97d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f15a7c62edad15bdc2846e3876a24078dc7d1560104e7d6c65204c04a4b2cb09\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7940cf334fe66cc687c69dfff6e9f646c2f0a58c2d6a2e0ca68f19f7c85acfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:10Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:10 crc kubenswrapper[4779]: I0320 15:25:10.421675 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:10Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:10 crc kubenswrapper[4779]: I0320 15:25:10.435079 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c643f492e62e7afcf5f6c5e0617ac37b341cd15574c56c3a930d6ad0fcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://8ae75ac6bc68bfaafe8430d20a29cb0b21e0a62ad4fe4784ad593c0d7bd91352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:10Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:10 crc kubenswrapper[4779]: I0320 15:25:10.446606 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:10Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:10 crc kubenswrapper[4779]: I0320 15:25:10.457742 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:10Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:10 crc kubenswrapper[4779]: I0320 15:25:10.467301 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11cee1edeb3d8a182d0acdc690e3bc259fd87878273b36e3f188837c8b0c7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T15:25:10Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:10 crc kubenswrapper[4779]: I0320 15:25:10.476262 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47f7b00f7a63a5a5c8e3f202de52bb407000c1857930004b53c6402e24ed0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498dce321546a5d876986ec92f47d53655ef83f600c0498878bbe473c526f01f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:10Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:10 crc kubenswrapper[4779]: I0320 15:25:10.484133 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:10Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:10 crc 
kubenswrapper[4779]: I0320 15:25:10.498229 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8a5c4ceb45eb88fe057b951ed6674c8fc077cb22b8543936361d80dc7307d9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f5873d0407b18cdb2c0b462ebadd12ef446c2a9024651a80f5c27176152bc6b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:24:50Z\\\",\\\"message\\\":\\\"Service openshift-config-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-config-operator caae15e3-df99-43f2-b37b-d68e01dce8bd 4093 0 2025-02-23 05:12:35 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:openshift-config-operator] map[include.release.openshift.io/hypershift:true 
include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-secret-name:config-operator-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0002c976f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: openshift-config-operator,},ClusterIP:10.217.4.161,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.161],IPFamilies:[IPv4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8a5c4ceb45eb88fe057b951ed6674c8fc077cb22b8543936361d80dc7307d9d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:25:09Z\\\",\\\"message\\\":\\\"nil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 15:25:09.597241 7038 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 15:25:09.597415 7038 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 
2025-08-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6
f3874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:10Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:10 crc kubenswrapper[4779]: I0320 15:25:10.508203 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"704e9aed-e179-4f17-a678-cbd7f420b9c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f952002920726e5f66027fd3a51286ade7cc455a744898b25dd876ddee262838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df476d42816b44acd553ca6e22a6791a58d43799e6ff1299bd22b5ac37a45372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://709e2b897f3f6b02b9677d2324ca04d81ecb5bda684f05f2b3c56c3a0213456f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa138829db36ca4c889841913dbdb4440c6efcdc78edf45ff1379bbbb9c34ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3aa138829db36ca4c889841913dbdb4440c6efcdc78edf45ff1379bbbb9c34ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:10Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:10 crc kubenswrapper[4779]: I0320 15:25:10.519399 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a91684c6a1a65d270c4c57f78c24f285f72f56b52857058a1726329853f3eb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e371
16ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:10Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:10 crc kubenswrapper[4779]: I0320 15:25:10.529747 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:10Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:10 crc kubenswrapper[4779]: I0320 15:25:10.539339 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec6b43f8768a56cbe8007e92d991e7037f0084f32c4ad8f19b59bf682e7ac3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:10Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:10 crc kubenswrapper[4779]: I0320 15:25:10.549931 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:10Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:10 crc kubenswrapper[4779]: I0320 15:25:10.558715 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7de
e57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:10Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:10 crc kubenswrapper[4779]: I0320 15:25:10.808690 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:25:10 crc kubenswrapper[4779]: I0320 15:25:10.808720 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:25:10 crc kubenswrapper[4779]: E0320 15:25:10.808816 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:25:10 crc kubenswrapper[4779]: I0320 15:25:10.808690 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:25:10 crc kubenswrapper[4779]: E0320 15:25:10.808912 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:25:10 crc kubenswrapper[4779]: I0320 15:25:10.809053 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:25:10 crc kubenswrapper[4779]: E0320 15:25:10.809228 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:25:10 crc kubenswrapper[4779]: E0320 15:25:10.809453 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:25:11 crc kubenswrapper[4779]: I0320 15:25:11.364862 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2w2_f27b5011-2d73-40e1-b508-a10e9c6f19a8/ovnkube-controller/2.log" Mar 20 15:25:11 crc kubenswrapper[4779]: I0320 15:25:11.368414 4779 scope.go:117] "RemoveContainer" containerID="d8a5c4ceb45eb88fe057b951ed6674c8fc077cb22b8543936361d80dc7307d9d" Mar 20 15:25:11 crc kubenswrapper[4779]: E0320 15:25:11.368559 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bl2w2_openshift-ovn-kubernetes(f27b5011-2d73-40e1-b508-a10e9c6f19a8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" Mar 20 15:25:11 crc kubenswrapper[4779]: I0320 15:25:11.387514 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:11Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:11 crc kubenswrapper[4779]: I0320 15:25:11.399238 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11cee1edeb3d8a182d0acdc690e3bc259fd87878273b36e3f188837c8b0c7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T15:25:11Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:11 crc kubenswrapper[4779]: I0320 15:25:11.408838 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47f7b00f7a63a5a5c8e3f202de52bb407000c1857930004b53c6402e24ed0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498dce321546a5d876986ec92f47d53655ef83f600c0498878bbe473c526f01f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:11Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:11 crc kubenswrapper[4779]: I0320 15:25:11.419922 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:11Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:11 crc 
kubenswrapper[4779]: I0320 15:25:11.436906 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8a5c4ceb45eb88fe057b951ed6674c8fc077cb22b8543936361d80dc7307d9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8a5c4ceb45eb88fe057b951ed6674c8fc077cb22b8543936361d80dc7307d9d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:25:09Z\\\",\\\"message\\\":\\\"nil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 15:25:09.597241 7038 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] 
Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 15:25:09.597415 7038 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 2025-08-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:25:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bl2w2_openshift-ovn-kubernetes(f27b5011-2d73-40e1-b508-a10e9c6f19a8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41f
fe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:11Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:11 crc kubenswrapper[4779]: I0320 15:25:11.449895 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"704e9aed-e179-4f17-a678-cbd7f420b9c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f952002920726e5f66027fd3a51286ade7cc455a744898b25dd876ddee262838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df476d42816b44acd553ca6e22a6791a58d43799e6ff1299bd22b5ac37a45372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://709e2b897f3f6b02b9677d2324ca04d81ecb5bda684f05f2b3c56c3a0213456f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa138829db36ca4c889841913dbdb4440c6efcdc78edf45ff1379bbbb9c34ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3aa138829db36ca4c889841913dbdb4440c6efcdc78edf45ff1379bbbb9c34ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:11Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:11 crc kubenswrapper[4779]: I0320 15:25:11.464222 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a91684c6a1a65d270c4c57f78c24f285f72f56b52857058a1726329853f3eb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e371
16ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:11Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:11 crc kubenswrapper[4779]: I0320 15:25:11.477059 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:11Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:11 crc kubenswrapper[4779]: I0320 15:25:11.488390 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec6b43f8768a56cbe8007e92d991e7037f0084f32c4ad8f19b59bf682e7ac3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:11Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:11 crc kubenswrapper[4779]: I0320 15:25:11.501343 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:11Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:11 crc kubenswrapper[4779]: I0320 15:25:11.512399 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7de
e57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:11Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:11 crc kubenswrapper[4779]: I0320 15:25:11.523238 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:11Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:11 crc kubenswrapper[4779]: I0320 15:25:11.537880 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e42abb09bc49e78a03681f36955523d117d8bcd3428abfc87bf4e858018c8dd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f
25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-20T15:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:11Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:11 crc kubenswrapper[4779]: I0320 15:25:11.550826 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b19bd13-f5a9-4c41-9590-8160db6db8e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d5bd3d76eb962b45cb136b7d534542ed0119ce0fe39f12d8099a850fefdf012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8af565427c7f7c9ba5b8239eaa0c17a3ef8dd5ff718c932a40556ab6b7fafbb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:23:36Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 15:23:05.665464 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 15:23:05.667452 1 observer_polling.go:159] Starting file observer\\\\nI0320 15:23:05.695556 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 15:23:05.700879 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 15:23:36.087909 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:35Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1693a9d291973037577c1ddbdce083c7aa03b895f7260b45231b1563e1a97d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f15a7c62edad15bdc2846e3876a24078dc7d1560104e7d6c65204c04a4b2cb09\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7940cf334fe66cc687c69dfff6e9f646c2f0a58c2d6a2e0ca68f19f7c85acfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:11Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:11 crc kubenswrapper[4779]: I0320 15:25:11.566550 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:11Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:11 crc kubenswrapper[4779]: I0320 15:25:11.579280 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c643f492e62e7afcf5f6c5e0617ac37b341cd15574c56c3a930d6ad0fcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://8ae75ac6bc68bfaafe8430d20a29cb0b21e0a62ad4fe4784ad593c0d7bd91352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:11Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:11 crc kubenswrapper[4779]: I0320 15:25:11.590876 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:11Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:12 crc kubenswrapper[4779]: I0320 15:25:12.808068 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:25:12 crc kubenswrapper[4779]: I0320 15:25:12.808137 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:25:12 crc kubenswrapper[4779]: E0320 15:25:12.808208 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:25:12 crc kubenswrapper[4779]: I0320 15:25:12.808244 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:25:12 crc kubenswrapper[4779]: E0320 15:25:12.808357 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:25:12 crc kubenswrapper[4779]: E0320 15:25:12.808415 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:25:12 crc kubenswrapper[4779]: I0320 15:25:12.808422 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:25:12 crc kubenswrapper[4779]: E0320 15:25:12.808682 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:25:13 crc kubenswrapper[4779]: I0320 15:25:13.821306 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"704e9aed-e179-4f17-a678-cbd7f420b9c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f952002920726e5f66027fd3a51286ade7cc455a744898b25dd876ddee262838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://df476d42816b44acd553ca6e22a6791a58d43799e6ff1299bd22b5ac37a45372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://709e2b897f3f6b02b9677d2324ca04d81ecb5bda684f05f2b3c56c3a0213456f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa138829db36ca4c889841913dbdb4440c6efcdc78edf45ff1379bbbb9c34ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa138829db36ca4c889841913dbdb4440c6efcdc78edf45ff1379bbbb9c34ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:13Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:13 crc kubenswrapper[4779]: I0320 15:25:13.834260 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a91684c6a1a65d270c4c57f78c24f285f72f56b52857058a1726329853f3eb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e371
16ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:13Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:13 crc kubenswrapper[4779]: I0320 15:25:13.847765 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:13Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:13 crc kubenswrapper[4779]: I0320 15:25:13.857886 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec6b43f8768a56cbe8007e92d991e7037f0084f32c4ad8f19b59bf682e7ac3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:13Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:13 crc kubenswrapper[4779]: I0320 15:25:13.866423 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:13Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:13 crc 
kubenswrapper[4779]: I0320 15:25:13.881868 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8a5c4ceb45eb88fe057b951ed6674c8fc077cb22b8543936361d80dc7307d9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8a5c4ceb45eb88fe057b951ed6674c8fc077cb22b8543936361d80dc7307d9d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:25:09Z\\\",\\\"message\\\":\\\"nil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 15:25:09.597241 7038 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] 
Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 15:25:09.597415 7038 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 2025-08-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:25:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bl2w2_openshift-ovn-kubernetes(f27b5011-2d73-40e1-b508-a10e9c6f19a8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41f
fe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:13Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:13 crc kubenswrapper[4779]: I0320 15:25:13.895266 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:13Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:13 crc kubenswrapper[4779]: I0320 15:25:13.905721 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7de
e57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:13Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:13 crc kubenswrapper[4779]: I0320 15:25:13.916629 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b19bd13-f5a9-4c41-9590-8160db6db8e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d5bd3d76eb962b45cb136b7d534542ed0119ce0fe39f12d8099a850fefdf012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8af565427c7f7c9ba5b8239eaa0c17a3ef8dd5ff718c932a40556ab6b7fafbb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:23:36Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 15:23:05.665464 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 15:23:05.667452 1 observer_polling.go:159] Starting file observer\\\\nI0320 15:23:05.695556 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 15:23:05.700879 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 15:23:36.087909 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:35Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1693a9d291973037577c1ddbdce083c7aa03b895f7260b45231b1563e1a97d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f15a7c62edad15bdc2846e3876a24078dc7d1560104e7d6c65204c04a4b2cb09\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7940cf334fe66cc687c69dfff6e9f646c2f0a58c2d6a2e0ca68f19f7c85acfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:13Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:13 crc kubenswrapper[4779]: I0320 15:25:13.928468 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:13Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:13 crc kubenswrapper[4779]: E0320 15:25:13.931042 4779 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 15:25:13 crc kubenswrapper[4779]: I0320 15:25:13.945643 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c643f492e62e7afcf5f6c5e0617ac37b341cd15574c56c3a930d6ad0fcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\
\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae75ac6bc68bfaafe8430d20a29cb0b21e0a62ad4fe4784ad593c0d7bd91352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:13Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:13 crc kubenswrapper[4779]: I0320 15:25:13.958195 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:13Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:13 crc kubenswrapper[4779]: I0320 15:25:13.968215 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:13Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:13 crc kubenswrapper[4779]: I0320 15:25:13.981763 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e42abb09bc49e78a03681f36955523d117d8bcd3428abfc87bf4e858018c8dd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f
25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-20T15:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:13Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:13 crc kubenswrapper[4779]: I0320 15:25:13.992016 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:13Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:14 crc kubenswrapper[4779]: I0320 15:25:14.001497 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11cee1edeb3d8a182d0acdc690e3bc259fd87878273b36e3f188837c8b0c7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T15:25:14Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:14 crc kubenswrapper[4779]: I0320 15:25:14.010468 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47f7b00f7a63a5a5c8e3f202de52bb407000c1857930004b53c6402e24ed0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498dce321546a5d876986ec92f47d53655ef83f600c0498878bbe473c526f01f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:14Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:14 crc kubenswrapper[4779]: I0320 15:25:14.808452 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:25:14 crc kubenswrapper[4779]: I0320 15:25:14.808485 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:25:14 crc kubenswrapper[4779]: I0320 15:25:14.808499 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:25:14 crc kubenswrapper[4779]: E0320 15:25:14.808581 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:25:14 crc kubenswrapper[4779]: I0320 15:25:14.808680 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:25:14 crc kubenswrapper[4779]: E0320 15:25:14.808768 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:25:14 crc kubenswrapper[4779]: E0320 15:25:14.808820 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:25:14 crc kubenswrapper[4779]: E0320 15:25:14.808905 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:25:15 crc kubenswrapper[4779]: I0320 15:25:15.042989 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:15 crc kubenswrapper[4779]: I0320 15:25:15.043022 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:15 crc kubenswrapper[4779]: I0320 15:25:15.043030 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:15 crc kubenswrapper[4779]: I0320 15:25:15.043041 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:15 crc kubenswrapper[4779]: I0320 15:25:15.043051 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:15Z","lastTransitionTime":"2026-03-20T15:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:15 crc kubenswrapper[4779]: E0320 15:25:15.053878 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:15Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:15 crc kubenswrapper[4779]: I0320 15:25:15.057492 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:15 crc kubenswrapper[4779]: I0320 15:25:15.057571 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:15 crc kubenswrapper[4779]: I0320 15:25:15.057590 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:15 crc kubenswrapper[4779]: I0320 15:25:15.057615 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:15 crc kubenswrapper[4779]: I0320 15:25:15.057633 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:15Z","lastTransitionTime":"2026-03-20T15:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:15 crc kubenswrapper[4779]: E0320 15:25:15.070564 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:15Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:15 crc kubenswrapper[4779]: I0320 15:25:15.074176 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:15 crc kubenswrapper[4779]: I0320 15:25:15.074209 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:15 crc kubenswrapper[4779]: I0320 15:25:15.074218 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:15 crc kubenswrapper[4779]: I0320 15:25:15.074232 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:15 crc kubenswrapper[4779]: I0320 15:25:15.074240 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:15Z","lastTransitionTime":"2026-03-20T15:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:15 crc kubenswrapper[4779]: E0320 15:25:15.086579 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:15Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:15 crc kubenswrapper[4779]: I0320 15:25:15.090894 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:15 crc kubenswrapper[4779]: I0320 15:25:15.090931 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:15 crc kubenswrapper[4779]: I0320 15:25:15.090943 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:15 crc kubenswrapper[4779]: I0320 15:25:15.090961 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:15 crc kubenswrapper[4779]: I0320 15:25:15.090972 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:15Z","lastTransitionTime":"2026-03-20T15:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:15 crc kubenswrapper[4779]: E0320 15:25:15.102469 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:15Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:15 crc kubenswrapper[4779]: I0320 15:25:15.105771 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:15 crc kubenswrapper[4779]: I0320 15:25:15.105809 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:15 crc kubenswrapper[4779]: I0320 15:25:15.105818 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:15 crc kubenswrapper[4779]: I0320 15:25:15.105831 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:15 crc kubenswrapper[4779]: I0320 15:25:15.105842 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:15Z","lastTransitionTime":"2026-03-20T15:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:15 crc kubenswrapper[4779]: E0320 15:25:15.118029 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:15Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:15 crc kubenswrapper[4779]: E0320 15:25:15.118220 4779 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 15:25:16 crc kubenswrapper[4779]: I0320 15:25:16.808046 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:25:16 crc kubenswrapper[4779]: I0320 15:25:16.808070 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:25:16 crc kubenswrapper[4779]: E0320 15:25:16.809037 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:25:16 crc kubenswrapper[4779]: I0320 15:25:16.808157 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:25:16 crc kubenswrapper[4779]: E0320 15:25:16.809126 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:25:16 crc kubenswrapper[4779]: I0320 15:25:16.808089 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:25:16 crc kubenswrapper[4779]: E0320 15:25:16.809775 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:25:16 crc kubenswrapper[4779]: E0320 15:25:16.809409 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:25:18 crc kubenswrapper[4779]: I0320 15:25:18.808818 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:25:18 crc kubenswrapper[4779]: I0320 15:25:18.808892 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:25:18 crc kubenswrapper[4779]: I0320 15:25:18.808854 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:25:18 crc kubenswrapper[4779]: E0320 15:25:18.809087 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:25:18 crc kubenswrapper[4779]: E0320 15:25:18.809179 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:25:18 crc kubenswrapper[4779]: E0320 15:25:18.809260 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:25:18 crc kubenswrapper[4779]: I0320 15:25:18.809653 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:25:18 crc kubenswrapper[4779]: E0320 15:25:18.809851 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:25:18 crc kubenswrapper[4779]: E0320 15:25:18.932604 4779 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 15:25:20 crc kubenswrapper[4779]: I0320 15:25:20.807824 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:25:20 crc kubenswrapper[4779]: E0320 15:25:20.808065 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:25:20 crc kubenswrapper[4779]: I0320 15:25:20.808212 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:25:20 crc kubenswrapper[4779]: I0320 15:25:20.808268 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:25:20 crc kubenswrapper[4779]: E0320 15:25:20.808408 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:25:20 crc kubenswrapper[4779]: I0320 15:25:20.808514 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:25:20 crc kubenswrapper[4779]: E0320 15:25:20.808542 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:25:20 crc kubenswrapper[4779]: E0320 15:25:20.808633 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:25:21 crc kubenswrapper[4779]: I0320 15:25:21.902884 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:25:21 crc kubenswrapper[4779]: I0320 15:25:21.903031 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:25:21 crc kubenswrapper[4779]: E0320 15:25:21.903093 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:25:21 crc kubenswrapper[4779]: E0320 15:25:21.915248 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:25:22 crc kubenswrapper[4779]: I0320 15:25:22.808065 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:25:22 crc kubenswrapper[4779]: I0320 15:25:22.808244 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:25:22 crc kubenswrapper[4779]: E0320 15:25:22.808273 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:25:22 crc kubenswrapper[4779]: E0320 15:25:22.808465 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:25:23 crc kubenswrapper[4779]: I0320 15:25:23.808762 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:25:23 crc kubenswrapper[4779]: I0320 15:25:23.809178 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:25:23 crc kubenswrapper[4779]: E0320 15:25:23.809289 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:25:23 crc kubenswrapper[4779]: E0320 15:25:23.809513 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:25:23 crc kubenswrapper[4779]: I0320 15:25:23.809594 4779 scope.go:117] "RemoveContainer" containerID="d8a5c4ceb45eb88fe057b951ed6674c8fc077cb22b8543936361d80dc7307d9d" Mar 20 15:25:23 crc kubenswrapper[4779]: E0320 15:25:23.809795 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bl2w2_openshift-ovn-kubernetes(f27b5011-2d73-40e1-b508-a10e9c6f19a8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" Mar 20 15:25:23 crc kubenswrapper[4779]: I0320 15:25:23.823582 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e
6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:23Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:23 crc kubenswrapper[4779]: I0320 15:25:23.839677 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec6b43f8768a56cbe8007e92d991e7037f0084f32c4ad8f19b59bf682e7ac
3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:23Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:23 crc kubenswrapper[4779]: I0320 15:25:23.854745 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:23Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:23 crc 
kubenswrapper[4779]: I0320 15:25:23.885716 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8a5c4ceb45eb88fe057b951ed6674c8fc077cb22b8543936361d80dc7307d9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8a5c4ceb45eb88fe057b951ed6674c8fc077cb22b8543936361d80dc7307d9d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:25:09Z\\\",\\\"message\\\":\\\"nil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 15:25:09.597241 7038 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] 
Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 15:25:09.597415 7038 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 2025-08-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:25:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bl2w2_openshift-ovn-kubernetes(f27b5011-2d73-40e1-b508-a10e9c6f19a8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41f
fe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:23Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:23 crc kubenswrapper[4779]: I0320 15:25:23.904151 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"704e9aed-e179-4f17-a678-cbd7f420b9c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f952002920726e5f66027fd3a51286ade7cc455a744898b25dd876ddee262838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df476d42816b44acd553ca6e22a6791a58d43799e6ff1299bd22b5ac37a45372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://709e2b897f3f6b02b9677d2324ca04d81ecb5bda684f05f2b3c56c3a0213456f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa138829db36ca4c889841913dbdb4440c6efcdc78edf45ff1379bbbb9c34ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3aa138829db36ca4c889841913dbdb4440c6efcdc78edf45ff1379bbbb9c34ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:23Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:23 crc kubenswrapper[4779]: I0320 15:25:23.922726 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a91684c6a1a65d270c4c57f78c24f285f72f56b52857058a1726329853f3eb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e371
16ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:23Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:23 crc kubenswrapper[4779]: E0320 15:25:23.933299 4779 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 15:25:23 crc kubenswrapper[4779]: I0320 15:25:23.945628 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:23Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:23 crc kubenswrapper[4779]: I0320 15:25:23.963513 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7de
e57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:23Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:23 crc kubenswrapper[4779]: I0320 15:25:23.979930 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c643f492e62e7afcf5f6c5e0617ac37b341cd15574c56c3a930d6ad0fcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae75ac6bc68bfaafe8430d20a29cb0b21e0a62ad4fe4784ad593c0d7bd91352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:23Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:24 crc kubenswrapper[4779]: I0320 15:25:24.010590 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:24Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:24 crc kubenswrapper[4779]: I0320 15:25:24.034096 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:24Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:24 crc kubenswrapper[4779]: I0320 15:25:24.061827 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e42abb09bc49e78a03681f36955523d117d8bcd3428abfc87bf4e858018c8dd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f
25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-20T15:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:24Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:24 crc kubenswrapper[4779]: I0320 15:25:24.076165 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b19bd13-f5a9-4c41-9590-8160db6db8e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d5bd3d76eb962b45cb136b7d534542ed0119ce0fe39f12d8099a850fefdf012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8af565427c7f7c9ba5b8239eaa0c17a3ef8dd5ff718c932a40556ab6b7fafbb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:23:36Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 15:23:05.665464 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 15:23:05.667452 1 observer_polling.go:159] Starting file observer\\\\nI0320 15:23:05.695556 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 15:23:05.700879 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 15:23:36.087909 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:35Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1693a9d291973037577c1ddbdce083c7aa03b895f7260b45231b1563e1a97d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f15a7c62edad15bdc2846e3876a24078dc7d1560104e7d6c65204c04a4b2cb09\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7940cf334fe66cc687c69dfff6e9f646c2f0a58c2d6a2e0ca68f19f7c85acfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:24Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:24 crc kubenswrapper[4779]: I0320 15:25:24.088106 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:24Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:24 crc kubenswrapper[4779]: I0320 15:25:24.098517 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:24Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:24 crc kubenswrapper[4779]: I0320 15:25:24.107488 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11cee1edeb3d8a182d0acdc690e3bc259fd87878273b36e3f188837c8b0c7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T15:25:24Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:24 crc kubenswrapper[4779]: I0320 15:25:24.116933 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47f7b00f7a63a5a5c8e3f202de52bb407000c1857930004b53c6402e24ed0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498dce321546a5d876986ec92f47d53655ef83f600c0498878bbe473c526f01f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:24Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:24 crc kubenswrapper[4779]: I0320 15:25:24.808534 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:25:24 crc kubenswrapper[4779]: I0320 15:25:24.808613 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:25:24 crc kubenswrapper[4779]: E0320 15:25:24.808724 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:25:24 crc kubenswrapper[4779]: E0320 15:25:24.808845 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:25:25 crc kubenswrapper[4779]: I0320 15:25:25.309591 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:25 crc kubenswrapper[4779]: I0320 15:25:25.309641 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:25 crc kubenswrapper[4779]: I0320 15:25:25.309649 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:25 crc kubenswrapper[4779]: I0320 15:25:25.309662 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:25 crc kubenswrapper[4779]: I0320 15:25:25.309696 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:25Z","lastTransitionTime":"2026-03-20T15:25:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:25 crc kubenswrapper[4779]: E0320 15:25:25.320602 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:25Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:25 crc kubenswrapper[4779]: I0320 15:25:25.323800 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:25 crc kubenswrapper[4779]: I0320 15:25:25.323846 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:25 crc kubenswrapper[4779]: I0320 15:25:25.323856 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:25 crc kubenswrapper[4779]: I0320 15:25:25.323871 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:25 crc kubenswrapper[4779]: I0320 15:25:25.323881 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:25Z","lastTransitionTime":"2026-03-20T15:25:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:25 crc kubenswrapper[4779]: E0320 15:25:25.334871 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:25Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:25 crc kubenswrapper[4779]: I0320 15:25:25.337790 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:25 crc kubenswrapper[4779]: I0320 15:25:25.337835 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:25 crc kubenswrapper[4779]: I0320 15:25:25.337845 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:25 crc kubenswrapper[4779]: I0320 15:25:25.337860 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:25 crc kubenswrapper[4779]: I0320 15:25:25.337869 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:25Z","lastTransitionTime":"2026-03-20T15:25:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:25 crc kubenswrapper[4779]: E0320 15:25:25.348824 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:25Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:25 crc kubenswrapper[4779]: I0320 15:25:25.351811 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:25 crc kubenswrapper[4779]: I0320 15:25:25.351849 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:25 crc kubenswrapper[4779]: I0320 15:25:25.351861 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:25 crc kubenswrapper[4779]: I0320 15:25:25.351876 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:25 crc kubenswrapper[4779]: I0320 15:25:25.351884 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:25Z","lastTransitionTime":"2026-03-20T15:25:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:25 crc kubenswrapper[4779]: E0320 15:25:25.362913 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:25Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:25 crc kubenswrapper[4779]: I0320 15:25:25.365514 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:25 crc kubenswrapper[4779]: I0320 15:25:25.365552 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:25 crc kubenswrapper[4779]: I0320 15:25:25.365563 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:25 crc kubenswrapper[4779]: I0320 15:25:25.365578 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:25 crc kubenswrapper[4779]: I0320 15:25:25.365589 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:25Z","lastTransitionTime":"2026-03-20T15:25:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:25 crc kubenswrapper[4779]: E0320 15:25:25.376234 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:25Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:25 crc kubenswrapper[4779]: E0320 15:25:25.376347 4779 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 15:25:25 crc kubenswrapper[4779]: I0320 15:25:25.808668 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:25:25 crc kubenswrapper[4779]: I0320 15:25:25.808713 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:25:25 crc kubenswrapper[4779]: E0320 15:25:25.808813 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:25:25 crc kubenswrapper[4779]: E0320 15:25:25.808916 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:25:26 crc kubenswrapper[4779]: I0320 15:25:26.808288 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:25:26 crc kubenswrapper[4779]: I0320 15:25:26.808422 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:25:26 crc kubenswrapper[4779]: E0320 15:25:26.808543 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:25:26 crc kubenswrapper[4779]: E0320 15:25:26.808820 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:25:26 crc kubenswrapper[4779]: I0320 15:25:26.818859 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 20 15:25:27 crc kubenswrapper[4779]: I0320 15:25:27.808385 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:25:27 crc kubenswrapper[4779]: I0320 15:25:27.808418 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:25:27 crc kubenswrapper[4779]: E0320 15:25:27.808633 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:25:27 crc kubenswrapper[4779]: E0320 15:25:27.808742 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:25:28 crc kubenswrapper[4779]: I0320 15:25:28.808954 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:25:28 crc kubenswrapper[4779]: I0320 15:25:28.809059 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:25:28 crc kubenswrapper[4779]: E0320 15:25:28.809288 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:25:28 crc kubenswrapper[4779]: E0320 15:25:28.809450 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:25:28 crc kubenswrapper[4779]: I0320 15:25:28.816468 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:25:28 crc kubenswrapper[4779]: I0320 15:25:28.816709 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:25:28 crc kubenswrapper[4779]: E0320 15:25:28.816753 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:32.816707701 +0000 UTC m=+209.779223551 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:25:28 crc kubenswrapper[4779]: I0320 15:25:28.816813 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:25:28 crc kubenswrapper[4779]: E0320 15:25:28.816906 4779 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 15:25:28 crc kubenswrapper[4779]: I0320 15:25:28.816925 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:25:28 crc kubenswrapper[4779]: E0320 15:25:28.816997 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 15:26:32.816971368 +0000 UTC m=+209.779487208 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 15:25:28 crc kubenswrapper[4779]: E0320 15:25:28.817092 4779 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 15:25:28 crc kubenswrapper[4779]: E0320 15:25:28.817149 4779 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 15:25:28 crc kubenswrapper[4779]: E0320 15:25:28.817269 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 15:26:32.817245907 +0000 UTC m=+209.779761737 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 15:25:28 crc kubenswrapper[4779]: E0320 15:25:28.817177 4779 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 15:25:28 crc kubenswrapper[4779]: E0320 15:25:28.817325 4779 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 15:25:28 crc kubenswrapper[4779]: E0320 15:25:28.817421 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 15:26:32.817389171 +0000 UTC m=+209.779905171 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 15:25:28 crc kubenswrapper[4779]: I0320 15:25:28.917620 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44bdd151-2a1e-4f14-a095-81b541307138-metrics-certs\") pod \"network-metrics-daemon-l4gtx\" (UID: \"44bdd151-2a1e-4f14-a095-81b541307138\") " pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:25:28 crc kubenswrapper[4779]: I0320 15:25:28.917690 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:25:28 crc kubenswrapper[4779]: E0320 15:25:28.917843 4779 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 15:25:28 crc kubenswrapper[4779]: E0320 15:25:28.917864 4779 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 15:25:28 crc kubenswrapper[4779]: E0320 15:25:28.917876 4779 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 15:25:28 crc kubenswrapper[4779]: E0320 15:25:28.917873 4779 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 15:25:28 crc kubenswrapper[4779]: E0320 15:25:28.917931 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 15:26:32.91791731 +0000 UTC m=+209.880433110 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 15:25:28 crc kubenswrapper[4779]: E0320 15:25:28.918011 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44bdd151-2a1e-4f14-a095-81b541307138-metrics-certs podName:44bdd151-2a1e-4f14-a095-81b541307138 nodeName:}" failed. No retries permitted until 2026-03-20 15:26:32.917976822 +0000 UTC m=+209.880492662 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/44bdd151-2a1e-4f14-a095-81b541307138-metrics-certs") pod "network-metrics-daemon-l4gtx" (UID: "44bdd151-2a1e-4f14-a095-81b541307138") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 15:25:28 crc kubenswrapper[4779]: E0320 15:25:28.935522 4779 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 15:25:29 crc kubenswrapper[4779]: I0320 15:25:29.808147 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:25:29 crc kubenswrapper[4779]: I0320 15:25:29.808274 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:25:29 crc kubenswrapper[4779]: E0320 15:25:29.808346 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:25:29 crc kubenswrapper[4779]: E0320 15:25:29.808430 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:25:30 crc kubenswrapper[4779]: I0320 15:25:30.446223 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lfj25_c30ee189-9db1-41af-8a55-29955cbf6712/kube-multus/0.log" Mar 20 15:25:30 crc kubenswrapper[4779]: I0320 15:25:30.446296 4779 generic.go:334] "Generic (PLEG): container finished" podID="c30ee189-9db1-41af-8a55-29955cbf6712" containerID="e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6" exitCode=1 Mar 20 15:25:30 crc kubenswrapper[4779]: I0320 15:25:30.446343 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lfj25" event={"ID":"c30ee189-9db1-41af-8a55-29955cbf6712","Type":"ContainerDied","Data":"e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6"} Mar 20 15:25:30 crc kubenswrapper[4779]: I0320 15:25:30.446949 4779 scope.go:117] "RemoveContainer" containerID="e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6" Mar 20 15:25:30 crc kubenswrapper[4779]: I0320 15:25:30.459312 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11cee1edeb3d8a182d0acdc690e3bc259fd87878273b36e3f188837c8b0c7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T15:25:30Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:30 crc kubenswrapper[4779]: I0320 15:25:30.470026 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47f7b00f7a63a5a5c8e3f202de52bb407000c1857930004b53c6402e24ed0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498dce321546a5d876986ec92f47d53655ef83f600c0498878bbe473c526f01f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:30Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:30 crc kubenswrapper[4779]: I0320 15:25:30.485846 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:30Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:30 crc kubenswrapper[4779]: I0320 15:25:30.497435 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"704e9aed-e179-4f17-a678-cbd7f420b9c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f952002920726e5f66027fd3a51286ade7cc455a744898b25dd876ddee262838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df476d42816b44acd553ca6e22a6791a58d43799e6ff1299bd22b5ac37a45372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://709e2b897f3f6b02b9677d2324ca04d81ecb5bda684f05f2b3c56c3a0213456f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa138829db36ca4c889841913dbdb4440c6efcdc78edf45ff1379bbbb9c34ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3aa138829db36ca4c889841913dbdb4440c6efcdc78edf45ff1379bbbb9c34ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:30Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:30 crc kubenswrapper[4779]: I0320 15:25:30.512862 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a91684c6a1a65d270c4c57f78c24f285f72f56b52857058a1726329853f3eb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e371
16ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:30Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:30 crc kubenswrapper[4779]: I0320 15:25:30.525620 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:25:29Z\\\",\\\"message\\\":\\\"2026-03-20T15:24:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_182598d0-e43b-4dfe-9fb4-f5b52b2212bb\\\\n2026-03-20T15:24:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_182598d0-e43b-4dfe-9fb4-f5b52b2212bb to /host/opt/cni/bin/\\\\n2026-03-20T15:24:44Z [verbose] multus-daemon started\\\\n2026-03-20T15:24:44Z [verbose] Readiness Indicator file check\\\\n2026-03-20T15:25:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:30Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:30 crc kubenswrapper[4779]: I0320 15:25:30.535982 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec6b43f8768a56cbe8007e92d991e7037f0084f32c4ad8f19b59bf682e7ac3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:30Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:30 crc kubenswrapper[4779]: I0320 15:25:30.545095 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:30Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:30 crc 
kubenswrapper[4779]: I0320 15:25:30.568123 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8a5c4ceb45eb88fe057b951ed6674c8fc077cb22b8543936361d80dc7307d9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8a5c4ceb45eb88fe057b951ed6674c8fc077cb22b8543936361d80dc7307d9d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:25:09Z\\\",\\\"message\\\":\\\"nil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 15:25:09.597241 7038 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] 
Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 15:25:09.597415 7038 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 2025-08-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:25:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bl2w2_openshift-ovn-kubernetes(f27b5011-2d73-40e1-b508-a10e9c6f19a8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41f
fe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:30Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:30 crc kubenswrapper[4779]: I0320 15:25:30.580623 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d5cebc-7fcc-402f-8664-7fe33006d635\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca8bbdb2a7d1e8a987297e5b48e3a6abc5b3b7439b83a7dc6fd746f9c435adac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8aee02e2c57ee39da68fb337e36cca5deee8d9c73dcd2334a7bc91a81af4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d8aee02e2c57ee39da68fb337e36cca5deee8d9c73dcd2334a7bc91a81af4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:30Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:30 crc kubenswrapper[4779]: I0320 15:25:30.595461 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7de
e57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:30Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:30 crc kubenswrapper[4779]: I0320 15:25:30.609477 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:30Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:30 crc kubenswrapper[4779]: I0320 15:25:30.621339 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b19bd13-f5a9-4c41-9590-8160db6db8e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d5bd3d76eb962b45cb136b7d534542ed0119ce0fe39f12d8099a850fefdf012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8af565427c7f7c9ba5b8239eaa0c17a3ef8dd5ff718c932a40556ab6b7fafbb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:23:36Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 15:23:05.665464 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 15:23:05.667452 1 observer_polling.go:159] Starting file observer\\\\nI0320 15:23:05.695556 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 15:23:05.700879 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 15:23:36.087909 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:35Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1693a9d291973037577c1ddbdce083c7aa03b895f7260b45231b1563e1a97d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f15a7c62edad15bdc2846e3876a24078dc7d1560104e7d6c65204c04a4b2cb09\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7940cf334fe66cc687c69dfff6e9f646c2f0a58c2d6a2e0ca68f19f7c85acfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:30Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:30 crc kubenswrapper[4779]: I0320 15:25:30.636070 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:30Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:30 crc kubenswrapper[4779]: I0320 15:25:30.650026 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c643f492e62e7afcf5f6c5e0617ac37b341cd15574c56c3a930d6ad0fcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://8ae75ac6bc68bfaafe8430d20a29cb0b21e0a62ad4fe4784ad593c0d7bd91352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:30Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:30 crc kubenswrapper[4779]: I0320 15:25:30.663013 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:30Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:30 crc kubenswrapper[4779]: I0320 15:25:30.673455 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:30Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:30 crc kubenswrapper[4779]: I0320 15:25:30.687986 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e42abb09bc49e78a03681f36955523d117d8bcd3428abfc87bf4e858018c8dd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f
25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-20T15:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:30Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:30 crc kubenswrapper[4779]: I0320 15:25:30.808745 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:25:30 crc kubenswrapper[4779]: I0320 15:25:30.808763 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:25:30 crc kubenswrapper[4779]: E0320 15:25:30.809340 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:25:30 crc kubenswrapper[4779]: E0320 15:25:30.809552 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:25:31 crc kubenswrapper[4779]: I0320 15:25:31.450869 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lfj25_c30ee189-9db1-41af-8a55-29955cbf6712/kube-multus/0.log" Mar 20 15:25:31 crc kubenswrapper[4779]: I0320 15:25:31.450921 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lfj25" event={"ID":"c30ee189-9db1-41af-8a55-29955cbf6712","Type":"ContainerStarted","Data":"d5df3256fc301fb10f7bb94808c1ec30f56d47576b0e31daaf3bcd641773cbcc"} Mar 20 15:25:31 crc kubenswrapper[4779]: I0320 15:25:31.464293 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:31Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:31 crc kubenswrapper[4779]: I0320 15:25:31.477049 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:31Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:31 crc kubenswrapper[4779]: I0320 15:25:31.490405 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e42abb09bc49e78a03681f36955523d117d8bcd3428abfc87bf4e858018c8dd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f
25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-20T15:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:31Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:31 crc kubenswrapper[4779]: I0320 15:25:31.503462 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b19bd13-f5a9-4c41-9590-8160db6db8e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d5bd3d76eb962b45cb136b7d534542ed0119ce0fe39f12d8099a850fefdf012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8af565427c7f7c9ba5b8239eaa0c17a3ef8dd5ff718c932a40556ab6b7fafbb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:23:36Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 15:23:05.665464 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 15:23:05.667452 1 observer_polling.go:159] Starting file observer\\\\nI0320 15:23:05.695556 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 15:23:05.700879 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 15:23:36.087909 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:35Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1693a9d291973037577c1ddbdce083c7aa03b895f7260b45231b1563e1a97d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f15a7c62edad15bdc2846e3876a24078dc7d1560104e7d6c65204c04a4b2cb09\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7940cf334fe66cc687c69dfff6e9f646c2f0a58c2d6a2e0ca68f19f7c85acfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:31Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:31 crc kubenswrapper[4779]: I0320 15:25:31.516695 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:31Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:31 crc kubenswrapper[4779]: I0320 15:25:31.528959 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c643f492e62e7afcf5f6c5e0617ac37b341cd15574c56c3a930d6ad0fcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://8ae75ac6bc68bfaafe8430d20a29cb0b21e0a62ad4fe4784ad593c0d7bd91352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:31Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:31 crc kubenswrapper[4779]: I0320 15:25:31.541571 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:31Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:31 crc kubenswrapper[4779]: I0320 15:25:31.552513 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11cee1edeb3d8a182d0acdc690e3bc259fd87878273b36e3f188837c8b0c7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T15:25:31Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:31 crc kubenswrapper[4779]: I0320 15:25:31.567047 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47f7b00f7a63a5a5c8e3f202de52bb407000c1857930004b53c6402e24ed0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498dce321546a5d876986ec92f47d53655ef83f600c0498878bbe473c526f01f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:31Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:31 crc kubenswrapper[4779]: I0320 15:25:31.578049 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec6b43f8768a56cbe8007e92d991e7037f0084f32c4ad8f19b59bf682e7ac3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:31Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:31 crc kubenswrapper[4779]: I0320 15:25:31.593087 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:31Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:31 crc 
kubenswrapper[4779]: I0320 15:25:31.616306 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8a5c4ceb45eb88fe057b951ed6674c8fc077cb22b8543936361d80dc7307d9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8a5c4ceb45eb88fe057b951ed6674c8fc077cb22b8543936361d80dc7307d9d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:25:09Z\\\",\\\"message\\\":\\\"nil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 15:25:09.597241 7038 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] 
Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 15:25:09.597415 7038 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 2025-08-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:25:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bl2w2_openshift-ovn-kubernetes(f27b5011-2d73-40e1-b508-a10e9c6f19a8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41f
fe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:31Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:31 crc kubenswrapper[4779]: I0320 15:25:31.630095 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d5cebc-7fcc-402f-8664-7fe33006d635\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca8bbdb2a7d1e8a987297e5b48e3a6abc5b3b7439b83a7dc6fd746f9c435adac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8aee02e2c57ee39da68fb337e36cca5deee8d9c73dcd2334a7bc91a81af4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d8aee02e2c57ee39da68fb337e36cca5deee8d9c73dcd2334a7bc91a81af4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:31Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:31 crc kubenswrapper[4779]: I0320 15:25:31.641612 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"704e9aed-e179-4f17-a678-cbd7f420b9c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f952002920726e5f66027fd3a51286ade7cc455a744898b25dd876ddee262838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df476d42816b44acd553ca6e22a6791a58d43799e6ff1299bd22b5ac37a45372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://709e2b897f3f6b02b9677d2324ca04d81ecb5bda684f05f2b3c56c3a0213456f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa138829db36ca4c889841913dbdb4440c6efcdc78edf45ff1379bbbb9c34ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3aa138829db36ca4c889841913dbdb4440c6efcdc78edf45ff1379bbbb9c34ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:31Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:31 crc kubenswrapper[4779]: I0320 15:25:31.655265 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a91684c6a1a65d270c4c57f78c24f285f72f56b52857058a1726329853f3eb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e371
16ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:31Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:31 crc kubenswrapper[4779]: I0320 15:25:31.667481 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5df3256fc301fb10f7bb94808c1ec30f56d47576b0e31daaf3bcd641773cbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:25:29Z\\\",\\\"message\\\":\\\"2026-03-20T15:24:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_182598d0-e43b-4dfe-9fb4-f5b52b2212bb\\\\n2026-03-20T15:24:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_182598d0-e43b-4dfe-9fb4-f5b52b2212bb to /host/opt/cni/bin/\\\\n2026-03-20T15:24:44Z [verbose] multus-daemon started\\\\n2026-03-20T15:24:44Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T15:25:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:31Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:31 crc kubenswrapper[4779]: I0320 15:25:31.680521 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:31Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:31 crc kubenswrapper[4779]: I0320 15:25:31.690311 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7de
e57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:31Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:31 crc kubenswrapper[4779]: I0320 15:25:31.808528 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:25:31 crc kubenswrapper[4779]: I0320 15:25:31.808611 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:25:31 crc kubenswrapper[4779]: E0320 15:25:31.808652 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:25:31 crc kubenswrapper[4779]: E0320 15:25:31.808728 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:25:32 crc kubenswrapper[4779]: I0320 15:25:32.808685 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:25:32 crc kubenswrapper[4779]: E0320 15:25:32.808887 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:25:32 crc kubenswrapper[4779]: I0320 15:25:32.808691 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:25:32 crc kubenswrapper[4779]: E0320 15:25:32.809094 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:25:33 crc kubenswrapper[4779]: I0320 15:25:33.808439 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:25:33 crc kubenswrapper[4779]: I0320 15:25:33.808486 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:25:33 crc kubenswrapper[4779]: E0320 15:25:33.808579 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:25:33 crc kubenswrapper[4779]: E0320 15:25:33.808689 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:25:33 crc kubenswrapper[4779]: I0320 15:25:33.820877 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d5cebc-7fcc-402f-8664-7fe33006d635\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca8bbdb2a7d1e8a987297e5b48e3a6abc5b3b7439b83a7dc6fd746f9c435adac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8aee02e2c57ee39da68fb337e36cca5deee8d9c73dcd2334a7bc91a81af4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d8aee02e2c57ee39da68fb337e36cca5deee8d9c73dcd2334a7bc91a81af4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:33Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:33 crc kubenswrapper[4779]: I0320 15:25:33.833399 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"704e9aed-e179-4f17-a678-cbd7f420b9c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f952002920726e5f66027fd3a51286ade7cc455a744898b25dd876ddee262838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df476d42816b44acd553ca6e22a6791a58d43799e6ff1299bd22b5ac37a45372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://709e2b897f3f6b02b9677d2324ca04d81ecb5bda684f05f2b3c56c3a0213456f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa138829db36ca4c889841913dbdb4440c6efcdc78edf45ff1379bbbb9c34ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3aa138829db36ca4c889841913dbdb4440c6efcdc78edf45ff1379bbbb9c34ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:33Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:33 crc kubenswrapper[4779]: I0320 15:25:33.847531 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a91684c6a1a65d270c4c57f78c24f285f72f56b52857058a1726329853f3eb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e371
16ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:33Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:33 crc kubenswrapper[4779]: I0320 15:25:33.858268 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5df3256fc301fb10f7bb94808c1ec30f56d47576b0e31daaf3bcd641773cbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:25:29Z\\\",\\\"message\\\":\\\"2026-03-20T15:24:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_182598d0-e43b-4dfe-9fb4-f5b52b2212bb\\\\n2026-03-20T15:24:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_182598d0-e43b-4dfe-9fb4-f5b52b2212bb to /host/opt/cni/bin/\\\\n2026-03-20T15:24:44Z [verbose] multus-daemon started\\\\n2026-03-20T15:24:44Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T15:25:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:33Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:33 crc kubenswrapper[4779]: I0320 15:25:33.865994 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec6b43f8768a56cbe
8007e92d991e7037f0084f32c4ad8f19b59bf682e7ac3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:33Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:33 crc kubenswrapper[4779]: I0320 15:25:33.876019 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:33Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:33 crc 
kubenswrapper[4779]: I0320 15:25:33.890765 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8a5c4ceb45eb88fe057b951ed6674c8fc077cb22b8543936361d80dc7307d9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8a5c4ceb45eb88fe057b951ed6674c8fc077cb22b8543936361d80dc7307d9d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:25:09Z\\\",\\\"message\\\":\\\"nil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 15:25:09.597241 7038 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] 
Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 15:25:09.597415 7038 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 2025-08-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:25:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bl2w2_openshift-ovn-kubernetes(f27b5011-2d73-40e1-b508-a10e9c6f19a8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41f
fe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:33Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:33 crc kubenswrapper[4779]: I0320 15:25:33.901678 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:33Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:33 crc kubenswrapper[4779]: I0320 15:25:33.912414 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7de
e57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:33Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:33 crc kubenswrapper[4779]: I0320 15:25:33.923675 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b19bd13-f5a9-4c41-9590-8160db6db8e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d5bd3d76eb962b45cb136b7d534542ed0119ce0fe39f12d8099a850fefdf012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8af565427c7f7c9ba5b8239eaa0c17a3ef8dd5ff718c932a40556ab6b7fafbb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:23:36Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 15:23:05.665464 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 15:23:05.667452 1 observer_polling.go:159] Starting file observer\\\\nI0320 15:23:05.695556 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 15:23:05.700879 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 15:23:36.087909 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:35Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1693a9d291973037577c1ddbdce083c7aa03b895f7260b45231b1563e1a97d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f15a7c62edad15bdc2846e3876a24078dc7d1560104e7d6c65204c04a4b2cb09\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7940cf334fe66cc687c69dfff6e9f646c2f0a58c2d6a2e0ca68f19f7c85acfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:33Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:33 crc kubenswrapper[4779]: I0320 15:25:33.935808 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:33Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:33 crc kubenswrapper[4779]: E0320 15:25:33.936273 4779 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 15:25:33 crc kubenswrapper[4779]: I0320 15:25:33.947344 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c643f492e62e7afcf5f6c5e0617ac37b341cd15574c56c3a930d6ad0fcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\
\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae75ac6bc68bfaafe8430d20a29cb0b21e0a62ad4fe4784ad593c0d7bd91352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:33Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:33 crc kubenswrapper[4779]: I0320 15:25:33.957762 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:33Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:33 crc kubenswrapper[4779]: I0320 15:25:33.966893 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:33Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:33 crc kubenswrapper[4779]: I0320 15:25:33.979953 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e42abb09bc49e78a03681f36955523d117d8bcd3428abfc87bf4e858018c8dd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f
25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-20T15:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:33Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:33 crc kubenswrapper[4779]: I0320 15:25:33.990588 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:33Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:34 crc kubenswrapper[4779]: I0320 15:25:34.000380 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11cee1edeb3d8a182d0acdc690e3bc259fd87878273b36e3f188837c8b0c7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T15:25:33Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:34 crc kubenswrapper[4779]: I0320 15:25:34.008671 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47f7b00f7a63a5a5c8e3f202de52bb407000c1857930004b53c6402e24ed0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498dce321546a5d876986ec92f47d53655ef83f600c0498878bbe473c526f01f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:34Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:34 crc kubenswrapper[4779]: I0320 15:25:34.808233 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:25:34 crc kubenswrapper[4779]: I0320 15:25:34.808225 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:25:34 crc kubenswrapper[4779]: E0320 15:25:34.808414 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:25:34 crc kubenswrapper[4779]: E0320 15:25:34.808466 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:25:34 crc kubenswrapper[4779]: I0320 15:25:34.822033 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 20 15:25:35 crc kubenswrapper[4779]: I0320 15:25:35.382145 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:35 crc kubenswrapper[4779]: I0320 15:25:35.382232 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:35 crc kubenswrapper[4779]: I0320 15:25:35.382255 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:35 crc kubenswrapper[4779]: I0320 15:25:35.382341 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:35 crc kubenswrapper[4779]: I0320 15:25:35.382368 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:35Z","lastTransitionTime":"2026-03-20T15:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:35 crc kubenswrapper[4779]: E0320 15:25:35.397669 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:35Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:35 crc kubenswrapper[4779]: I0320 15:25:35.402287 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:35 crc kubenswrapper[4779]: I0320 15:25:35.402427 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:35 crc kubenswrapper[4779]: I0320 15:25:35.402453 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:35 crc kubenswrapper[4779]: I0320 15:25:35.402493 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:35 crc kubenswrapper[4779]: I0320 15:25:35.402522 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:35Z","lastTransitionTime":"2026-03-20T15:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:35 crc kubenswrapper[4779]: E0320 15:25:35.418671 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:35Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:35 crc kubenswrapper[4779]: I0320 15:25:35.423289 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:35 crc kubenswrapper[4779]: I0320 15:25:35.423355 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:35 crc kubenswrapper[4779]: I0320 15:25:35.423369 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:35 crc kubenswrapper[4779]: I0320 15:25:35.423389 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:35 crc kubenswrapper[4779]: I0320 15:25:35.423401 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:35Z","lastTransitionTime":"2026-03-20T15:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:35 crc kubenswrapper[4779]: E0320 15:25:35.476177 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:35Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:35 crc kubenswrapper[4779]: E0320 15:25:35.476296 4779 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 15:25:35 crc kubenswrapper[4779]: I0320 15:25:35.807855 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:25:35 crc kubenswrapper[4779]: I0320 15:25:35.808034 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:25:35 crc kubenswrapper[4779]: E0320 15:25:35.808068 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:25:35 crc kubenswrapper[4779]: E0320 15:25:35.808385 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:25:36 crc kubenswrapper[4779]: I0320 15:25:36.808824 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:25:36 crc kubenswrapper[4779]: I0320 15:25:36.808930 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:25:36 crc kubenswrapper[4779]: E0320 15:25:36.809024 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:25:36 crc kubenswrapper[4779]: E0320 15:25:36.809141 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:25:37 crc kubenswrapper[4779]: I0320 15:25:37.809009 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:25:37 crc kubenswrapper[4779]: E0320 15:25:37.809186 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:25:37 crc kubenswrapper[4779]: I0320 15:25:37.810777 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:25:37 crc kubenswrapper[4779]: E0320 15:25:37.811280 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:25:38 crc kubenswrapper[4779]: I0320 15:25:38.808015 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:25:38 crc kubenswrapper[4779]: I0320 15:25:38.808160 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:25:38 crc kubenswrapper[4779]: E0320 15:25:38.808163 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:25:38 crc kubenswrapper[4779]: E0320 15:25:38.808259 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:25:38 crc kubenswrapper[4779]: I0320 15:25:38.808841 4779 scope.go:117] "RemoveContainer" containerID="d8a5c4ceb45eb88fe057b951ed6674c8fc077cb22b8543936361d80dc7307d9d" Mar 20 15:25:38 crc kubenswrapper[4779]: E0320 15:25:38.937536 4779 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 15:25:39 crc kubenswrapper[4779]: I0320 15:25:39.475296 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2w2_f27b5011-2d73-40e1-b508-a10e9c6f19a8/ovnkube-controller/2.log" Mar 20 15:25:39 crc kubenswrapper[4779]: I0320 15:25:39.478047 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" event={"ID":"f27b5011-2d73-40e1-b508-a10e9c6f19a8","Type":"ContainerStarted","Data":"fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad"} Mar 20 15:25:39 crc kubenswrapper[4779]: I0320 15:25:39.478750 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:25:39 crc kubenswrapper[4779]: I0320 15:25:39.491875 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:39Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:39 crc kubenswrapper[4779]: I0320 15:25:39.501216 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11cee1edeb3d8a182d0acdc690e3bc259fd87878273b36e3f188837c8b0c7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T15:25:39Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:39 crc kubenswrapper[4779]: I0320 15:25:39.516356 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47f7b00f7a63a5a5c8e3f202de52bb407000c1857930004b53c6402e24ed0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498dce321546a5d876986ec92f47d53655ef83f600c0498878bbe473c526f01f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:39Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:39 crc kubenswrapper[4779]: I0320 15:25:39.542442 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8a5c4ceb45eb88fe057b951ed6674c8fc077cb22b8543936361d80dc7307d9d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:25:09Z\\\",\\\"message\\\":\\\"nil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 15:25:09.597241 7038 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] 
Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 15:25:09.597415 7038 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 
2025-08-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:25:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:25:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"n
ame\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:39Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:39 crc kubenswrapper[4779]: I0320 15:25:39.555733 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d5cebc-7fcc-402f-8664-7fe33006d635\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca8bbdb2a7d1e8a987297e5b48e3a6abc5b3b7439b83a7dc6fd746f9c435adac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8aee02e2c57ee39da68fb337e36cca5deee8d9c73dcd2334a7bc91a81af4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d8aee02e2c57ee39da68fb337e36cca5deee8d9c73dcd2334a7bc91a81af4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:39Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:39 crc kubenswrapper[4779]: I0320 15:25:39.568770 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"704e9aed-e179-4f17-a678-cbd7f420b9c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f952002920726e5f66027fd3a51286ade7cc455a744898b25dd876ddee262838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df476d42816b44acd553ca6e22a6791a58d43799e6ff1299bd22b5ac37a45372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://709e2b897f3f6b02b9677d2324ca04d81ecb5bda684f05f2b3c56c3a0213456f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa138829db36ca4c889841913dbdb4440c6efcdc78edf45ff1379bbbb9c34ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3aa138829db36ca4c889841913dbdb4440c6efcdc78edf45ff1379bbbb9c34ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:39Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:39 crc kubenswrapper[4779]: I0320 15:25:39.581506 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a91684c6a1a65d270c4c57f78c24f285f72f56b52857058a1726329853f3eb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e371
16ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:39Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:39 crc kubenswrapper[4779]: I0320 15:25:39.594306 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5df3256fc301fb10f7bb94808c1ec30f56d47576b0e31daaf3bcd641773cbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:25:29Z\\\",\\\"message\\\":\\\"2026-03-20T15:24:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_182598d0-e43b-4dfe-9fb4-f5b52b2212bb\\\\n2026-03-20T15:24:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_182598d0-e43b-4dfe-9fb4-f5b52b2212bb to /host/opt/cni/bin/\\\\n2026-03-20T15:24:44Z [verbose] multus-daemon started\\\\n2026-03-20T15:24:44Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T15:25:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:39Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:39 crc kubenswrapper[4779]: I0320 15:25:39.605416 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec6b43f8768a56cbe
8007e92d991e7037f0084f32c4ad8f19b59bf682e7ac3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:39Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:39 crc kubenswrapper[4779]: I0320 15:25:39.614135 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:39Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:39 crc 
kubenswrapper[4779]: I0320 15:25:39.627682 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:39Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:39 crc kubenswrapper[4779]: I0320 15:25:39.638180 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7de
e57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:39Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:39 crc kubenswrapper[4779]: I0320 15:25:39.649612 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e42abb09bc49e78a03681f36955523d117d8bcd3428abfc87bf4e858018c8dd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb5a
2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:45Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:39Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:39 crc kubenswrapper[4779]: I0320 15:25:39.665549 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01b09be4-9da4-4003-9aea-0804f2e4f121\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc1e37affb67c7652f578e79f104773967044884346ea2159f843bc27b53a324\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a7bca00cbf9f2a1abbbcb015536c61a7fe6f26b37d289e182df0b2dd58d72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24af932f21b1dbc1a2d96edd5cb5256eb0b4ddb672a001581f519bbb71b3f528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9392d565f3b41698f
93b7dab52770b90590d7a010c0f29a89c91bcdc16131803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee9526f89756d5aadafe6ce6c208353f3a24aa7126fd4e94c7154b5dee0b14f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7766091cc175f7d15265aa50cf436f2be24e8e6f06e3fe602c5ab467c99bb62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7766091cc175f7d15265aa50cf436f2be24e8e6f06e3fe602c5ab467c99bb62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ffc474825ae3e6a18f565b6ff9bd95366a30163b869ecf58ea487a94f751ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ffc474825ae3e6a18f565b6ff9bd95366a30163b869ecf58ea487a94f751ae4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a3252598f32260f11c73fe5379659b6c73447da41aa263c09e6098149e704ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a3252598f32260f11c73fe5379659b6c73447da41aa263c09e6098149e704ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:39Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:39 crc kubenswrapper[4779]: I0320 15:25:39.677928 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b19bd13-f5a9-4c41-9590-8160db6db8e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d5bd3d76eb962b45cb136b7d534542ed0119ce0fe39f12d8099a850fefdf012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8af565427c7f7c9ba5b8239eaa0c17a3ef8dd5ff718c932a40556ab6b7fafbb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:23:36Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 15:23:05.665464 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 15:23:05.667452 1 observer_polling.go:159] Starting file observer\\\\nI0320 15:23:05.695556 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 15:23:05.700879 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 15:23:36.087909 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:35Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1693a9d291973037577c1ddbdce083c7aa03b895f7260b45231b1563e1a97d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f15a7c62edad15bdc2846e3876a24078dc7d1560104e7d6c65204c04a4b2cb09\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7940cf334fe66cc687c69dfff6e9f646c2f0a58c2d6a2e0ca68f19f7c85acfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:39Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:39 crc kubenswrapper[4779]: I0320 15:25:39.690300 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:39Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:39 crc kubenswrapper[4779]: I0320 15:25:39.702340 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c643f492e62e7afcf5f6c5e0617ac37b341cd15574c56c3a930d6ad0fcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://8ae75ac6bc68bfaafe8430d20a29cb0b21e0a62ad4fe4784ad593c0d7bd91352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:39Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:39 crc kubenswrapper[4779]: I0320 15:25:39.713008 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:39Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:39 crc kubenswrapper[4779]: I0320 15:25:39.721999 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:39Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:39 crc kubenswrapper[4779]: I0320 15:25:39.808856 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:25:39 crc kubenswrapper[4779]: I0320 15:25:39.808916 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:25:39 crc kubenswrapper[4779]: E0320 15:25:39.808986 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:25:39 crc kubenswrapper[4779]: E0320 15:25:39.809129 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:25:40 crc kubenswrapper[4779]: I0320 15:25:40.482733 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2w2_f27b5011-2d73-40e1-b508-a10e9c6f19a8/ovnkube-controller/3.log" Mar 20 15:25:40 crc kubenswrapper[4779]: I0320 15:25:40.483376 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2w2_f27b5011-2d73-40e1-b508-a10e9c6f19a8/ovnkube-controller/2.log" Mar 20 15:25:40 crc kubenswrapper[4779]: I0320 15:25:40.485729 4779 generic.go:334] "Generic (PLEG): container finished" podID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerID="fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad" exitCode=1 Mar 20 15:25:40 crc kubenswrapper[4779]: I0320 15:25:40.485777 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" event={"ID":"f27b5011-2d73-40e1-b508-a10e9c6f19a8","Type":"ContainerDied","Data":"fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad"} Mar 20 15:25:40 crc kubenswrapper[4779]: I0320 15:25:40.485853 4779 scope.go:117] "RemoveContainer" containerID="d8a5c4ceb45eb88fe057b951ed6674c8fc077cb22b8543936361d80dc7307d9d" Mar 20 15:25:40 crc kubenswrapper[4779]: I0320 15:25:40.486413 4779 scope.go:117] "RemoveContainer" containerID="fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad" Mar 20 15:25:40 crc kubenswrapper[4779]: E0320 15:25:40.486594 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bl2w2_openshift-ovn-kubernetes(f27b5011-2d73-40e1-b508-a10e9c6f19a8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" Mar 20 15:25:40 crc 
kubenswrapper[4779]: I0320 15:25:40.502203 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:40 crc kubenswrapper[4779]: I0320 15:25:40.513856 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11cee1edeb3d8a182d0acdc690e3bc259fd87878273b36e3f188837c8b0c7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T15:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:40 crc kubenswrapper[4779]: I0320 15:25:40.522968 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47f7b00f7a63a5a5c8e3f202de52bb407000c1857930004b53c6402e24ed0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498dce321546a5d876986ec92f47d53655ef83f600c0498878bbe473c526f01f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:40 crc kubenswrapper[4779]: I0320 15:25:40.531704 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d5cebc-7fcc-402f-8664-7fe33006d635\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca8bbdb2a7d1e8a987297e5b48e3a6abc5b3b7439b83a7dc6fd746f9c435adac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8aee02e2c57ee39da68fb337e36cca5deee8d9c73dcd2334a7bc91a81af4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d8aee02e2c57ee39da68fb337e36cca5deee8d9c73dcd2334a7bc91a81af4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:40 crc kubenswrapper[4779]: I0320 15:25:40.542002 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"704e9aed-e179-4f17-a678-cbd7f420b9c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f952002920726e5f66027fd3a51286ade7cc455a744898b25dd876ddee262838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df476d42816b44acd553ca6e22a6791a58d43799e6ff1299bd22b5ac37a45372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://709e2b897f3f6b02b9677d2324ca04d81ecb5bda684f05f2b3c56c3a0213456f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa138829db36ca4c889841913dbdb4440c6efcdc78edf45ff1379bbbb9c34ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3aa138829db36ca4c889841913dbdb4440c6efcdc78edf45ff1379bbbb9c34ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:40 crc kubenswrapper[4779]: I0320 15:25:40.553386 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a91684c6a1a65d270c4c57f78c24f285f72f56b52857058a1726329853f3eb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e371
16ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:40 crc kubenswrapper[4779]: I0320 15:25:40.566038 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5df3256fc301fb10f7bb94808c1ec30f56d47576b0e31daaf3bcd641773cbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:25:29Z\\\",\\\"message\\\":\\\"2026-03-20T15:24:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_182598d0-e43b-4dfe-9fb4-f5b52b2212bb\\\\n2026-03-20T15:24:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_182598d0-e43b-4dfe-9fb4-f5b52b2212bb to /host/opt/cni/bin/\\\\n2026-03-20T15:24:44Z [verbose] multus-daemon started\\\\n2026-03-20T15:24:44Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T15:25:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:40 crc kubenswrapper[4779]: I0320 15:25:40.574892 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec6b43f8768a56cbe
8007e92d991e7037f0084f32c4ad8f19b59bf682e7ac3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:40 crc kubenswrapper[4779]: I0320 15:25:40.583155 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:40 crc 
kubenswrapper[4779]: I0320 15:25:40.598345 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8a5c4ceb45eb88fe057b951ed6674c8fc077cb22b8543936361d80dc7307d9d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:25:09Z\\\",\\\"message\\\":\\\"nil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 15:25:09.597241 7038 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] 
Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 15:25:09.597415 7038 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:09Z is after 2025-08-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:25:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:25:39Z\\\",\\\"message\\\":\\\"node crc\\\\nI0320 15:25:39.674068 7380 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-bl2w2 after 0 failed attempt(s)\\\\nI0320 15:25:39.674074 7380 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-bl2w2\\\\nI0320 15:25:39.673052 7380 ovnkube.go:599] Stopped ovnkube\\\\nI0320 15:25:39.674099 7380 metrics.go:553] 
Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 15:25:39.674134 7380 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/package-server-manager-metrics]} name:Service_openshift-operator-lifecycle-manager/package-server-manager-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 15:25:39.674193 7380 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:25:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3
874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:40 crc kubenswrapper[4779]: I0320 15:25:40.608724 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:40 crc kubenswrapper[4779]: I0320 15:25:40.618318 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7de
e57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:40 crc kubenswrapper[4779]: I0320 15:25:40.633813 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01b09be4-9da4-4003-9aea-0804f2e4f121\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc1e37affb67c7652f578e79f104773967044884346ea2159f843bc27b53a324\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a7bca00cbf9f2a1abbbcb015536c61a7fe6f26b37d289e182df0b2dd58d72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24af932f21b1dbc1a2d96edd5cb5256eb0b4ddb672a001581f519bbb71b3f528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9392d565f3b41698f93b7dab52770b90590d7a010c0f29a89c91bcdc16131803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee9526f89756d5aadafe6ce6c208353f3a24aa7126fd4e94c7154b5dee0b14f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7766091cc175f7d15265aa50cf436f2be24e8e6f06e3fe602c5ab467c99bb62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7766091cc175f7d15265aa50cf436f2be24e8e6f06e3fe602c5ab467c99bb62\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ffc474825ae3e6a18f565b6ff9bd95366a30163b869ecf58ea487a94f751ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ffc474825ae3e6a18f565b6ff9bd95366a30163b869ecf58ea487a94f751ae4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a3252598f32260f11c73fe5379659b6c73447da41aa263c09e6098149e704ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a3252598f32260f11c73fe5379659b6c73447da41aa263c09e6098149e704ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:40 crc kubenswrapper[4779]: I0320 15:25:40.644337 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b19bd13-f5a9-4c41-9590-8160db6db8e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d5bd3d76eb962b45cb136b7d534542ed0119ce0fe39f12d8099a850fefdf012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8af565427c7f7c9ba5b8239eaa0c17a3ef8dd5ff718c932a40556ab6b7fafbb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:23:36Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 15:23:05.665464 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0320 15:23:05.667452 1 observer_polling.go:159] Starting file observer\\\\nI0320 15:23:05.695556 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 15:23:05.700879 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 15:23:36.087909 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:35Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1693a9d291973037577c1ddbdce083c7aa03b895f7260b45231b1563e1a97d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f15a7c62edad15bdc2846e3876a24078dc7d1560104e7d6c65204c04a4b2cb09\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7940cf334fe66cc687c69dfff6e9f646c2f0a58c2d6a2e0ca68f19f7c85acfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:40 crc kubenswrapper[4779]: I0320 15:25:40.653614 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"netwo
rk-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:40 crc kubenswrapper[4779]: I0320 15:25:40.663869 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c643f492e62e7afcf5f6c5e0617ac37b341cd15574c56c3a930d6ad0fcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae75ac6bc68bfaafe8430d20a29cb0b21e0a62ad4fe4784ad593c0d7bd91352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:40 crc kubenswrapper[4779]: I0320 15:25:40.672577 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:40 crc kubenswrapper[4779]: I0320 15:25:40.680161 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:40 crc kubenswrapper[4779]: I0320 15:25:40.691627 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e42abb09bc49e78a03681f36955523d117d8bcd3428abfc87bf4e858018c8dd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f
25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-20T15:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:40 crc kubenswrapper[4779]: I0320 15:25:40.808530 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:25:40 crc kubenswrapper[4779]: I0320 15:25:40.808588 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:25:40 crc kubenswrapper[4779]: E0320 15:25:40.808670 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:25:40 crc kubenswrapper[4779]: E0320 15:25:40.808729 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:25:41 crc kubenswrapper[4779]: I0320 15:25:41.490567 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2w2_f27b5011-2d73-40e1-b508-a10e9c6f19a8/ovnkube-controller/3.log" Mar 20 15:25:41 crc kubenswrapper[4779]: I0320 15:25:41.493546 4779 scope.go:117] "RemoveContainer" containerID="fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad" Mar 20 15:25:41 crc kubenswrapper[4779]: E0320 15:25:41.493707 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bl2w2_openshift-ovn-kubernetes(f27b5011-2d73-40e1-b508-a10e9c6f19a8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" Mar 20 15:25:41 crc kubenswrapper[4779]: I0320 15:25:41.503052 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d5cebc-7fcc-402f-8664-7fe33006d635\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca8bbdb2a7d1e8a987297e5b48e3a6abc5b3b7439b83a7dc6fd746f9c435adac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8aee02e2c57ee39da68fb337e36cca5deee8d9c73dcd2334a7bc91a81af4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d8aee02e2c57ee39da68fb337e36cca5deee8d9c73dcd2334a7bc91a81af4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:41 crc kubenswrapper[4779]: I0320 15:25:41.513913 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"704e9aed-e179-4f17-a678-cbd7f420b9c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f952002920726e5f66027fd3a51286ade7cc455a744898b25dd876ddee262838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df476d42816b44acd553ca6e22a6791a58d43799e6ff1299bd22b5ac37a45372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://709e2b897f3f6b02b9677d2324ca04d81ecb5bda684f05f2b3c56c3a0213456f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa138829db36ca4c889841913dbdb4440c6efcdc78edf45ff1379bbbb9c34ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3aa138829db36ca4c889841913dbdb4440c6efcdc78edf45ff1379bbbb9c34ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:41 crc kubenswrapper[4779]: I0320 15:25:41.527856 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a91684c6a1a65d270c4c57f78c24f285f72f56b52857058a1726329853f3eb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e371
16ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:41 crc kubenswrapper[4779]: I0320 15:25:41.540153 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5df3256fc301fb10f7bb94808c1ec30f56d47576b0e31daaf3bcd641773cbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:25:29Z\\\",\\\"message\\\":\\\"2026-03-20T15:24:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_182598d0-e43b-4dfe-9fb4-f5b52b2212bb\\\\n2026-03-20T15:24:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_182598d0-e43b-4dfe-9fb4-f5b52b2212bb to /host/opt/cni/bin/\\\\n2026-03-20T15:24:44Z [verbose] multus-daemon started\\\\n2026-03-20T15:24:44Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T15:25:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:41 crc kubenswrapper[4779]: I0320 15:25:41.550812 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec6b43f8768a56cbe
8007e92d991e7037f0084f32c4ad8f19b59bf682e7ac3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:41 crc kubenswrapper[4779]: I0320 15:25:41.560795 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:41 crc 
kubenswrapper[4779]: I0320 15:25:41.579902 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:25:39Z\\\",\\\"message\\\":\\\"node crc\\\\nI0320 15:25:39.674068 7380 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-bl2w2 after 0 failed attempt(s)\\\\nI0320 15:25:39.674074 7380 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-bl2w2\\\\nI0320 15:25:39.673052 7380 ovnkube.go:599] 
Stopped ovnkube\\\\nI0320 15:25:39.674099 7380 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 15:25:39.674134 7380 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/package-server-manager-metrics]} name:Service_openshift-operator-lifecycle-manager/package-server-manager-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 15:25:39.674193 7380 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:25:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bl2w2_openshift-ovn-kubernetes(f27b5011-2d73-40e1-b508-a10e9c6f19a8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41f
fe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:41 crc kubenswrapper[4779]: I0320 15:25:41.591955 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:41 crc kubenswrapper[4779]: I0320 15:25:41.601592 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7de
e57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:41 crc kubenswrapper[4779]: I0320 15:25:41.618489 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01b09be4-9da4-4003-9aea-0804f2e4f121\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc1e37affb67c7652f578e79f104773967044884346ea2159f843bc27b53a324\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a7bca00cbf9f2a1abbbcb015536c61a7fe6f26b37d289e182df0b2dd58d72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24af932f21b1dbc1a2d96edd5cb5256eb0b4ddb672a001581f519bbb71b3f528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9392d565f3b41698f93b7dab52770b90590d7a010c0f29a89c91bcdc16131803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee9526f89756d5aadafe6ce6c208353f3a24aa7126fd4e94c7154b5dee0b14f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7766091cc175f7d15265aa50cf436f2be24e8e6f06e3fe602c5ab467c99bb62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7766091cc175f7d15265aa50cf436f2be24e8e6f06e3fe602c5ab467c99bb62\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ffc474825ae3e6a18f565b6ff9bd95366a30163b869ecf58ea487a94f751ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ffc474825ae3e6a18f565b6ff9bd95366a30163b869ecf58ea487a94f751ae4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a3252598f32260f11c73fe5379659b6c73447da41aa263c09e6098149e704ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a3252598f32260f11c73fe5379659b6c73447da41aa263c09e6098149e704ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:41 crc kubenswrapper[4779]: I0320 15:25:41.630017 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b19bd13-f5a9-4c41-9590-8160db6db8e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d5bd3d76eb962b45cb136b7d534542ed0119ce0fe39f12d8099a850fefdf012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8af565427c7f7c9ba5b8239eaa0c17a3ef8dd5ff718c932a40556ab6b7fafbb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:23:36Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 15:23:05.665464 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0320 15:23:05.667452 1 observer_polling.go:159] Starting file observer\\\\nI0320 15:23:05.695556 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 15:23:05.700879 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 15:23:36.087909 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:35Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1693a9d291973037577c1ddbdce083c7aa03b895f7260b45231b1563e1a97d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f15a7c62edad15bdc2846e3876a24078dc7d1560104e7d6c65204c04a4b2cb09\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7940cf334fe66cc687c69dfff6e9f646c2f0a58c2d6a2e0ca68f19f7c85acfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:41 crc kubenswrapper[4779]: I0320 15:25:41.640646 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"netwo
rk-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:41 crc kubenswrapper[4779]: I0320 15:25:41.651083 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c643f492e62e7afcf5f6c5e0617ac37b341cd15574c56c3a930d6ad0fcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae75ac6bc68bfaafe8430d20a29cb0b21e0a62ad4fe4784ad593c0d7bd91352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:41 crc kubenswrapper[4779]: I0320 15:25:41.663095 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:41 crc kubenswrapper[4779]: I0320 15:25:41.671523 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:41 crc kubenswrapper[4779]: I0320 15:25:41.685758 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e42abb09bc49e78a03681f36955523d117d8bcd3428abfc87bf4e858018c8dd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f
25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-20T15:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:41 crc kubenswrapper[4779]: I0320 15:25:41.697919 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:41 crc kubenswrapper[4779]: I0320 15:25:41.708205 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11cee1edeb3d8a182d0acdc690e3bc259fd87878273b36e3f188837c8b0c7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T15:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:41 crc kubenswrapper[4779]: I0320 15:25:41.724956 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47f7b00f7a63a5a5c8e3f202de52bb407000c1857930004b53c6402e24ed0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498dce321546a5d876986ec92f47d53655ef83f600c0498878bbe473c526f01f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:41 crc kubenswrapper[4779]: I0320 15:25:41.808205 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:25:41 crc kubenswrapper[4779]: I0320 15:25:41.808301 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:25:41 crc kubenswrapper[4779]: E0320 15:25:41.808349 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:25:41 crc kubenswrapper[4779]: E0320 15:25:41.808628 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:25:42 crc kubenswrapper[4779]: I0320 15:25:42.808426 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:25:42 crc kubenswrapper[4779]: E0320 15:25:42.808601 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:25:42 crc kubenswrapper[4779]: I0320 15:25:42.808441 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:25:42 crc kubenswrapper[4779]: E0320 15:25:42.809066 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:25:43 crc kubenswrapper[4779]: I0320 15:25:43.807849 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:25:43 crc kubenswrapper[4779]: E0320 15:25:43.807985 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:25:43 crc kubenswrapper[4779]: I0320 15:25:43.808028 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:25:43 crc kubenswrapper[4779]: E0320 15:25:43.808074 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:25:43 crc kubenswrapper[4779]: I0320 15:25:43.830476 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01b09be4-9da4-4003-9aea-0804f2e4f121\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc1e37affb67c7652f578e79f104773967044884346ea2159f843bc27b53a324\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a7bca00cbf9f2a1abbbcb015536c61a7fe6f26b37d289e182df0b2dd58d72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24af932f21b1dbc1a2d96edd5cb5256eb0b4ddb672a001581f519bbb71b3f528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9392d565f3b41698f93b7dab52770b90590d7a010c0f29a89c91bcdc16131803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019b
ee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee9526f89756d5aadafe6ce6c208353f3a24aa7126fd4e94c7154b5dee0b14f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7766091cc175f7d15265aa50cf436f2be24e8e6f06e3fe602c5ab467c99bb62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb
68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7766091cc175f7d15265aa50cf436f2be24e8e6f06e3fe602c5ab467c99bb62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ffc474825ae3e6a18f565b6ff9bd95366a30163b869ecf58ea487a94f751ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ffc474825ae3e6a18f565b6ff9bd95366a30163b869ecf58ea487a94f751ae4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a3252598f32260f11c73fe5379659b6c73447da41aa263c09e6098149e704ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a3252598f32260f11c73fe5379659b6c73447da41aa263c09e6098149e704ad\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:43 crc kubenswrapper[4779]: I0320 15:25:43.841411 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b19bd13-f5a9-4c41-9590-8160db6db8e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d5bd3d76eb962b45cb136b7d534542ed0119ce0fe39f12d8099a850fefdf012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8af565427c7f7c9ba5b8239eaa0c17a3ef8dd5ff718c932a40556ab6b7fafbb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:23:36Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 15:23:05.665464 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 15:23:05.667452 1 observer_polling.go:159] Starting file observer\\\\nI0320 15:23:05.695556 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 15:23:05.700879 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 15:23:36.087909 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:35Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1693a9d291973037577c1ddbdce083c7aa03b895f7260b45231b1563e1a97d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f15a7c62edad15bdc2846e3876a24078dc7d1560104e7d6c65204c04a4b2cb09\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7940cf334fe66cc687c69dfff6e9f646c2f0a58c2d6a2e0ca68f19f7c85acfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:43 crc kubenswrapper[4779]: I0320 15:25:43.852181 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:43 crc kubenswrapper[4779]: I0320 15:25:43.862773 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c643f492e62e7afcf5f6c5e0617ac37b341cd15574c56c3a930d6ad0fcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://8ae75ac6bc68bfaafe8430d20a29cb0b21e0a62ad4fe4784ad593c0d7bd91352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:43 crc kubenswrapper[4779]: I0320 15:25:43.872917 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:43 crc kubenswrapper[4779]: I0320 15:25:43.880957 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:43 crc kubenswrapper[4779]: I0320 15:25:43.891868 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e42abb09bc49e78a03681f36955523d117d8bcd3428abfc87bf4e858018c8dd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f
25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-20T15:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:43 crc kubenswrapper[4779]: I0320 15:25:43.901207 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:43 crc kubenswrapper[4779]: I0320 15:25:43.910333 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11cee1edeb3d8a182d0acdc690e3bc259fd87878273b36e3f188837c8b0c7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T15:25:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:43 crc kubenswrapper[4779]: I0320 15:25:43.919814 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47f7b00f7a63a5a5c8e3f202de52bb407000c1857930004b53c6402e24ed0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498dce321546a5d876986ec92f47d53655ef83f600c0498878bbe473c526f01f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:43 crc kubenswrapper[4779]: I0320 15:25:43.928175 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d5cebc-7fcc-402f-8664-7fe33006d635\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca8bbdb2a7d1e8a987297e5b48e3a6abc5b3b7439b83a7dc6fd746f9c435adac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8aee02e2c57ee39da68fb337e36cca5deee8d9c73dcd2334a7bc91a81af4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d8aee02e2c57ee39da68fb337e36cca5deee8d9c73dcd2334a7bc91a81af4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:43 crc kubenswrapper[4779]: E0320 15:25:43.938224 4779 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 15:25:43 crc kubenswrapper[4779]: I0320 15:25:43.939260 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"704e9aed-e179-4f17-a678-cbd7f420b9c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f952002920726e5f66027fd3a51286ade7cc455a744898b25dd876ddee262838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df476d42816b44acd553ca6e22a6791a58d43799e6ff1299bd22b5ac37a45372\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://709e2b897f3f6b02b9677d2324ca04d81ecb5bda684f05f2b3c56c3a0213456f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa138829db36ca4c889841913dbdb4440c6efcdc78edf45ff1379bbbb9c34ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720
243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa138829db36ca4c889841913dbdb4440c6efcdc78edf45ff1379bbbb9c34ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:43 crc kubenswrapper[4779]: I0320 15:25:43.951418 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a91684c6a1a65d270c4c57f78c24f285f72f56b52857058a1726329853f3eb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e371
16ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:43 crc kubenswrapper[4779]: I0320 15:25:43.962527 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5df3256fc301fb10f7bb94808c1ec30f56d47576b0e31daaf3bcd641773cbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:25:29Z\\\",\\\"message\\\":\\\"2026-03-20T15:24:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_182598d0-e43b-4dfe-9fb4-f5b52b2212bb\\\\n2026-03-20T15:24:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_182598d0-e43b-4dfe-9fb4-f5b52b2212bb to /host/opt/cni/bin/\\\\n2026-03-20T15:24:44Z [verbose] multus-daemon started\\\\n2026-03-20T15:24:44Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T15:25:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:43 crc kubenswrapper[4779]: I0320 15:25:43.972665 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec6b43f8768a56cbe
8007e92d991e7037f0084f32c4ad8f19b59bf682e7ac3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:43 crc kubenswrapper[4779]: I0320 15:25:43.980487 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:43 crc 
kubenswrapper[4779]: I0320 15:25:43.997068 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:25:39Z\\\",\\\"message\\\":\\\"node crc\\\\nI0320 15:25:39.674068 7380 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-bl2w2 after 0 failed attempt(s)\\\\nI0320 15:25:39.674074 7380 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-bl2w2\\\\nI0320 15:25:39.673052 7380 ovnkube.go:599] 
Stopped ovnkube\\\\nI0320 15:25:39.674099 7380 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 15:25:39.674134 7380 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/package-server-manager-metrics]} name:Service_openshift-operator-lifecycle-manager/package-server-manager-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 15:25:39.674193 7380 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:25:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bl2w2_openshift-ovn-kubernetes(f27b5011-2d73-40e1-b508-a10e9c6f19a8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41f
fe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:43Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:44 crc kubenswrapper[4779]: I0320 15:25:44.012228 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:44Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:44 crc kubenswrapper[4779]: I0320 15:25:44.027668 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7de
e57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:44Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:44 crc kubenswrapper[4779]: I0320 15:25:44.807961 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:25:44 crc kubenswrapper[4779]: I0320 15:25:44.808009 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:25:44 crc kubenswrapper[4779]: E0320 15:25:44.808279 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:25:44 crc kubenswrapper[4779]: E0320 15:25:44.808361 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:25:45 crc kubenswrapper[4779]: I0320 15:25:45.647026 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:45 crc kubenswrapper[4779]: I0320 15:25:45.647060 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:45 crc kubenswrapper[4779]: I0320 15:25:45.647070 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:45 crc kubenswrapper[4779]: I0320 15:25:45.647086 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:45 crc kubenswrapper[4779]: I0320 15:25:45.647095 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:45Z","lastTransitionTime":"2026-03-20T15:25:45Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 15:25:45 crc kubenswrapper[4779]: E0320 15:25:45.660506 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-m
arketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc
0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\
\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b7
2a2fc8bc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:45Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:45 crc kubenswrapper[4779]: I0320 15:25:45.663598 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:45 crc kubenswrapper[4779]: I0320 15:25:45.663657 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:45 crc kubenswrapper[4779]: I0320 15:25:45.663680 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:45 crc kubenswrapper[4779]: I0320 15:25:45.663707 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:45 crc kubenswrapper[4779]: I0320 15:25:45.663729 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:45Z","lastTransitionTime":"2026-03-20T15:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:45 crc kubenswrapper[4779]: E0320 15:25:45.676429 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:45Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:45 crc kubenswrapper[4779]: I0320 15:25:45.680688 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:45 crc kubenswrapper[4779]: I0320 15:25:45.680778 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:45 crc kubenswrapper[4779]: I0320 15:25:45.680817 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:45 crc kubenswrapper[4779]: I0320 15:25:45.680838 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:45 crc kubenswrapper[4779]: I0320 15:25:45.680848 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:45Z","lastTransitionTime":"2026-03-20T15:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:45 crc kubenswrapper[4779]: E0320 15:25:45.693519 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:45Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:45 crc kubenswrapper[4779]: I0320 15:25:45.696763 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:45 crc kubenswrapper[4779]: I0320 15:25:45.696794 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:45 crc kubenswrapper[4779]: I0320 15:25:45.696805 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:45 crc kubenswrapper[4779]: I0320 15:25:45.696821 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:45 crc kubenswrapper[4779]: I0320 15:25:45.696832 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:45Z","lastTransitionTime":"2026-03-20T15:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:45 crc kubenswrapper[4779]: E0320 15:25:45.708792 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:45Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:45 crc kubenswrapper[4779]: I0320 15:25:45.712354 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:45 crc kubenswrapper[4779]: I0320 15:25:45.712392 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:45 crc kubenswrapper[4779]: I0320 15:25:45.712410 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:45 crc kubenswrapper[4779]: I0320 15:25:45.712423 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:45 crc kubenswrapper[4779]: I0320 15:25:45.712433 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:45Z","lastTransitionTime":"2026-03-20T15:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:45 crc kubenswrapper[4779]: E0320 15:25:45.723902 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:45Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:45 crc kubenswrapper[4779]: E0320 15:25:45.724046 4779 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 15:25:45 crc kubenswrapper[4779]: I0320 15:25:45.808078 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:25:45 crc kubenswrapper[4779]: I0320 15:25:45.808172 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:25:45 crc kubenswrapper[4779]: E0320 15:25:45.808345 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:25:45 crc kubenswrapper[4779]: E0320 15:25:45.808485 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:25:46 crc kubenswrapper[4779]: I0320 15:25:46.808596 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:25:46 crc kubenswrapper[4779]: I0320 15:25:46.808682 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:25:46 crc kubenswrapper[4779]: E0320 15:25:46.808735 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:25:46 crc kubenswrapper[4779]: E0320 15:25:46.808816 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:25:47 crc kubenswrapper[4779]: I0320 15:25:47.808796 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:25:47 crc kubenswrapper[4779]: I0320 15:25:47.808867 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:25:47 crc kubenswrapper[4779]: E0320 15:25:47.808954 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:25:47 crc kubenswrapper[4779]: E0320 15:25:47.809323 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:25:48 crc kubenswrapper[4779]: I0320 15:25:48.808293 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:25:48 crc kubenswrapper[4779]: I0320 15:25:48.808380 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:25:48 crc kubenswrapper[4779]: E0320 15:25:48.808420 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:25:48 crc kubenswrapper[4779]: E0320 15:25:48.808506 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:25:48 crc kubenswrapper[4779]: E0320 15:25:48.939210 4779 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 15:25:49 crc kubenswrapper[4779]: I0320 15:25:49.808766 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:25:49 crc kubenswrapper[4779]: E0320 15:25:49.808910 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:25:49 crc kubenswrapper[4779]: I0320 15:25:49.809457 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:25:49 crc kubenswrapper[4779]: E0320 15:25:49.809683 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:25:50 crc kubenswrapper[4779]: I0320 15:25:50.807718 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:25:50 crc kubenswrapper[4779]: E0320 15:25:50.808204 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:25:50 crc kubenswrapper[4779]: I0320 15:25:50.808052 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:25:50 crc kubenswrapper[4779]: E0320 15:25:50.808653 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:25:51 crc kubenswrapper[4779]: I0320 15:25:51.808122 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:25:51 crc kubenswrapper[4779]: I0320 15:25:51.808140 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:25:51 crc kubenswrapper[4779]: E0320 15:25:51.808343 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:25:51 crc kubenswrapper[4779]: E0320 15:25:51.808260 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:25:52 crc kubenswrapper[4779]: I0320 15:25:52.808466 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:25:52 crc kubenswrapper[4779]: E0320 15:25:52.808940 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:25:52 crc kubenswrapper[4779]: I0320 15:25:52.809019 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:25:52 crc kubenswrapper[4779]: E0320 15:25:52.809392 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:25:53 crc kubenswrapper[4779]: I0320 15:25:53.808521 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:25:53 crc kubenswrapper[4779]: I0320 15:25:53.808553 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:25:53 crc kubenswrapper[4779]: E0320 15:25:53.809387 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:25:53 crc kubenswrapper[4779]: E0320 15:25:53.809550 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:25:53 crc kubenswrapper[4779]: I0320 15:25:53.827365 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01b09be4-9da4-4003-9aea-0804f2e4f121\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc1e37affb67c7652f578e79f104773967044884346ea2159f843bc27b53a324\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a7bca00cbf9f2a1abbbcb015536c61a7fe6f26b37d289e182df0b2dd58d72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24af932f21b1dbc1a2d96edd5cb5256eb0b4ddb672a001581f519bbb71b3f528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9392d565f3b41698f93b7dab52770b90590d7a010c0f29a89c91bcdc16131803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8
ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee9526f89756d5aadafe6ce6c208353f3a24aa7126fd4e94c7154b5dee0b14f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7766091cc175f7d15265aa50cf436f2be24e8e6f06e3fe602c5ab467c99bb62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314
731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7766091cc175f7d15265aa50cf436f2be24e8e6f06e3fe602c5ab467c99bb62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ffc474825ae3e6a18f565b6ff9bd95366a30163b869ecf58ea487a94f751ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ffc474825ae3e6a18f565b6ff9bd95366a30163b869ecf58ea487a94f751ae4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a3252598f32260f11c73fe5379659b6c73447da41aa263c09e6098149e704ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a3252598f32260f11c73fe5379659b6c73447da41aa263c09e6098149e704ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-2
0T15:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:53 crc kubenswrapper[4779]: I0320 15:25:53.839388 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b19bd13-f5a9-4c41-9590-8160db6db8e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d5bd3d76eb962b45cb136b7d534542ed0119ce0fe39f12d8099a850fefdf012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8af565427c7f7c9ba5b8239eaa0c17a3ef8dd5ff718c932a40556ab6b7fafbb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:23:36Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 15:23:05.665464 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 15:23:05.667452 1 observer_polling.go:159] Starting file observer\\\\nI0320 15:23:05.695556 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 15:23:05.700879 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 15:23:36.087909 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:35Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1693a9d291973037577c1ddbdce083c7aa03b895f7260b45231b1563e1a97d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f15a7c62edad15bdc2846e3876a24078dc7d1560104e7d6c65204c04a4b2cb09\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7940cf334fe66cc687c69dfff6e9f646c2f0a58c2d6a2e0ca68f19f7c85acfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:53 crc kubenswrapper[4779]: I0320 15:25:53.851097 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:53 crc kubenswrapper[4779]: I0320 15:25:53.862070 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c643f492e62e7afcf5f6c5e0617ac37b341cd15574c56c3a930d6ad0fcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://8ae75ac6bc68bfaafe8430d20a29cb0b21e0a62ad4fe4784ad593c0d7bd91352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:53 crc kubenswrapper[4779]: I0320 15:25:53.873131 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:53 crc kubenswrapper[4779]: I0320 15:25:53.882400 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:53 crc kubenswrapper[4779]: I0320 15:25:53.895744 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e42abb09bc49e78a03681f36955523d117d8bcd3428abfc87bf4e858018c8dd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f
25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-20T15:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:53 crc kubenswrapper[4779]: I0320 15:25:53.906166 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:53 crc kubenswrapper[4779]: I0320 15:25:53.919388 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11cee1edeb3d8a182d0acdc690e3bc259fd87878273b36e3f188837c8b0c7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T15:25:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:53 crc kubenswrapper[4779]: I0320 15:25:53.929234 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47f7b00f7a63a5a5c8e3f202de52bb407000c1857930004b53c6402e24ed0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498dce321546a5d876986ec92f47d53655ef83f600c0498878bbe473c526f01f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:53 crc kubenswrapper[4779]: I0320 15:25:53.939711 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d5cebc-7fcc-402f-8664-7fe33006d635\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca8bbdb2a7d1e8a987297e5b48e3a6abc5b3b7439b83a7dc6fd746f9c435adac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8aee02e2c57ee39da68fb337e36cca5deee8d9c73dcd2334a7bc91a81af4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d8aee02e2c57ee39da68fb337e36cca5deee8d9c73dcd2334a7bc91a81af4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:53 crc kubenswrapper[4779]: E0320 15:25:53.939960 4779 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 15:25:53 crc kubenswrapper[4779]: I0320 15:25:53.951510 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"704e9aed-e179-4f17-a678-cbd7f420b9c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f952002920726e5f66027fd3a51286ade7cc455a744898b25dd876ddee262838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df476d42816b44acd553ca6e22a6791a58d43799e6ff1299bd22b5ac37a45372\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://709e2b897f3f6b02b9677d2324ca04d81ecb5bda684f05f2b3c56c3a0213456f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa138829db36ca4c889841913dbdb4440c6efcdc78edf45ff1379bbbb9c34ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720
243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa138829db36ca4c889841913dbdb4440c6efcdc78edf45ff1379bbbb9c34ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:53 crc kubenswrapper[4779]: I0320 15:25:53.962850 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a91684c6a1a65d270c4c57f78c24f285f72f56b52857058a1726329853f3eb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e371
16ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:53 crc kubenswrapper[4779]: I0320 15:25:53.973864 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5df3256fc301fb10f7bb94808c1ec30f56d47576b0e31daaf3bcd641773cbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:25:29Z\\\",\\\"message\\\":\\\"2026-03-20T15:24:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_182598d0-e43b-4dfe-9fb4-f5b52b2212bb\\\\n2026-03-20T15:24:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_182598d0-e43b-4dfe-9fb4-f5b52b2212bb to /host/opt/cni/bin/\\\\n2026-03-20T15:24:44Z [verbose] multus-daemon started\\\\n2026-03-20T15:24:44Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T15:25:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:53 crc kubenswrapper[4779]: I0320 15:25:53.982078 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec6b43f8768a56cbe
8007e92d991e7037f0084f32c4ad8f19b59bf682e7ac3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:53 crc kubenswrapper[4779]: I0320 15:25:53.993975 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:53Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:54 crc 
kubenswrapper[4779]: I0320 15:25:54.011029 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:25:39Z\\\",\\\"message\\\":\\\"node crc\\\\nI0320 15:25:39.674068 7380 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-bl2w2 after 0 failed attempt(s)\\\\nI0320 15:25:39.674074 7380 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-bl2w2\\\\nI0320 15:25:39.673052 7380 ovnkube.go:599] 
Stopped ovnkube\\\\nI0320 15:25:39.674099 7380 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 15:25:39.674134 7380 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/package-server-manager-metrics]} name:Service_openshift-operator-lifecycle-manager/package-server-manager-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 15:25:39.674193 7380 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:25:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bl2w2_openshift-ovn-kubernetes(f27b5011-2d73-40e1-b508-a10e9c6f19a8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41f
fe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:54Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:54 crc kubenswrapper[4779]: I0320 15:25:54.027191 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:54Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:54 crc kubenswrapper[4779]: I0320 15:25:54.041100 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7de
e57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:54Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:54 crc kubenswrapper[4779]: I0320 15:25:54.808317 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:25:54 crc kubenswrapper[4779]: E0320 15:25:54.808525 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:25:54 crc kubenswrapper[4779]: I0320 15:25:54.808634 4779 scope.go:117] "RemoveContainer" containerID="fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad" Mar 20 15:25:54 crc kubenswrapper[4779]: I0320 15:25:54.808328 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:25:54 crc kubenswrapper[4779]: E0320 15:25:54.808787 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bl2w2_openshift-ovn-kubernetes(f27b5011-2d73-40e1-b508-a10e9c6f19a8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" Mar 20 15:25:54 crc kubenswrapper[4779]: E0320 15:25:54.808811 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:25:55 crc kubenswrapper[4779]: I0320 15:25:55.807993 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:25:55 crc kubenswrapper[4779]: E0320 15:25:55.808632 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:25:55 crc kubenswrapper[4779]: I0320 15:25:55.808130 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:25:55 crc kubenswrapper[4779]: E0320 15:25:55.808869 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:25:55 crc kubenswrapper[4779]: I0320 15:25:55.916350 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:55 crc kubenswrapper[4779]: I0320 15:25:55.916377 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:55 crc kubenswrapper[4779]: I0320 15:25:55.916386 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:55 crc kubenswrapper[4779]: I0320 15:25:55.916398 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:55 crc kubenswrapper[4779]: I0320 15:25:55.916406 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:55Z","lastTransitionTime":"2026-03-20T15:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:55 crc kubenswrapper[4779]: E0320 15:25:55.931718 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:55Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:55 crc kubenswrapper[4779]: I0320 15:25:55.936542 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:55 crc kubenswrapper[4779]: I0320 15:25:55.936722 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:55 crc kubenswrapper[4779]: I0320 15:25:55.936882 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:55 crc kubenswrapper[4779]: I0320 15:25:55.937072 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:55 crc kubenswrapper[4779]: I0320 15:25:55.937262 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:55Z","lastTransitionTime":"2026-03-20T15:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:55 crc kubenswrapper[4779]: E0320 15:25:55.949793 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:55Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:55 crc kubenswrapper[4779]: I0320 15:25:55.954136 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:55 crc kubenswrapper[4779]: I0320 15:25:55.954182 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:55 crc kubenswrapper[4779]: I0320 15:25:55.954190 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:55 crc kubenswrapper[4779]: I0320 15:25:55.954205 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:55 crc kubenswrapper[4779]: I0320 15:25:55.954216 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:55Z","lastTransitionTime":"2026-03-20T15:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:55 crc kubenswrapper[4779]: E0320 15:25:55.964537 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:55Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:55 crc kubenswrapper[4779]: I0320 15:25:55.967842 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:55 crc kubenswrapper[4779]: I0320 15:25:55.967872 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:55 crc kubenswrapper[4779]: I0320 15:25:55.967881 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:55 crc kubenswrapper[4779]: I0320 15:25:55.967894 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:55 crc kubenswrapper[4779]: I0320 15:25:55.967903 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:55Z","lastTransitionTime":"2026-03-20T15:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:55 crc kubenswrapper[4779]: E0320 15:25:55.977719 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:55Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:55 crc kubenswrapper[4779]: I0320 15:25:55.980992 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:25:55 crc kubenswrapper[4779]: I0320 15:25:55.981026 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:25:55 crc kubenswrapper[4779]: I0320 15:25:55.981035 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:25:55 crc kubenswrapper[4779]: I0320 15:25:55.981052 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:25:55 crc kubenswrapper[4779]: I0320 15:25:55.981060 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:25:55Z","lastTransitionTime":"2026-03-20T15:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:25:56 crc kubenswrapper[4779]: E0320 15:25:56.001641 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:25:55Z is after 2025-08-24T17:21:41Z" Mar 20 15:25:56 crc kubenswrapper[4779]: E0320 15:25:56.001760 4779 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 15:25:56 crc kubenswrapper[4779]: I0320 15:25:56.808133 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:25:56 crc kubenswrapper[4779]: E0320 15:25:56.808255 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:25:56 crc kubenswrapper[4779]: I0320 15:25:56.808133 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:25:56 crc kubenswrapper[4779]: E0320 15:25:56.808365 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:25:57 crc kubenswrapper[4779]: I0320 15:25:57.808242 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:25:57 crc kubenswrapper[4779]: E0320 15:25:57.808363 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:25:57 crc kubenswrapper[4779]: I0320 15:25:57.808242 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:25:57 crc kubenswrapper[4779]: E0320 15:25:57.808428 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:25:58 crc kubenswrapper[4779]: I0320 15:25:58.807777 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:25:58 crc kubenswrapper[4779]: I0320 15:25:58.807777 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:25:58 crc kubenswrapper[4779]: E0320 15:25:58.807979 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:25:58 crc kubenswrapper[4779]: E0320 15:25:58.807902 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:25:58 crc kubenswrapper[4779]: E0320 15:25:58.941197 4779 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 15:25:59 crc kubenswrapper[4779]: I0320 15:25:59.808297 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:25:59 crc kubenswrapper[4779]: E0320 15:25:59.808428 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:25:59 crc kubenswrapper[4779]: I0320 15:25:59.808297 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:25:59 crc kubenswrapper[4779]: E0320 15:25:59.808555 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:26:00 crc kubenswrapper[4779]: I0320 15:26:00.808135 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:26:00 crc kubenswrapper[4779]: E0320 15:26:00.808274 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:26:00 crc kubenswrapper[4779]: I0320 15:26:00.808513 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:26:00 crc kubenswrapper[4779]: E0320 15:26:00.808849 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:26:01 crc kubenswrapper[4779]: I0320 15:26:01.808491 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:26:01 crc kubenswrapper[4779]: I0320 15:26:01.808572 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:26:01 crc kubenswrapper[4779]: E0320 15:26:01.808778 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:26:01 crc kubenswrapper[4779]: E0320 15:26:01.808996 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:26:02 crc kubenswrapper[4779]: I0320 15:26:02.808772 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:26:02 crc kubenswrapper[4779]: I0320 15:26:02.808859 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:26:02 crc kubenswrapper[4779]: E0320 15:26:02.808993 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:26:02 crc kubenswrapper[4779]: E0320 15:26:02.809187 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:26:03 crc kubenswrapper[4779]: I0320 15:26:03.808365 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:26:03 crc kubenswrapper[4779]: I0320 15:26:03.808365 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:26:03 crc kubenswrapper[4779]: E0320 15:26:03.808563 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:26:03 crc kubenswrapper[4779]: E0320 15:26:03.808608 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:26:03 crc kubenswrapper[4779]: I0320 15:26:03.829352 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:26:03Z is after 2025-08-24T17:21:41Z" Mar 20 15:26:03 crc kubenswrapper[4779]: I0320 15:26:03.846008 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d11cee1edeb3d8a182d0acdc690e3bc259fd87878273b36e3f188837c8b0c7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T15:26:03Z is after 2025-08-24T17:21:41Z" Mar 20 15:26:03 crc kubenswrapper[4779]: I0320 15:26:03.860049 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"451fc579-db57-4b36-a775-6d2986de3efc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47f7b00f7a63a5a5c8e3f202de52bb407000c1857930004b53c6402e24ed0cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498dce321546a5d876986ec92f47d53655ef83f600c0498878bbe473c526f01f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fs4qg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:26:03Z is after 2025-08-24T17:21:41Z" Mar 20 15:26:03 crc kubenswrapper[4779]: I0320 15:26:03.871607 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8d5cebc-7fcc-402f-8664-7fe33006d635\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca8bbdb2a7d1e8a987297e5b48e3a6abc5b3b7439b83a7dc6fd746f9c435adac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8aee02e2c57ee39da68fb337e36cca5deee8d9c73dcd2334a7bc91a81af4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d8aee02e2c57ee39da68fb337e36cca5deee8d9c73dcd2334a7bc91a81af4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:26:03Z is after 2025-08-24T17:21:41Z" Mar 20 15:26:03 crc kubenswrapper[4779]: I0320 15:26:03.885660 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"704e9aed-e179-4f17-a678-cbd7f420b9c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f952002920726e5f66027fd3a51286ade7cc455a744898b25dd876ddee262838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df476d42816b44acd553ca6e22a6791a58d43799e6ff1299bd22b5ac37a45372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://709e2b897f3f6b02b9677d2324ca04d81ecb5bda684f05f2b3c56c3a0213456f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa138829db36ca4c889841913dbdb4440c6efcdc78edf45ff1379bbbb9c34ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3aa138829db36ca4c889841913dbdb4440c6efcdc78edf45ff1379bbbb9c34ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:26:03Z is after 2025-08-24T17:21:41Z" Mar 20 15:26:03 crc kubenswrapper[4779]: I0320 15:26:03.899833 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2042bcde-ea1f-4477-a4c2-d4f81621a660\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a91684c6a1a65d270c4c57f78c24f285f72f56b52857058a1726329853f3eb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:24:05Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 15:24:04.383479 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:24:04.383622 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:24:04.384191 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3812834432/tls.crt::/tmp/serving-cert-3812834432/tls.key\\\\\\\"\\\\nI0320 15:24:05.038879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:24:05.045710 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:24:05.045748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:24:05.045822 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:24:05.045841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:24:05.056083 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 15:24:05.056156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:24:05.056823 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:24:05.056863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:24:05.056871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:24:05.056877 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:24:05.056883 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:24:05.058585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e70d2cdf35565c7aea36e4031204e371
16ce14cf4385a0bb480dd49737059\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:26:03Z is after 2025-08-24T17:21:41Z" Mar 20 15:26:03 crc kubenswrapper[4779]: I0320 15:26:03.914916 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lfj25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c30ee189-9db1-41af-8a55-29955cbf6712\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5df3256fc301fb10f7bb94808c1ec30f56d47576b0e31daaf3bcd641773cbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:25:29Z\\\",\\\"message\\\":\\\"2026-03-20T15:24:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_182598d0-e43b-4dfe-9fb4-f5b52b2212bb\\\\n2026-03-20T15:24:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_182598d0-e43b-4dfe-9fb4-f5b52b2212bb to /host/opt/cni/bin/\\\\n2026-03-20T15:24:44Z [verbose] multus-daemon started\\\\n2026-03-20T15:24:44Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T15:25:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n4l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lfj25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:26:03Z is after 2025-08-24T17:21:41Z" Mar 20 15:26:03 crc kubenswrapper[4779]: I0320 15:26:03.926223 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l9j6r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46120a91-b00c-4299-b552-a374d2a78726\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec6b43f8768a56cbe
8007e92d991e7037f0084f32c4ad8f19b59bf682e7ac3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkfvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l9j6r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:26:03Z is after 2025-08-24T17:21:41Z" Mar 20 15:26:03 crc kubenswrapper[4779]: I0320 15:26:03.939350 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bdd151-2a1e-4f14-a095-81b541307138\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blhdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l4gtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:26:03Z is after 2025-08-24T17:21:41Z" Mar 20 15:26:03 crc 
kubenswrapper[4779]: E0320 15:26:03.942970 4779 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 15:26:03 crc kubenswrapper[4779]: I0320 15:26:03.961412 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:25:39Z\\\",\\\"message\\\":\\\"node crc\\\\nI0320 15:25:39.674068 7380 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-bl2w2 after 0 failed attempt(s)\\\\nI0320 15:25:39.674074 7380 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-bl2w2\\\\nI0320 15:25:39.673052 7380 ovnkube.go:599] 
Stopped ovnkube\\\\nI0320 15:25:39.674099 7380 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 15:25:39.674134 7380 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/package-server-manager-metrics]} name:Service_openshift-operator-lifecycle-manager/package-server-manager-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 15:25:39.674193 7380 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:25:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bl2w2_openshift-ovn-kubernetes(f27b5011-2d73-40e1-b508-a10e9c6f19a8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba12c6fb5d3f94b41f
fe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcxgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2w2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:26:03Z is after 2025-08-24T17:21:41Z" Mar 20 15:26:03 crc kubenswrapper[4779]: I0320 15:26:03.979044 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:26:03Z is after 2025-08-24T17:21:41Z" Mar 20 15:26:03 crc kubenswrapper[4779]: I0320 15:26:03.993679 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9addf988-9b4c-4e5e-a5fe-793bab35a52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e2895672601c16ffe9b2cda0ba070ceec1fb87721b12b626a485cb6aae6503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae77685b6bc2d9aea06786fd6132dd0fc7de
e57d6e9a139e4118d33a787cb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxnrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-92wlw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:26:03Z is after 2025-08-24T17:21:41Z" Mar 20 15:26:04 crc kubenswrapper[4779]: I0320 15:26:04.015394 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01b09be4-9da4-4003-9aea-0804f2e4f121\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc1e37affb67c7652f578e79f104773967044884346ea2159f843bc27b53a324\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2a7bca00cbf9f2a1abbbcb015536c61a7fe6f26b37d289e182df0b2dd58d72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24af932f21b1dbc1a2d96edd5cb5256eb0b4ddb672a001581f519bbb71b3f528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9392d565f3b41698f93b7dab52770b90590d7a010c0f29a89c91bcdc16131803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee9526f89756d5aadafe6ce6c208353f3a24aa7126fd4e94c7154b5dee0b14f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7766091cc175f7d15265aa50cf436f2be24e8e6f06e3fe602c5ab467c99bb62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7766091cc175f7d15265aa50cf436f2be24e8e6f06e3fe602c5ab467c99bb62\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T15:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ffc474825ae3e6a18f565b6ff9bd95366a30163b869ecf58ea487a94f751ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ffc474825ae3e6a18f565b6ff9bd95366a30163b869ecf58ea487a94f751ae4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a3252598f32260f11c73fe5379659b6c73447da41aa263c09e6098149e704ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a3252598f32260f11c73fe5379659b6c73447da41aa263c09e6098149e704ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:26:04Z is after 2025-08-24T17:21:41Z" Mar 20 15:26:04 crc kubenswrapper[4779]: I0320 15:26:04.031830 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b19bd13-f5a9-4c41-9590-8160db6db8e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d5bd3d76eb962b45cb136b7d534542ed0119ce0fe39f12d8099a850fefdf012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8af565427c7f7c9ba5b8239eaa0c17a3ef8dd5ff718c932a40556ab6b7fafbb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:23:36Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 15:23:05.665464 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0320 15:23:05.667452 1 observer_polling.go:159] Starting file observer\\\\nI0320 15:23:05.695556 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 15:23:05.700879 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 15:23:36.087909 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:23:35Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1693a9d291973037577c1ddbdce083c7aa03b895f7260b45231b1563e1a97d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-03-20T15:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f15a7c62edad15bdc2846e3876a24078dc7d1560104e7d6c65204c04a4b2cb09\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7940cf334fe66cc687c69dfff6e9f646c2f0a58c2d6a2e0ca68f19f7c85acfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:23:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:23:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:26:04Z is after 2025-08-24T17:21:41Z" Mar 20 15:26:04 crc kubenswrapper[4779]: I0320 15:26:04.053893 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bffaffd57fd5018b79fd2ccbfab9cd112ceeb0f9bb65386e76b59c87e0c94d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"netwo
rk-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:26:04Z is after 2025-08-24T17:21:41Z" Mar 20 15:26:04 crc kubenswrapper[4779]: I0320 15:26:04.070005 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29c643f492e62e7afcf5f6c5e0617ac37b341cd15574c56c3a930d6ad0fcc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ae75ac6bc68bfaafe8430d20a29cb0b21e0a62ad4fe4784ad593c0d7bd91352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:26:04Z is after 2025-08-24T17:21:41Z" Mar 20 15:26:04 crc kubenswrapper[4779]: I0320 15:26:04.087665 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:26:04Z is after 2025-08-24T17:21:41Z" Mar 20 15:26:04 crc kubenswrapper[4779]: I0320 15:26:04.101134 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9dnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5293026d-bcf7-4270-8ad8-59a90e70ab1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f178af736bda407f29540c037200c38d99c937c84da17937d62d1073204f5f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l595l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9dnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:26:04Z is after 2025-08-24T17:21:41Z" Mar 20 15:26:04 crc kubenswrapper[4779]: I0320 15:26:04.118420 4779 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7a20ef3-86de-4db2-b500-63af002500b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e42abb09bc49e78a03681f36955523d117d8bcd3428abfc87bf4e858018c8dd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://476b1fad85259e9ecf8e648518500cd1654861cd287383c18594c2c7688f63a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a48d8479143978053e6272dd4276eb9d4b6056782277c9e2181c886b69bc73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e40db4973764e117f6f1cce83b049f8f42f576f23d7fa4555eeec4837efd7baf\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edb5a2b97e62f3864f529806d580473acc7a37848c8deddfab10860adf2eef6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f
25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0dcadc034ca787aadd284cdf2ae9b3f25e0a5cb5ee13f04e81086fba40c5c55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c65e4cff9e9b96f14523b03a683c6a25392061935eafaf837d69c3711ccc4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:24:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-20T15:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnnk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:24:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:26:04Z is after 2025-08-24T17:21:41Z" Mar 20 15:26:04 crc kubenswrapper[4779]: I0320 15:26:04.807867 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:26:04 crc kubenswrapper[4779]: I0320 15:26:04.807971 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:26:04 crc kubenswrapper[4779]: E0320 15:26:04.808103 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:26:04 crc kubenswrapper[4779]: E0320 15:26:04.808314 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:26:05 crc kubenswrapper[4779]: I0320 15:26:05.808033 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:26:05 crc kubenswrapper[4779]: I0320 15:26:05.808285 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:26:05 crc kubenswrapper[4779]: E0320 15:26:05.808332 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:26:05 crc kubenswrapper[4779]: E0320 15:26:05.808646 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:26:06 crc kubenswrapper[4779]: I0320 15:26:06.254408 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:26:06 crc kubenswrapper[4779]: I0320 15:26:06.254494 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:26:06 crc kubenswrapper[4779]: I0320 15:26:06.254517 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:26:06 crc kubenswrapper[4779]: I0320 15:26:06.254550 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:26:06 crc kubenswrapper[4779]: I0320 15:26:06.254572 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:26:06Z","lastTransitionTime":"2026-03-20T15:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:26:06 crc kubenswrapper[4779]: E0320 15:26:06.271285 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:26:06Z is after 2025-08-24T17:21:41Z" Mar 20 15:26:06 crc kubenswrapper[4779]: I0320 15:26:06.278466 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:26:06 crc kubenswrapper[4779]: I0320 15:26:06.278535 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:26:06 crc kubenswrapper[4779]: I0320 15:26:06.278557 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:26:06 crc kubenswrapper[4779]: I0320 15:26:06.278589 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:26:06 crc kubenswrapper[4779]: I0320 15:26:06.278611 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:26:06Z","lastTransitionTime":"2026-03-20T15:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:26:06 crc kubenswrapper[4779]: E0320 15:26:06.296616 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:26:06Z is after 2025-08-24T17:21:41Z" Mar 20 15:26:06 crc kubenswrapper[4779]: I0320 15:26:06.303039 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:26:06 crc kubenswrapper[4779]: I0320 15:26:06.303130 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:26:06 crc kubenswrapper[4779]: I0320 15:26:06.303153 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:26:06 crc kubenswrapper[4779]: I0320 15:26:06.303186 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:26:06 crc kubenswrapper[4779]: I0320 15:26:06.303206 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:26:06Z","lastTransitionTime":"2026-03-20T15:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:26:06 crc kubenswrapper[4779]: E0320 15:26:06.325712 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:26:06Z is after 2025-08-24T17:21:41Z" Mar 20 15:26:06 crc kubenswrapper[4779]: I0320 15:26:06.331867 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:26:06 crc kubenswrapper[4779]: I0320 15:26:06.331937 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:26:06 crc kubenswrapper[4779]: I0320 15:26:06.331957 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:26:06 crc kubenswrapper[4779]: I0320 15:26:06.331988 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:26:06 crc kubenswrapper[4779]: I0320 15:26:06.332008 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:26:06Z","lastTransitionTime":"2026-03-20T15:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:26:06 crc kubenswrapper[4779]: E0320 15:26:06.348955 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:26:06Z is after 2025-08-24T17:21:41Z" Mar 20 15:26:06 crc kubenswrapper[4779]: I0320 15:26:06.354955 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:26:06 crc kubenswrapper[4779]: I0320 15:26:06.355031 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:26:06 crc kubenswrapper[4779]: I0320 15:26:06.355055 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:26:06 crc kubenswrapper[4779]: I0320 15:26:06.355089 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:26:06 crc kubenswrapper[4779]: I0320 15:26:06.355157 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:26:06Z","lastTransitionTime":"2026-03-20T15:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:26:06 crc kubenswrapper[4779]: E0320 15:26:06.372873 4779 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:26:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dffc0140-562f-4f14-a68f-8b97216f21d0\\\",\\\"systemUUID\\\":\\\"7cc3d574-fb22-4342-af05-25b72a2fc8bc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:26:06Z is after 2025-08-24T17:21:41Z" Mar 20 15:26:06 crc kubenswrapper[4779]: E0320 15:26:06.373175 4779 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 15:26:06 crc kubenswrapper[4779]: I0320 15:26:06.808582 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:26:06 crc kubenswrapper[4779]: I0320 15:26:06.808602 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:26:06 crc kubenswrapper[4779]: E0320 15:26:06.808878 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:26:06 crc kubenswrapper[4779]: E0320 15:26:06.809023 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:26:07 crc kubenswrapper[4779]: I0320 15:26:07.807947 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:26:07 crc kubenswrapper[4779]: I0320 15:26:07.808287 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:26:07 crc kubenswrapper[4779]: E0320 15:26:07.808586 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:26:07 crc kubenswrapper[4779]: E0320 15:26:07.808698 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:26:08 crc kubenswrapper[4779]: I0320 15:26:08.808862 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:26:08 crc kubenswrapper[4779]: I0320 15:26:08.808879 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:26:08 crc kubenswrapper[4779]: E0320 15:26:08.809090 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:26:08 crc kubenswrapper[4779]: E0320 15:26:08.809376 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:26:08 crc kubenswrapper[4779]: I0320 15:26:08.809676 4779 scope.go:117] "RemoveContainer" containerID="fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad" Mar 20 15:26:08 crc kubenswrapper[4779]: E0320 15:26:08.809826 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bl2w2_openshift-ovn-kubernetes(f27b5011-2d73-40e1-b508-a10e9c6f19a8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" Mar 20 15:26:08 crc kubenswrapper[4779]: E0320 15:26:08.944819 4779 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 15:26:09 crc kubenswrapper[4779]: I0320 15:26:09.808130 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:26:09 crc kubenswrapper[4779]: I0320 15:26:09.808330 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:26:09 crc kubenswrapper[4779]: E0320 15:26:09.808438 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:26:09 crc kubenswrapper[4779]: E0320 15:26:09.808738 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:26:10 crc kubenswrapper[4779]: I0320 15:26:10.808089 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:26:10 crc kubenswrapper[4779]: E0320 15:26:10.808345 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:26:10 crc kubenswrapper[4779]: I0320 15:26:10.808522 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:26:10 crc kubenswrapper[4779]: E0320 15:26:10.808638 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:26:11 crc kubenswrapper[4779]: I0320 15:26:11.807938 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:26:11 crc kubenswrapper[4779]: I0320 15:26:11.807938 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:26:11 crc kubenswrapper[4779]: E0320 15:26:11.808283 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:26:11 crc kubenswrapper[4779]: E0320 15:26:11.808609 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:26:12 crc kubenswrapper[4779]: I0320 15:26:12.807796 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:26:12 crc kubenswrapper[4779]: E0320 15:26:12.807956 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:26:12 crc kubenswrapper[4779]: I0320 15:26:12.807843 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:26:12 crc kubenswrapper[4779]: E0320 15:26:12.808175 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:26:13 crc kubenswrapper[4779]: I0320 15:26:13.808074 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:26:13 crc kubenswrapper[4779]: I0320 15:26:13.808182 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:26:13 crc kubenswrapper[4779]: E0320 15:26:13.808251 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:26:13 crc kubenswrapper[4779]: E0320 15:26:13.808420 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:26:13 crc kubenswrapper[4779]: I0320 15:26:13.841836 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=47.841534725 podStartE2EDuration="47.841534725s" podCreationTimestamp="2026-03-20 15:25:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:13.826775892 +0000 UTC m=+190.789291702" watchObservedRunningTime="2026-03-20 15:26:13.841534725 +0000 UTC m=+190.804050535" Mar 20 15:26:13 crc kubenswrapper[4779]: I0320 15:26:13.880560 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=95.880545934 podStartE2EDuration="1m35.880545934s" podCreationTimestamp="2026-03-20 15:24:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:13.880268386 +0000 UTC m=+190.842784186" watchObservedRunningTime="2026-03-20 15:26:13.880545934 +0000 UTC m=+190.843061734" Mar 20 15:26:13 crc kubenswrapper[4779]: I0320 15:26:13.880707 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=68.880703448 podStartE2EDuration="1m8.880703448s" podCreationTimestamp="2026-03-20 15:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:13.841535135 +0000 UTC m=+190.804050935" watchObservedRunningTime="2026-03-20 15:26:13.880703448 +0000 UTC m=+190.843219248" Mar 20 15:26:13 crc kubenswrapper[4779]: I0320 15:26:13.923983 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-l9j6r" podStartSLOduration=145.923965447 podStartE2EDuration="2m25.923965447s" podCreationTimestamp="2026-03-20 15:23:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:13.923815993 +0000 UTC m=+190.886331793" watchObservedRunningTime="2026-03-20 15:26:13.923965447 +0000 UTC m=+190.886481247" Mar 20 15:26:13 crc kubenswrapper[4779]: I0320 15:26:13.924091 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-lfj25" podStartSLOduration=145.924087521 podStartE2EDuration="2m25.924087521s" podCreationTimestamp="2026-03-20 15:23:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:13.914674048 +0000 UTC m=+190.877189868" watchObservedRunningTime="2026-03-20 15:26:13.924087521 +0000 UTC m=+190.886603321" Mar 20 15:26:13 crc kubenswrapper[4779]: E0320 
15:26:13.945333 4779 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 15:26:14 crc kubenswrapper[4779]: I0320 15:26:14.002505 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-92wlw" podStartSLOduration=145.002485631 podStartE2EDuration="2m25.002485631s" podCreationTimestamp="2026-03-20 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:14.002469051 +0000 UTC m=+190.964984851" watchObservedRunningTime="2026-03-20 15:26:14.002485631 +0000 UTC m=+190.965001441" Mar 20 15:26:14 crc kubenswrapper[4779]: I0320 15:26:14.032885 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=40.032868471 podStartE2EDuration="40.032868471s" podCreationTimestamp="2026-03-20 15:25:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:14.032447009 +0000 UTC m=+190.994962809" watchObservedRunningTime="2026-03-20 15:26:14.032868471 +0000 UTC m=+190.995384271" Mar 20 15:26:14 crc kubenswrapper[4779]: I0320 15:26:14.045404 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=77.045388741 podStartE2EDuration="1m17.045388741s" podCreationTimestamp="2026-03-20 15:24:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:14.044946719 +0000 UTC m=+191.007462529" watchObservedRunningTime="2026-03-20 15:26:14.045388741 +0000 UTC 
m=+191.007904541" Mar 20 15:26:14 crc kubenswrapper[4779]: I0320 15:26:14.096850 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9dnjh" podStartSLOduration=146.096803448 podStartE2EDuration="2m26.096803448s" podCreationTimestamp="2026-03-20 15:23:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:14.096355024 +0000 UTC m=+191.058870824" watchObservedRunningTime="2026-03-20 15:26:14.096803448 +0000 UTC m=+191.059319258" Mar 20 15:26:14 crc kubenswrapper[4779]: I0320 15:26:14.125915 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-clzkt" podStartSLOduration=145.125891951 podStartE2EDuration="2m25.125891951s" podCreationTimestamp="2026-03-20 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:14.112380673 +0000 UTC m=+191.074896473" watchObservedRunningTime="2026-03-20 15:26:14.125891951 +0000 UTC m=+191.088407771" Mar 20 15:26:14 crc kubenswrapper[4779]: I0320 15:26:14.149056 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podStartSLOduration=146.149037137 podStartE2EDuration="2m26.149037137s" podCreationTimestamp="2026-03-20 15:23:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:14.148911013 +0000 UTC m=+191.111426813" watchObservedRunningTime="2026-03-20 15:26:14.149037137 +0000 UTC m=+191.111552937" Mar 20 15:26:14 crc kubenswrapper[4779]: I0320 15:26:14.808492 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:26:14 crc kubenswrapper[4779]: I0320 15:26:14.808508 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:26:14 crc kubenswrapper[4779]: E0320 15:26:14.808658 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:26:14 crc kubenswrapper[4779]: E0320 15:26:14.808834 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:26:15 crc kubenswrapper[4779]: I0320 15:26:15.808650 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:26:15 crc kubenswrapper[4779]: I0320 15:26:15.808697 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:26:15 crc kubenswrapper[4779]: E0320 15:26:15.808791 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:26:15 crc kubenswrapper[4779]: E0320 15:26:15.808910 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:26:16 crc kubenswrapper[4779]: I0320 15:26:16.403490 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 15:26:16 crc kubenswrapper[4779]: I0320 15:26:16.403523 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 15:26:16 crc kubenswrapper[4779]: I0320 15:26:16.403534 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 15:26:16 crc kubenswrapper[4779]: I0320 15:26:16.403547 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 15:26:16 crc kubenswrapper[4779]: I0320 15:26:16.403560 4779 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:26:16Z","lastTransitionTime":"2026-03-20T15:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 15:26:16 crc kubenswrapper[4779]: I0320 15:26:16.442600 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-6j8vl"] Mar 20 15:26:16 crc kubenswrapper[4779]: I0320 15:26:16.442940 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6j8vl" Mar 20 15:26:16 crc kubenswrapper[4779]: I0320 15:26:16.445936 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 15:26:16 crc kubenswrapper[4779]: I0320 15:26:16.446086 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 15:26:16 crc kubenswrapper[4779]: I0320 15:26:16.446223 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 15:26:16 crc kubenswrapper[4779]: I0320 15:26:16.447472 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 15:26:16 crc kubenswrapper[4779]: I0320 15:26:16.521330 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b584fa53-de9f-4eca-8f82-597e37264615-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6j8vl\" (UID: \"b584fa53-de9f-4eca-8f82-597e37264615\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6j8vl" Mar 20 15:26:16 crc kubenswrapper[4779]: I0320 15:26:16.521408 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b584fa53-de9f-4eca-8f82-597e37264615-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6j8vl\" (UID: 
\"b584fa53-de9f-4eca-8f82-597e37264615\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6j8vl" Mar 20 15:26:16 crc kubenswrapper[4779]: I0320 15:26:16.521480 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b584fa53-de9f-4eca-8f82-597e37264615-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6j8vl\" (UID: \"b584fa53-de9f-4eca-8f82-597e37264615\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6j8vl" Mar 20 15:26:16 crc kubenswrapper[4779]: I0320 15:26:16.521520 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b584fa53-de9f-4eca-8f82-597e37264615-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6j8vl\" (UID: \"b584fa53-de9f-4eca-8f82-597e37264615\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6j8vl" Mar 20 15:26:16 crc kubenswrapper[4779]: I0320 15:26:16.521542 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b584fa53-de9f-4eca-8f82-597e37264615-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6j8vl\" (UID: \"b584fa53-de9f-4eca-8f82-597e37264615\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6j8vl" Mar 20 15:26:16 crc kubenswrapper[4779]: I0320 15:26:16.600452 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lfj25_c30ee189-9db1-41af-8a55-29955cbf6712/kube-multus/1.log" Mar 20 15:26:16 crc kubenswrapper[4779]: I0320 15:26:16.600956 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lfj25_c30ee189-9db1-41af-8a55-29955cbf6712/kube-multus/0.log" Mar 20 15:26:16 crc kubenswrapper[4779]: I0320 15:26:16.601000 4779 generic.go:334] "Generic (PLEG): 
container finished" podID="c30ee189-9db1-41af-8a55-29955cbf6712" containerID="d5df3256fc301fb10f7bb94808c1ec30f56d47576b0e31daaf3bcd641773cbcc" exitCode=1 Mar 20 15:26:16 crc kubenswrapper[4779]: I0320 15:26:16.601033 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lfj25" event={"ID":"c30ee189-9db1-41af-8a55-29955cbf6712","Type":"ContainerDied","Data":"d5df3256fc301fb10f7bb94808c1ec30f56d47576b0e31daaf3bcd641773cbcc"} Mar 20 15:26:16 crc kubenswrapper[4779]: I0320 15:26:16.601069 4779 scope.go:117] "RemoveContainer" containerID="e15907a8bd74ecb20c8770af3ef562edcef58ab9812baf653b9e1f543fdb66e6" Mar 20 15:26:16 crc kubenswrapper[4779]: I0320 15:26:16.601650 4779 scope.go:117] "RemoveContainer" containerID="d5df3256fc301fb10f7bb94808c1ec30f56d47576b0e31daaf3bcd641773cbcc" Mar 20 15:26:16 crc kubenswrapper[4779]: E0320 15:26:16.601980 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-lfj25_openshift-multus(c30ee189-9db1-41af-8a55-29955cbf6712)\"" pod="openshift-multus/multus-lfj25" podUID="c30ee189-9db1-41af-8a55-29955cbf6712" Mar 20 15:26:16 crc kubenswrapper[4779]: I0320 15:26:16.623702 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b584fa53-de9f-4eca-8f82-597e37264615-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6j8vl\" (UID: \"b584fa53-de9f-4eca-8f82-597e37264615\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6j8vl" Mar 20 15:26:16 crc kubenswrapper[4779]: I0320 15:26:16.623766 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b584fa53-de9f-4eca-8f82-597e37264615-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6j8vl\" (UID: \"b584fa53-de9f-4eca-8f82-597e37264615\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6j8vl" Mar 20 15:26:16 crc kubenswrapper[4779]: I0320 15:26:16.623870 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b584fa53-de9f-4eca-8f82-597e37264615-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6j8vl\" (UID: \"b584fa53-de9f-4eca-8f82-597e37264615\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6j8vl" Mar 20 15:26:16 crc kubenswrapper[4779]: I0320 15:26:16.623951 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b584fa53-de9f-4eca-8f82-597e37264615-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6j8vl\" (UID: \"b584fa53-de9f-4eca-8f82-597e37264615\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6j8vl" Mar 20 15:26:16 crc kubenswrapper[4779]: I0320 15:26:16.624012 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b584fa53-de9f-4eca-8f82-597e37264615-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6j8vl\" (UID: \"b584fa53-de9f-4eca-8f82-597e37264615\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6j8vl" Mar 20 15:26:16 crc kubenswrapper[4779]: I0320 15:26:16.624737 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b584fa53-de9f-4eca-8f82-597e37264615-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6j8vl\" (UID: \"b584fa53-de9f-4eca-8f82-597e37264615\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6j8vl" Mar 20 15:26:16 crc kubenswrapper[4779]: I0320 15:26:16.625036 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/b584fa53-de9f-4eca-8f82-597e37264615-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6j8vl\" (UID: \"b584fa53-de9f-4eca-8f82-597e37264615\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6j8vl" Mar 20 15:26:16 crc kubenswrapper[4779]: I0320 15:26:16.625386 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b584fa53-de9f-4eca-8f82-597e37264615-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6j8vl\" (UID: \"b584fa53-de9f-4eca-8f82-597e37264615\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6j8vl" Mar 20 15:26:16 crc kubenswrapper[4779]: I0320 15:26:16.633723 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b584fa53-de9f-4eca-8f82-597e37264615-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6j8vl\" (UID: \"b584fa53-de9f-4eca-8f82-597e37264615\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6j8vl" Mar 20 15:26:16 crc kubenswrapper[4779]: I0320 15:26:16.647424 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b584fa53-de9f-4eca-8f82-597e37264615-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6j8vl\" (UID: \"b584fa53-de9f-4eca-8f82-597e37264615\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6j8vl" Mar 20 15:26:16 crc kubenswrapper[4779]: I0320 15:26:16.764960 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6j8vl" Mar 20 15:26:16 crc kubenswrapper[4779]: I0320 15:26:16.808777 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:26:16 crc kubenswrapper[4779]: E0320 15:26:16.808931 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:26:16 crc kubenswrapper[4779]: I0320 15:26:16.809154 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:26:16 crc kubenswrapper[4779]: E0320 15:26:16.809217 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:26:16 crc kubenswrapper[4779]: I0320 15:26:16.933226 4779 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 20 15:26:16 crc kubenswrapper[4779]: I0320 15:26:16.942070 4779 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 15:26:17 crc kubenswrapper[4779]: I0320 15:26:17.605219 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lfj25_c30ee189-9db1-41af-8a55-29955cbf6712/kube-multus/1.log" Mar 20 15:26:17 crc kubenswrapper[4779]: I0320 15:26:17.606640 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6j8vl" event={"ID":"b584fa53-de9f-4eca-8f82-597e37264615","Type":"ContainerStarted","Data":"a9f0b6625fb2af7445275521af275d4cf201710a46ce21099751d238604a1e2a"} Mar 20 15:26:17 crc kubenswrapper[4779]: I0320 15:26:17.606697 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6j8vl" event={"ID":"b584fa53-de9f-4eca-8f82-597e37264615","Type":"ContainerStarted","Data":"9d73e0a9ab72a3309a31e16e0ad148b0f707177637e9231af19169eaa8166ba0"} Mar 20 15:26:17 crc kubenswrapper[4779]: I0320 15:26:17.619488 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6j8vl" podStartSLOduration=148.619468833 podStartE2EDuration="2m28.619468833s" podCreationTimestamp="2026-03-20 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:17.618536127 +0000 UTC m=+194.581051937" watchObservedRunningTime="2026-03-20 15:26:17.619468833 +0000 UTC m=+194.581984633" Mar 20 15:26:17 crc 
kubenswrapper[4779]: I0320 15:26:17.808387 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:26:17 crc kubenswrapper[4779]: I0320 15:26:17.808403 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:26:17 crc kubenswrapper[4779]: E0320 15:26:17.808514 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:26:17 crc kubenswrapper[4779]: E0320 15:26:17.808782 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:26:18 crc kubenswrapper[4779]: I0320 15:26:18.809026 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:26:18 crc kubenswrapper[4779]: E0320 15:26:18.810028 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:26:18 crc kubenswrapper[4779]: I0320 15:26:18.809343 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:26:18 crc kubenswrapper[4779]: E0320 15:26:18.810248 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:26:18 crc kubenswrapper[4779]: E0320 15:26:18.946758 4779 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 15:26:19 crc kubenswrapper[4779]: I0320 15:26:19.808066 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:26:19 crc kubenswrapper[4779]: I0320 15:26:19.808075 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:26:19 crc kubenswrapper[4779]: E0320 15:26:19.808269 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:26:19 crc kubenswrapper[4779]: E0320 15:26:19.808292 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:26:20 crc kubenswrapper[4779]: I0320 15:26:20.807886 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:26:20 crc kubenswrapper[4779]: I0320 15:26:20.807965 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:26:20 crc kubenswrapper[4779]: E0320 15:26:20.808021 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:26:20 crc kubenswrapper[4779]: E0320 15:26:20.808130 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:26:21 crc kubenswrapper[4779]: I0320 15:26:21.808535 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:26:21 crc kubenswrapper[4779]: I0320 15:26:21.808608 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:26:21 crc kubenswrapper[4779]: E0320 15:26:21.808682 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 15:26:21 crc kubenswrapper[4779]: E0320 15:26:21.808727 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 15:26:22 crc kubenswrapper[4779]: I0320 15:26:22.808737 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:26:22 crc kubenswrapper[4779]: E0320 15:26:22.808872 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:26:22 crc kubenswrapper[4779]: I0320 15:26:22.808763 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:26:22 crc kubenswrapper[4779]: I0320 15:26:22.809528 4779 scope.go:117] "RemoveContainer" containerID="fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad" Mar 20 15:26:22 crc kubenswrapper[4779]: E0320 15:26:22.809702 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:26:23 crc kubenswrapper[4779]: I0320 15:26:23.567973 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-l4gtx"] Mar 20 15:26:23 crc kubenswrapper[4779]: I0320 15:26:23.623780 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2w2_f27b5011-2d73-40e1-b508-a10e9c6f19a8/ovnkube-controller/3.log" Mar 20 15:26:23 crc kubenswrapper[4779]: I0320 15:26:23.626220 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" event={"ID":"f27b5011-2d73-40e1-b508-a10e9c6f19a8","Type":"ContainerStarted","Data":"6c27b8f8180da0ad71249cbfa8966eefae97a7942f033b993572f2c66eaa6294"} Mar 20 15:26:23 crc kubenswrapper[4779]: I0320 15:26:23.626249 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:26:23 crc kubenswrapper[4779]: E0320 15:26:23.626369 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138" Mar 20 15:26:23 crc kubenswrapper[4779]: I0320 15:26:23.626862 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:26:23 crc kubenswrapper[4779]: I0320 15:26:23.651195 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" podStartSLOduration=154.65117687 podStartE2EDuration="2m34.65117687s" podCreationTimestamp="2026-03-20 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:23.6503935 +0000 UTC m=+200.612909320" watchObservedRunningTime="2026-03-20 15:26:23.65117687 +0000 UTC m=+200.613692670" Mar 20 15:26:23 crc kubenswrapper[4779]: I0320 15:26:23.808715 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:26:23 crc kubenswrapper[4779]: I0320 15:26:23.809133 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:26:23 crc kubenswrapper[4779]: E0320 15:26:23.809820 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:26:23 crc kubenswrapper[4779]: E0320 15:26:23.810016 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:26:23 crc kubenswrapper[4779]: E0320 15:26:23.947422 4779 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 15:26:24 crc kubenswrapper[4779]: I0320 15:26:24.808938 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:26:24 crc kubenswrapper[4779]: I0320 15:26:24.808938 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx"
Mar 20 15:26:24 crc kubenswrapper[4779]: E0320 15:26:24.809164 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:26:24 crc kubenswrapper[4779]: E0320 15:26:24.809272 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138"
Mar 20 15:26:25 crc kubenswrapper[4779]: I0320 15:26:25.808665 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:26:25 crc kubenswrapper[4779]: I0320 15:26:25.808794 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:26:25 crc kubenswrapper[4779]: E0320 15:26:25.808832 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:26:25 crc kubenswrapper[4779]: E0320 15:26:25.809208 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:26:26 crc kubenswrapper[4779]: I0320 15:26:26.808630 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:26:26 crc kubenswrapper[4779]: E0320 15:26:26.808850 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:26:26 crc kubenswrapper[4779]: I0320 15:26:26.809267 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx"
Mar 20 15:26:26 crc kubenswrapper[4779]: E0320 15:26:26.809404 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138"
Mar 20 15:26:27 crc kubenswrapper[4779]: I0320 15:26:27.807887 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:26:27 crc kubenswrapper[4779]: I0320 15:26:27.807938 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:26:27 crc kubenswrapper[4779]: E0320 15:26:27.808093 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:26:27 crc kubenswrapper[4779]: E0320 15:26:27.808271 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:26:28 crc kubenswrapper[4779]: I0320 15:26:28.808777 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:26:28 crc kubenswrapper[4779]: I0320 15:26:28.808832 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx"
Mar 20 15:26:28 crc kubenswrapper[4779]: E0320 15:26:28.808992 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:26:28 crc kubenswrapper[4779]: E0320 15:26:28.809224 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138"
Mar 20 15:26:28 crc kubenswrapper[4779]: E0320 15:26:28.949026 4779 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 15:26:29 crc kubenswrapper[4779]: I0320 15:26:29.829653 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:26:29 crc kubenswrapper[4779]: I0320 15:26:29.829806 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:26:29 crc kubenswrapper[4779]: E0320 15:26:29.830435 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:26:29 crc kubenswrapper[4779]: I0320 15:26:29.830490 4779 scope.go:117] "RemoveContainer" containerID="d5df3256fc301fb10f7bb94808c1ec30f56d47576b0e31daaf3bcd641773cbcc"
Mar 20 15:26:29 crc kubenswrapper[4779]: E0320 15:26:29.830464 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:26:30 crc kubenswrapper[4779]: I0320 15:26:30.651718 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lfj25_c30ee189-9db1-41af-8a55-29955cbf6712/kube-multus/1.log"
Mar 20 15:26:30 crc kubenswrapper[4779]: I0320 15:26:30.651804 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lfj25" event={"ID":"c30ee189-9db1-41af-8a55-29955cbf6712","Type":"ContainerStarted","Data":"73ee7f75298b875e96d17e303cd91ed74454ae5443ef86c092b8d91ef1008c68"}
Mar 20 15:26:30 crc kubenswrapper[4779]: I0320 15:26:30.808579 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx"
Mar 20 15:26:30 crc kubenswrapper[4779]: E0320 15:26:30.808717 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138"
Mar 20 15:26:30 crc kubenswrapper[4779]: I0320 15:26:30.808587 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:26:30 crc kubenswrapper[4779]: E0320 15:26:30.808795 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:26:31 crc kubenswrapper[4779]: I0320 15:26:31.807931 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:26:31 crc kubenswrapper[4779]: I0320 15:26:31.808212 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:26:31 crc kubenswrapper[4779]: E0320 15:26:31.808469 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:26:31 crc kubenswrapper[4779]: E0320 15:26:31.808567 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:26:32 crc kubenswrapper[4779]: I0320 15:26:32.808478 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx"
Mar 20 15:26:32 crc kubenswrapper[4779]: I0320 15:26:32.808478 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:26:32 crc kubenswrapper[4779]: E0320 15:26:32.808741 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l4gtx" podUID="44bdd151-2a1e-4f14-a095-81b541307138"
Mar 20 15:26:32 crc kubenswrapper[4779]: E0320 15:26:32.808791 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:26:32 crc kubenswrapper[4779]: I0320 15:26:32.890523 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:26:32 crc kubenswrapper[4779]: I0320 15:26:32.890681 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:26:32 crc kubenswrapper[4779]: I0320 15:26:32.890758 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:26:32 crc kubenswrapper[4779]: I0320 15:26:32.890788 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:26:32 crc kubenswrapper[4779]: E0320 15:26:32.890884 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:28:34.890860538 +0000 UTC m=+331.853376338 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:26:32 crc kubenswrapper[4779]: E0320 15:26:32.890916 4779 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 15:26:32 crc kubenswrapper[4779]: E0320 15:26:32.890882 4779 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 15:26:32 crc kubenswrapper[4779]: E0320 15:26:32.890937 4779 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 15:26:32 crc kubenswrapper[4779]: E0320 15:26:32.890937 4779 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 15:26:32 crc kubenswrapper[4779]: E0320 15:26:32.890951 4779 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:26:32 crc kubenswrapper[4779]: E0320 15:26:32.890968 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 15:28:34.8909612 +0000 UTC m=+331.853477000 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 15:26:32 crc kubenswrapper[4779]: E0320 15:26:32.891020 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 15:28:34.890983491 +0000 UTC m=+331.853499381 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 15:26:32 crc kubenswrapper[4779]: E0320 15:26:32.891040 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 15:28:34.891031732 +0000 UTC m=+331.853547662 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:26:32 crc kubenswrapper[4779]: I0320 15:26:32.991775 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:26:32 crc kubenswrapper[4779]: I0320 15:26:32.991869 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44bdd151-2a1e-4f14-a095-81b541307138-metrics-certs\") pod \"network-metrics-daemon-l4gtx\" (UID: \"44bdd151-2a1e-4f14-a095-81b541307138\") " pod="openshift-multus/network-metrics-daemon-l4gtx"
Mar 20 15:26:32 crc kubenswrapper[4779]: E0320 15:26:32.991968 4779 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 15:26:32 crc kubenswrapper[4779]: E0320 15:26:32.991971 4779 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 15:26:32 crc kubenswrapper[4779]: E0320 15:26:32.991997 4779 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 15:26:32 crc kubenswrapper[4779]: E0320 15:26:32.992013 4779 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:26:32 crc kubenswrapper[4779]: E0320 15:26:32.992013 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44bdd151-2a1e-4f14-a095-81b541307138-metrics-certs podName:44bdd151-2a1e-4f14-a095-81b541307138 nodeName:}" failed. No retries permitted until 2026-03-20 15:28:34.99199949 +0000 UTC m=+331.954515290 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/44bdd151-2a1e-4f14-a095-81b541307138-metrics-certs") pod "network-metrics-daemon-l4gtx" (UID: "44bdd151-2a1e-4f14-a095-81b541307138") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 15:26:32 crc kubenswrapper[4779]: E0320 15:26:32.992063 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 15:28:34.992051511 +0000 UTC m=+331.954567311 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:26:33 crc kubenswrapper[4779]: I0320 15:26:33.808388 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:26:33 crc kubenswrapper[4779]: I0320 15:26:33.808492 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:26:33 crc kubenswrapper[4779]: E0320 15:26:33.809578 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:26:33 crc kubenswrapper[4779]: E0320 15:26:33.809708 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:26:34 crc kubenswrapper[4779]: I0320 15:26:34.808319 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:26:34 crc kubenswrapper[4779]: I0320 15:26:34.808382 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx"
Mar 20 15:26:34 crc kubenswrapper[4779]: I0320 15:26:34.810894 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 20 15:26:34 crc kubenswrapper[4779]: I0320 15:26:34.811198 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 20 15:26:34 crc kubenswrapper[4779]: I0320 15:26:34.811376 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 20 15:26:34 crc kubenswrapper[4779]: I0320 15:26:34.811526 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 20 15:26:35 crc kubenswrapper[4779]: I0320 15:26:35.808720 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:26:35 crc kubenswrapper[4779]: I0320 15:26:35.808719 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:26:35 crc kubenswrapper[4779]: I0320 15:26:35.810742 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 20 15:26:35 crc kubenswrapper[4779]: I0320 15:26:35.810885 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 20 15:26:36 crc kubenswrapper[4779]: I0320 15:26:36.919681 4779 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Mar 20 15:26:36 crc kubenswrapper[4779]: I0320 15:26:36.948196 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-6mhgz"]
Mar 20 15:26:36 crc kubenswrapper[4779]: I0320 15:26:36.948657 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-6mhgz"
Mar 20 15:26:36 crc kubenswrapper[4779]: I0320 15:26:36.957882 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 20 15:26:36 crc kubenswrapper[4779]: I0320 15:26:36.958763 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 20 15:26:36 crc kubenswrapper[4779]: I0320 15:26:36.958805 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 20 15:26:36 crc kubenswrapper[4779]: I0320 15:26:36.959312 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 20 15:26:36 crc kubenswrapper[4779]: I0320 15:26:36.959743 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 20 15:26:36 crc kubenswrapper[4779]: I0320 15:26:36.960444 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 20 15:26:36 crc kubenswrapper[4779]: I0320 15:26:36.971709 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-sjbmt"]
Mar 20 15:26:36 crc kubenswrapper[4779]: I0320 15:26:36.972298 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-spxhw"]
Mar 20 15:26:36 crc kubenswrapper[4779]: I0320 15:26:36.973348 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ln9qd"]
Mar 20 15:26:36 crc kubenswrapper[4779]: I0320 15:26:36.973717 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-sjbmt"
Mar 20 15:26:36 crc kubenswrapper[4779]: I0320 15:26:36.974056 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-spxhw"
Mar 20 15:26:36 crc kubenswrapper[4779]: I0320 15:26:36.974445 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ln9qd"
Mar 20 15:26:36 crc kubenswrapper[4779]: I0320 15:26:36.977240 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-h46dt"]
Mar 20 15:26:36 crc kubenswrapper[4779]: I0320 15:26:36.977700 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h46dt"
Mar 20 15:26:36 crc kubenswrapper[4779]: I0320 15:26:36.980320 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n2qlf"]
Mar 20 15:26:36 crc kubenswrapper[4779]: I0320 15:26:36.980821 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n2qlf"
Mar 20 15:26:36 crc kubenswrapper[4779]: I0320 15:26:36.990498 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 20 15:26:36 crc kubenswrapper[4779]: I0320 15:26:36.990536 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.048822 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.049030 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.049125 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.049146 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.049172 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.049142 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.051042 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-qjjfz"]
Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.051424 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ljzk2"]
Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.051686 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-ljzk2"
Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.051795 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-qjjfz"
Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.058226 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.060984 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/443dd34a-3cde-4e9b-8014-a8dacc68727b-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ln9qd\" (UID: \"443dd34a-3cde-4e9b-8014-a8dacc68727b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ln9qd"
Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.061014 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4e22379-e251-4cf3-a9d5-f1c3c026eb79-serving-cert\") pod \"apiserver-76f77b778f-sjbmt\" (UID: \"e4e22379-e251-4cf3-a9d5-f1c3c026eb79\") " pod="openshift-apiserver/apiserver-76f77b778f-sjbmt"
Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.061034 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e4e22379-e251-4cf3-a9d5-f1c3c026eb79-encryption-config\") pod \"apiserver-76f77b778f-sjbmt\" (UID: \"e4e22379-e251-4cf3-a9d5-f1c3c026eb79\") " pod="openshift-apiserver/apiserver-76f77b778f-sjbmt"
Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.061053 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggwqn\" (UniqueName: \"kubernetes.io/projected/57d12985-7d5e-4c20-9b2c-9790d454fc4b-kube-api-access-ggwqn\") pod \"route-controller-manager-6576b87f9c-h46dt\" (UID: \"57d12985-7d5e-4c20-9b2c-9790d454fc4b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h46dt"
Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.061070 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b46569bb-4450-4b94-8615-e8c4a3afd495-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-spxhw\" (UID: \"b46569bb-4450-4b94-8615-e8c4a3afd495\") " pod="openshift-controller-manager/controller-manager-879f6c89f-spxhw"
Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.061085 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/443dd34a-3cde-4e9b-8014-a8dacc68727b-encryption-config\") pod \"apiserver-7bbb656c7d-ln9qd\" (UID: \"443dd34a-3cde-4e9b-8014-a8dacc68727b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ln9qd"
Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.061100 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4e22379-e251-4cf3-a9d5-f1c3c026eb79-config\") pod \"apiserver-76f77b778f-sjbmt\" (UID: \"e4e22379-e251-4cf3-a9d5-f1c3c026eb79\") " pod="openshift-apiserver/apiserver-76f77b778f-sjbmt"
Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.061132 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/443dd34a-3cde-4e9b-8014-a8dacc68727b-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ln9qd\" (UID: \"443dd34a-3cde-4e9b-8014-a8dacc68727b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ln9qd"
Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.061147 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e4e22379-e251-4cf3-a9d5-f1c3c026eb79-etcd-serving-ca\") pod \"apiserver-76f77b778f-sjbmt\" (UID: \"e4e22379-e251-4cf3-a9d5-f1c3c026eb79\") " pod="openshift-apiserver/apiserver-76f77b778f-sjbmt"
Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.061165 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6f3c10f-738c-46df-a158-0f59855e0e7d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-n2qlf\" (UID: \"f6f3c10f-738c-46df-a158-0f59855e0e7d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n2qlf"
Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.061190 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v22ch\" (UniqueName: \"kubernetes.io/projected/f6f3c10f-738c-46df-a158-0f59855e0e7d-kube-api-access-v22ch\") pod \"openshift-apiserver-operator-796bbdcf4f-n2qlf\" (UID: \"f6f3c10f-738c-46df-a158-0f59855e0e7d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n2qlf"
Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.061206 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57d12985-7d5e-4c20-9b2c-9790d454fc4b-config\") pod \"route-controller-manager-6576b87f9c-h46dt\" (UID: \"57d12985-7d5e-4c20-9b2c-9790d454fc4b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h46dt"
Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.061222 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e4e22379-e251-4cf3-a9d5-f1c3c026eb79-node-pullsecrets\") pod \"apiserver-76f77b778f-sjbmt\" (UID: \"e4e22379-e251-4cf3-a9d5-f1c3c026eb79\") " pod="openshift-apiserver/apiserver-76f77b778f-sjbmt"
Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.061236 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/57d12985-7d5e-4c20-9b2c-9790d454fc4b-client-ca\") pod \"route-controller-manager-6576b87f9c-h46dt\" (UID: \"57d12985-7d5e-4c20-9b2c-9790d454fc4b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h46dt"
Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.061253 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/443dd34a-3cde-4e9b-8014-a8dacc68727b-etcd-client\") pod \"apiserver-7bbb656c7d-ln9qd\" (UID: \"443dd34a-3cde-4e9b-8014-a8dacc68727b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ln9qd"
Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.061267 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/443dd34a-3cde-4e9b-8014-a8dacc68727b-serving-cert\") pod \"apiserver-7bbb656c7d-ln9qd\" (UID: \"443dd34a-3cde-4e9b-8014-a8dacc68727b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ln9qd"
Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.061280 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79j2t\" (UniqueName: \"kubernetes.io/projected/443dd34a-3cde-4e9b-8014-a8dacc68727b-kube-api-access-79j2t\") pod \"apiserver-7bbb656c7d-ln9qd\" (UID: \"443dd34a-3cde-4e9b-8014-a8dacc68727b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ln9qd"
Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.061297 4779
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4e22379-e251-4cf3-a9d5-f1c3c026eb79-trusted-ca-bundle\") pod \"apiserver-76f77b778f-sjbmt\" (UID: \"e4e22379-e251-4cf3-a9d5-f1c3c026eb79\") " pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.061313 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3987175b-df07-4970-9c38-fd7cc25a2586-images\") pod \"machine-api-operator-5694c8668f-6mhgz\" (UID: \"3987175b-df07-4970-9c38-fd7cc25a2586\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6mhgz" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.061331 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e4e22379-e251-4cf3-a9d5-f1c3c026eb79-image-import-ca\") pod \"apiserver-76f77b778f-sjbmt\" (UID: \"e4e22379-e251-4cf3-a9d5-f1c3c026eb79\") " pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.061352 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gszb\" (UniqueName: \"kubernetes.io/projected/b46569bb-4450-4b94-8615-e8c4a3afd495-kube-api-access-5gszb\") pod \"controller-manager-879f6c89f-spxhw\" (UID: \"b46569bb-4450-4b94-8615-e8c4a3afd495\") " pod="openshift-controller-manager/controller-manager-879f6c89f-spxhw" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.061367 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/443dd34a-3cde-4e9b-8014-a8dacc68727b-audit-dir\") pod \"apiserver-7bbb656c7d-ln9qd\" (UID: \"443dd34a-3cde-4e9b-8014-a8dacc68727b\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ln9qd" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.061383 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e4e22379-e251-4cf3-a9d5-f1c3c026eb79-audit-dir\") pod \"apiserver-76f77b778f-sjbmt\" (UID: \"e4e22379-e251-4cf3-a9d5-f1c3c026eb79\") " pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.061400 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3987175b-df07-4970-9c38-fd7cc25a2586-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-6mhgz\" (UID: \"3987175b-df07-4970-9c38-fd7cc25a2586\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6mhgz" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.061423 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e4e22379-e251-4cf3-a9d5-f1c3c026eb79-audit\") pod \"apiserver-76f77b778f-sjbmt\" (UID: \"e4e22379-e251-4cf3-a9d5-f1c3c026eb79\") " pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.061442 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z26zp\" (UniqueName: \"kubernetes.io/projected/e4e22379-e251-4cf3-a9d5-f1c3c026eb79-kube-api-access-z26zp\") pod \"apiserver-76f77b778f-sjbmt\" (UID: \"e4e22379-e251-4cf3-a9d5-f1c3c026eb79\") " pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.061457 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdflw\" (UniqueName: 
\"kubernetes.io/projected/3987175b-df07-4970-9c38-fd7cc25a2586-kube-api-access-jdflw\") pod \"machine-api-operator-5694c8668f-6mhgz\" (UID: \"3987175b-df07-4970-9c38-fd7cc25a2586\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6mhgz" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.061471 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b46569bb-4450-4b94-8615-e8c4a3afd495-serving-cert\") pod \"controller-manager-879f6c89f-spxhw\" (UID: \"b46569bb-4450-4b94-8615-e8c4a3afd495\") " pod="openshift-controller-manager/controller-manager-879f6c89f-spxhw" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.061489 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b46569bb-4450-4b94-8615-e8c4a3afd495-client-ca\") pod \"controller-manager-879f6c89f-spxhw\" (UID: \"b46569bb-4450-4b94-8615-e8c4a3afd495\") " pod="openshift-controller-manager/controller-manager-879f6c89f-spxhw" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.061508 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57d12985-7d5e-4c20-9b2c-9790d454fc4b-serving-cert\") pod \"route-controller-manager-6576b87f9c-h46dt\" (UID: \"57d12985-7d5e-4c20-9b2c-9790d454fc4b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h46dt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.061536 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b46569bb-4450-4b94-8615-e8c4a3afd495-config\") pod \"controller-manager-879f6c89f-spxhw\" (UID: \"b46569bb-4450-4b94-8615-e8c4a3afd495\") " pod="openshift-controller-manager/controller-manager-879f6c89f-spxhw" Mar 20 
15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.061557 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/443dd34a-3cde-4e9b-8014-a8dacc68727b-audit-policies\") pod \"apiserver-7bbb656c7d-ln9qd\" (UID: \"443dd34a-3cde-4e9b-8014-a8dacc68727b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ln9qd" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.061575 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6f3c10f-738c-46df-a158-0f59855e0e7d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-n2qlf\" (UID: \"f6f3c10f-738c-46df-a158-0f59855e0e7d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n2qlf" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.061591 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e4e22379-e251-4cf3-a9d5-f1c3c026eb79-etcd-client\") pod \"apiserver-76f77b778f-sjbmt\" (UID: \"e4e22379-e251-4cf3-a9d5-f1c3c026eb79\") " pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.061605 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3987175b-df07-4970-9c38-fd7cc25a2586-config\") pod \"machine-api-operator-5694c8668f-6mhgz\" (UID: \"3987175b-df07-4970-9c38-fd7cc25a2586\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6mhgz" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.066199 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.070187 4779 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.080045 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.080410 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.080508 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.080607 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.080802 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.080830 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.080915 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.081035 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.081238 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.081421 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.081562 4779 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.081713 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.081935 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.082153 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.082310 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.082454 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.082584 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.082592 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.082712 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.082823 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-btkk5"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.083135 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 15:26:37 crc kubenswrapper[4779]: 
I0320 15:26:37.083200 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.083265 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-btkk5" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.083390 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.083593 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.083710 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.083918 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.084190 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.084344 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.084543 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-g25w2"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.085191 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g25w2" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.087307 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.088996 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.089143 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.089441 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.089555 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.089657 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.089769 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.092323 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-6m957"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.092874 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zv9zm"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.093220 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zv9zm" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.093593 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6m957" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.104035 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldh2w"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.104578 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-h48bp"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.104904 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-sl2tf"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.105296 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-sl2tf" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.105658 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldh2w" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.105880 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.107951 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9sgfn"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.108450 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-tdtkz"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.125974 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9sgfn" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.126005 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tx4fd"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.126481 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.126593 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-tdtkz" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.128475 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.128755 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.128910 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.128949 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.129095 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.129248 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.131195 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-t8tpv"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.135890 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.136321 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.137550 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.137574 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.137731 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.137826 4779 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.137956 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.138176 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.138406 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.138663 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.138740 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.138764 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.138796 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.138828 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.138861 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.138894 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.138808 4779 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.138973 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.139511 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-t8tpv" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.141999 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-57vjj"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.142393 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dzmcg"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.142697 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dzmcg" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.142865 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-57vjj" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.149390 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.150756 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-kq4bv"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.152072 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.152071 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.152253 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.152321 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.152460 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.152521 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cwvxv"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.152862 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cwvxv" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.153061 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kq4bv" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.153578 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.153585 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.153784 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.153926 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.155559 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.155740 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.155944 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.156125 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.156221 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 15:26:37 
crc kubenswrapper[4779]: I0320 15:26:37.156423 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.156518 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.156550 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.156698 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.156849 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.157059 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.157934 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.158050 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-6mhgz"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.158255 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.159317 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.162284 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 
20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.163918 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.165959 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.166039 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.167369 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc1adb9e-7682-482c-8162-a9bf3acdfa5b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-zv9zm\" (UID: \"bc1adb9e-7682-482c-8162-a9bf3acdfa5b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zv9zm" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.167402 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggwqn\" (UniqueName: \"kubernetes.io/projected/57d12985-7d5e-4c20-9b2c-9790d454fc4b-kube-api-access-ggwqn\") pod \"route-controller-manager-6576b87f9c-h46dt\" (UID: \"57d12985-7d5e-4c20-9b2c-9790d454fc4b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h46dt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.167420 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b46569bb-4450-4b94-8615-e8c4a3afd495-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-spxhw\" (UID: \"b46569bb-4450-4b94-8615-e8c4a3afd495\") " pod="openshift-controller-manager/controller-manager-879f6c89f-spxhw" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.167440 4779 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/443dd34a-3cde-4e9b-8014-a8dacc68727b-encryption-config\") pod \"apiserver-7bbb656c7d-ln9qd\" (UID: \"443dd34a-3cde-4e9b-8014-a8dacc68727b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ln9qd" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.167456 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4e22379-e251-4cf3-a9d5-f1c3c026eb79-config\") pod \"apiserver-76f77b778f-sjbmt\" (UID: \"e4e22379-e251-4cf3-a9d5-f1c3c026eb79\") " pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.167475 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9966z\" (UniqueName: \"kubernetes.io/projected/e52093ea-9241-43c1-ae08-d9e87beed327-kube-api-access-9966z\") pod \"downloads-7954f5f757-qjjfz\" (UID: \"e52093ea-9241-43c1-ae08-d9e87beed327\") " pod="openshift-console/downloads-7954f5f757-qjjfz" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.167492 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9c3e8f05-0ca8-4324-b68d-4febc2443832-available-featuregates\") pod \"openshift-config-operator-7777fb866f-g25w2\" (UID: \"9c3e8f05-0ca8-4324-b68d-4febc2443832\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g25w2" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.167509 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/443dd34a-3cde-4e9b-8014-a8dacc68727b-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ln9qd\" (UID: \"443dd34a-3cde-4e9b-8014-a8dacc68727b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ln9qd" Mar 20 
15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.167523 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e4e22379-e251-4cf3-a9d5-f1c3c026eb79-etcd-serving-ca\") pod \"apiserver-76f77b778f-sjbmt\" (UID: \"e4e22379-e251-4cf3-a9d5-f1c3c026eb79\") " pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.167538 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6f3c10f-738c-46df-a158-0f59855e0e7d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-n2qlf\" (UID: \"f6f3c10f-738c-46df-a158-0f59855e0e7d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n2qlf" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.167577 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v22ch\" (UniqueName: \"kubernetes.io/projected/f6f3c10f-738c-46df-a158-0f59855e0e7d-kube-api-access-v22ch\") pod \"openshift-apiserver-operator-796bbdcf4f-n2qlf\" (UID: \"f6f3c10f-738c-46df-a158-0f59855e0e7d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n2qlf" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.167595 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9940c61f-5e5d-4ffb-99e2-0ced2ccc225d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ljzk2\" (UID: \"9940c61f-5e5d-4ffb-99e2-0ced2ccc225d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ljzk2" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.167612 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/defe5875-5311-4ccc-9360-a0e55e2ccdb9-serving-cert\") pod \"console-operator-58897d9998-btkk5\" (UID: \"defe5875-5311-4ccc-9360-a0e55e2ccdb9\") " pod="openshift-console-operator/console-operator-58897d9998-btkk5" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.167631 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/68436d48-9e60-4450-a505-e75eaabb5e20-machine-approver-tls\") pod \"machine-approver-56656f9798-6m957\" (UID: \"68436d48-9e60-4450-a505-e75eaabb5e20\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6m957" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.167647 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57d12985-7d5e-4c20-9b2c-9790d454fc4b-config\") pod \"route-controller-manager-6576b87f9c-h46dt\" (UID: \"57d12985-7d5e-4c20-9b2c-9790d454fc4b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h46dt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.167661 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e4e22379-e251-4cf3-a9d5-f1c3c026eb79-node-pullsecrets\") pod \"apiserver-76f77b778f-sjbmt\" (UID: \"e4e22379-e251-4cf3-a9d5-f1c3c026eb79\") " pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.167678 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9940c61f-5e5d-4ffb-99e2-0ced2ccc225d-config\") pod \"authentication-operator-69f744f599-ljzk2\" (UID: \"9940c61f-5e5d-4ffb-99e2-0ced2ccc225d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ljzk2" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 
15:26:37.167693 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/57d12985-7d5e-4c20-9b2c-9790d454fc4b-client-ca\") pod \"route-controller-manager-6576b87f9c-h46dt\" (UID: \"57d12985-7d5e-4c20-9b2c-9790d454fc4b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h46dt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.167712 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccddk\" (UniqueName: \"kubernetes.io/projected/68436d48-9e60-4450-a505-e75eaabb5e20-kube-api-access-ccddk\") pod \"machine-approver-56656f9798-6m957\" (UID: \"68436d48-9e60-4450-a505-e75eaabb5e20\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6m957" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.167735 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/443dd34a-3cde-4e9b-8014-a8dacc68727b-etcd-client\") pod \"apiserver-7bbb656c7d-ln9qd\" (UID: \"443dd34a-3cde-4e9b-8014-a8dacc68727b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ln9qd" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.167754 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/defe5875-5311-4ccc-9360-a0e55e2ccdb9-trusted-ca\") pod \"console-operator-58897d9998-btkk5\" (UID: \"defe5875-5311-4ccc-9360-a0e55e2ccdb9\") " pod="openshift-console-operator/console-operator-58897d9998-btkk5" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.167775 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c3e8f05-0ca8-4324-b68d-4febc2443832-serving-cert\") pod \"openshift-config-operator-7777fb866f-g25w2\" (UID: 
\"9c3e8f05-0ca8-4324-b68d-4febc2443832\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g25w2" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.167791 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68436d48-9e60-4450-a505-e75eaabb5e20-config\") pod \"machine-approver-56656f9798-6m957\" (UID: \"68436d48-9e60-4450-a505-e75eaabb5e20\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6m957" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.167806 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc1adb9e-7682-482c-8162-a9bf3acdfa5b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-zv9zm\" (UID: \"bc1adb9e-7682-482c-8162-a9bf3acdfa5b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zv9zm" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.167824 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/443dd34a-3cde-4e9b-8014-a8dacc68727b-serving-cert\") pod \"apiserver-7bbb656c7d-ln9qd\" (UID: \"443dd34a-3cde-4e9b-8014-a8dacc68727b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ln9qd" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.167842 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79j2t\" (UniqueName: \"kubernetes.io/projected/443dd34a-3cde-4e9b-8014-a8dacc68727b-kube-api-access-79j2t\") pod \"apiserver-7bbb656c7d-ln9qd\" (UID: \"443dd34a-3cde-4e9b-8014-a8dacc68727b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ln9qd" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.167859 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4e22379-e251-4cf3-a9d5-f1c3c026eb79-trusted-ca-bundle\") pod \"apiserver-76f77b778f-sjbmt\" (UID: \"e4e22379-e251-4cf3-a9d5-f1c3c026eb79\") " pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.167875 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3987175b-df07-4970-9c38-fd7cc25a2586-images\") pod \"machine-api-operator-5694c8668f-6mhgz\" (UID: \"3987175b-df07-4970-9c38-fd7cc25a2586\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6mhgz" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.167891 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e4e22379-e251-4cf3-a9d5-f1c3c026eb79-image-import-ca\") pod \"apiserver-76f77b778f-sjbmt\" (UID: \"e4e22379-e251-4cf3-a9d5-f1c3c026eb79\") " pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.167915 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gszb\" (UniqueName: \"kubernetes.io/projected/b46569bb-4450-4b94-8615-e8c4a3afd495-kube-api-access-5gszb\") pod \"controller-manager-879f6c89f-spxhw\" (UID: \"b46569bb-4450-4b94-8615-e8c4a3afd495\") " pod="openshift-controller-manager/controller-manager-879f6c89f-spxhw" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.167931 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/68436d48-9e60-4450-a505-e75eaabb5e20-auth-proxy-config\") pod \"machine-approver-56656f9798-6m957\" (UID: \"68436d48-9e60-4450-a505-e75eaabb5e20\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6m957" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 
15:26:37.167947 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/443dd34a-3cde-4e9b-8014-a8dacc68727b-audit-dir\") pod \"apiserver-7bbb656c7d-ln9qd\" (UID: \"443dd34a-3cde-4e9b-8014-a8dacc68727b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ln9qd" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.167963 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e4e22379-e251-4cf3-a9d5-f1c3c026eb79-audit-dir\") pod \"apiserver-76f77b778f-sjbmt\" (UID: \"e4e22379-e251-4cf3-a9d5-f1c3c026eb79\") " pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.167978 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3987175b-df07-4970-9c38-fd7cc25a2586-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-6mhgz\" (UID: \"3987175b-df07-4970-9c38-fd7cc25a2586\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6mhgz" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.167995 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmrp6\" (UniqueName: \"kubernetes.io/projected/defe5875-5311-4ccc-9360-a0e55e2ccdb9-kube-api-access-jmrp6\") pod \"console-operator-58897d9998-btkk5\" (UID: \"defe5875-5311-4ccc-9360-a0e55e2ccdb9\") " pod="openshift-console-operator/console-operator-58897d9998-btkk5" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.168015 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdflw\" (UniqueName: \"kubernetes.io/projected/3987175b-df07-4970-9c38-fd7cc25a2586-kube-api-access-jdflw\") pod \"machine-api-operator-5694c8668f-6mhgz\" (UID: \"3987175b-df07-4970-9c38-fd7cc25a2586\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-6mhgz" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.168037 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e4e22379-e251-4cf3-a9d5-f1c3c026eb79-audit\") pod \"apiserver-76f77b778f-sjbmt\" (UID: \"e4e22379-e251-4cf3-a9d5-f1c3c026eb79\") " pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.168054 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z26zp\" (UniqueName: \"kubernetes.io/projected/e4e22379-e251-4cf3-a9d5-f1c3c026eb79-kube-api-access-z26zp\") pod \"apiserver-76f77b778f-sjbmt\" (UID: \"e4e22379-e251-4cf3-a9d5-f1c3c026eb79\") " pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.168072 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b46569bb-4450-4b94-8615-e8c4a3afd495-client-ca\") pod \"controller-manager-879f6c89f-spxhw\" (UID: \"b46569bb-4450-4b94-8615-e8c4a3afd495\") " pod="openshift-controller-manager/controller-manager-879f6c89f-spxhw" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.168086 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b46569bb-4450-4b94-8615-e8c4a3afd495-serving-cert\") pod \"controller-manager-879f6c89f-spxhw\" (UID: \"b46569bb-4450-4b94-8615-e8c4a3afd495\") " pod="openshift-controller-manager/controller-manager-879f6c89f-spxhw" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.168935 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.171325 4779 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wfm5h"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.171482 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.172427 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m2xxf"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.172777 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9940c61f-5e5d-4ffb-99e2-0ced2ccc225d-service-ca-bundle\") pod \"authentication-operator-69f744f599-ljzk2\" (UID: \"9940c61f-5e5d-4ffb-99e2-0ced2ccc225d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ljzk2" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.172825 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57d12985-7d5e-4c20-9b2c-9790d454fc4b-serving-cert\") pod \"route-controller-manager-6576b87f9c-h46dt\" (UID: \"57d12985-7d5e-4c20-9b2c-9790d454fc4b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h46dt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.172847 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2tfd\" (UniqueName: \"kubernetes.io/projected/9940c61f-5e5d-4ffb-99e2-0ced2ccc225d-kube-api-access-q2tfd\") pod \"authentication-operator-69f744f599-ljzk2\" (UID: \"9940c61f-5e5d-4ffb-99e2-0ced2ccc225d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ljzk2" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.172882 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/b46569bb-4450-4b94-8615-e8c4a3afd495-config\") pod \"controller-manager-879f6c89f-spxhw\" (UID: \"b46569bb-4450-4b94-8615-e8c4a3afd495\") " pod="openshift-controller-manager/controller-manager-879f6c89f-spxhw" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.172908 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/443dd34a-3cde-4e9b-8014-a8dacc68727b-audit-policies\") pod \"apiserver-7bbb656c7d-ln9qd\" (UID: \"443dd34a-3cde-4e9b-8014-a8dacc68727b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ln9qd" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.172925 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/defe5875-5311-4ccc-9360-a0e55e2ccdb9-config\") pod \"console-operator-58897d9998-btkk5\" (UID: \"defe5875-5311-4ccc-9360-a0e55e2ccdb9\") " pod="openshift-console-operator/console-operator-58897d9998-btkk5" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.172955 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6f3c10f-738c-46df-a158-0f59855e0e7d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-n2qlf\" (UID: \"f6f3c10f-738c-46df-a158-0f59855e0e7d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n2qlf" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.172972 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xscgn\" (UniqueName: \"kubernetes.io/projected/9c3e8f05-0ca8-4324-b68d-4febc2443832-kube-api-access-xscgn\") pod \"openshift-config-operator-7777fb866f-g25w2\" (UID: \"9c3e8f05-0ca8-4324-b68d-4febc2443832\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g25w2" Mar 20 
15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.172991 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e4e22379-e251-4cf3-a9d5-f1c3c026eb79-etcd-client\") pod \"apiserver-76f77b778f-sjbmt\" (UID: \"e4e22379-e251-4cf3-a9d5-f1c3c026eb79\") " pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.173007 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3987175b-df07-4970-9c38-fd7cc25a2586-config\") pod \"machine-api-operator-5694c8668f-6mhgz\" (UID: \"3987175b-df07-4970-9c38-fd7cc25a2586\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6mhgz" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.173019 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qswvl"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.173405 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/443dd34a-3cde-4e9b-8014-a8dacc68727b-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ln9qd\" (UID: \"443dd34a-3cde-4e9b-8014-a8dacc68727b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ln9qd" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.173459 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qswvl" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.173899 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wfm5h" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.176181 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4e22379-e251-4cf3-a9d5-f1c3c026eb79-config\") pod \"apiserver-76f77b778f-sjbmt\" (UID: \"e4e22379-e251-4cf3-a9d5-f1c3c026eb79\") " pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.176302 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.177162 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6f3c10f-738c-46df-a158-0f59855e0e7d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-n2qlf\" (UID: \"f6f3c10f-738c-46df-a158-0f59855e0e7d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n2qlf" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.177387 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b46569bb-4450-4b94-8615-e8c4a3afd495-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-spxhw\" (UID: \"b46569bb-4450-4b94-8615-e8c4a3afd495\") " pod="openshift-controller-manager/controller-manager-879f6c89f-spxhw" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.177861 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/443dd34a-3cde-4e9b-8014-a8dacc68727b-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ln9qd\" (UID: \"443dd34a-3cde-4e9b-8014-a8dacc68727b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ln9qd" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.177883 4779 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.178432 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e4e22379-e251-4cf3-a9d5-f1c3c026eb79-etcd-serving-ca\") pod \"apiserver-76f77b778f-sjbmt\" (UID: \"e4e22379-e251-4cf3-a9d5-f1c3c026eb79\") " pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.178674 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e4e22379-e251-4cf3-a9d5-f1c3c026eb79-node-pullsecrets\") pod \"apiserver-76f77b778f-sjbmt\" (UID: \"e4e22379-e251-4cf3-a9d5-f1c3c026eb79\") " pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.178761 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m2xxf" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.173024 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/443dd34a-3cde-4e9b-8014-a8dacc68727b-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ln9qd\" (UID: \"443dd34a-3cde-4e9b-8014-a8dacc68727b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ln9qd" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.179126 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt68l\" (UniqueName: \"kubernetes.io/projected/bc1adb9e-7682-482c-8162-a9bf3acdfa5b-kube-api-access-kt68l\") pod \"openshift-controller-manager-operator-756b6f6bc6-zv9zm\" (UID: \"bc1adb9e-7682-482c-8162-a9bf3acdfa5b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zv9zm" 
Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.179206 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4e22379-e251-4cf3-a9d5-f1c3c026eb79-serving-cert\") pod \"apiserver-76f77b778f-sjbmt\" (UID: \"e4e22379-e251-4cf3-a9d5-f1c3c026eb79\") " pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.179254 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e4e22379-e251-4cf3-a9d5-f1c3c026eb79-encryption-config\") pod \"apiserver-76f77b778f-sjbmt\" (UID: \"e4e22379-e251-4cf3-a9d5-f1c3c026eb79\") " pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.179294 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9940c61f-5e5d-4ffb-99e2-0ced2ccc225d-serving-cert\") pod \"authentication-operator-69f744f599-ljzk2\" (UID: \"9940c61f-5e5d-4ffb-99e2-0ced2ccc225d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ljzk2" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.179494 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-k68xk"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.179691 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57d12985-7d5e-4c20-9b2c-9790d454fc4b-config\") pod \"route-controller-manager-6576b87f9c-h46dt\" (UID: \"57d12985-7d5e-4c20-9b2c-9790d454fc4b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h46dt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.189610 4779 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-qkwpf"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.189876 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-k68xk" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.193187 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b46569bb-4450-4b94-8615-e8c4a3afd495-client-ca\") pod \"controller-manager-879f6c89f-spxhw\" (UID: \"b46569bb-4450-4b94-8615-e8c4a3afd495\") " pod="openshift-controller-manager/controller-manager-879f6c89f-spxhw" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.193475 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/443dd34a-3cde-4e9b-8014-a8dacc68727b-etcd-client\") pod \"apiserver-7bbb656c7d-ln9qd\" (UID: \"443dd34a-3cde-4e9b-8014-a8dacc68727b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ln9qd" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.195207 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4e22379-e251-4cf3-a9d5-f1c3c026eb79-trusted-ca-bundle\") pod \"apiserver-76f77b778f-sjbmt\" (UID: \"e4e22379-e251-4cf3-a9d5-f1c3c026eb79\") " pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.195298 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/443dd34a-3cde-4e9b-8014-a8dacc68727b-audit-dir\") pod \"apiserver-7bbb656c7d-ln9qd\" (UID: \"443dd34a-3cde-4e9b-8014-a8dacc68727b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ln9qd" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.195939 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" 
(UniqueName: \"kubernetes.io/configmap/443dd34a-3cde-4e9b-8014-a8dacc68727b-audit-policies\") pod \"apiserver-7bbb656c7d-ln9qd\" (UID: \"443dd34a-3cde-4e9b-8014-a8dacc68727b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ln9qd" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.197257 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/443dd34a-3cde-4e9b-8014-a8dacc68727b-serving-cert\") pod \"apiserver-7bbb656c7d-ln9qd\" (UID: \"443dd34a-3cde-4e9b-8014-a8dacc68727b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ln9qd" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.197324 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3987175b-df07-4970-9c38-fd7cc25a2586-images\") pod \"machine-api-operator-5694c8668f-6mhgz\" (UID: \"3987175b-df07-4970-9c38-fd7cc25a2586\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6mhgz" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.197425 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/57d12985-7d5e-4c20-9b2c-9790d454fc4b-client-ca\") pod \"route-controller-manager-6576b87f9c-h46dt\" (UID: \"57d12985-7d5e-4c20-9b2c-9790d454fc4b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h46dt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.197768 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e4e22379-e251-4cf3-a9d5-f1c3c026eb79-image-import-ca\") pod \"apiserver-76f77b778f-sjbmt\" (UID: \"e4e22379-e251-4cf3-a9d5-f1c3c026eb79\") " pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.197933 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e4e22379-e251-4cf3-a9d5-f1c3c026eb79-serving-cert\") pod \"apiserver-76f77b778f-sjbmt\" (UID: \"e4e22379-e251-4cf3-a9d5-f1c3c026eb79\") " pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.197999 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e4e22379-e251-4cf3-a9d5-f1c3c026eb79-audit-dir\") pod \"apiserver-76f77b778f-sjbmt\" (UID: \"e4e22379-e251-4cf3-a9d5-f1c3c026eb79\") " pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.199034 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3987175b-df07-4970-9c38-fd7cc25a2586-config\") pod \"machine-api-operator-5694c8668f-6mhgz\" (UID: \"3987175b-df07-4970-9c38-fd7cc25a2586\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6mhgz" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.202051 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e4e22379-e251-4cf3-a9d5-f1c3c026eb79-audit\") pod \"apiserver-76f77b778f-sjbmt\" (UID: \"e4e22379-e251-4cf3-a9d5-f1c3c026eb79\") " pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.202138 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.202403 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-vhpkh"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.203781 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qkwpf" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.215670 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/443dd34a-3cde-4e9b-8014-a8dacc68727b-encryption-config\") pod \"apiserver-7bbb656c7d-ln9qd\" (UID: \"443dd34a-3cde-4e9b-8014-a8dacc68727b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ln9qd" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.215681 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6ghwq"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.215785 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vhpkh" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.216136 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.216766 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b46569bb-4450-4b94-8615-e8c4a3afd495-config\") pod \"controller-manager-879f6c89f-spxhw\" (UID: \"b46569bb-4450-4b94-8615-e8c4a3afd495\") " pod="openshift-controller-manager/controller-manager-879f6c89f-spxhw" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.216878 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-6ghwq" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.216986 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6f3c10f-738c-46df-a158-0f59855e0e7d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-n2qlf\" (UID: \"f6f3c10f-738c-46df-a158-0f59855e0e7d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n2qlf" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.217895 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3987175b-df07-4970-9c38-fd7cc25a2586-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-6mhgz\" (UID: \"3987175b-df07-4970-9c38-fd7cc25a2586\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6mhgz" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.220014 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b46569bb-4450-4b94-8615-e8c4a3afd495-serving-cert\") pod \"controller-manager-879f6c89f-spxhw\" (UID: \"b46569bb-4450-4b94-8615-e8c4a3afd495\") " pod="openshift-controller-manager/controller-manager-879f6c89f-spxhw" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.222529 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57d12985-7d5e-4c20-9b2c-9790d454fc4b-serving-cert\") pod \"route-controller-manager-6576b87f9c-h46dt\" (UID: \"57d12985-7d5e-4c20-9b2c-9790d454fc4b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h46dt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.231677 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/e4e22379-e251-4cf3-a9d5-f1c3c026eb79-encryption-config\") pod \"apiserver-76f77b778f-sjbmt\" (UID: \"e4e22379-e251-4cf3-a9d5-f1c3c026eb79\") " pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.231679 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.243120 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e4e22379-e251-4cf3-a9d5-f1c3c026eb79-etcd-client\") pod \"apiserver-76f77b778f-sjbmt\" (UID: \"e4e22379-e251-4cf3-a9d5-f1c3c026eb79\") " pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.245608 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-5rp8w"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.246141 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5rp8w" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.247764 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mflh2"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.248214 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mflh2" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.248574 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.249979 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7h5q2"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.250440 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.250819 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7h5q2" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.251574 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-klv5b"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.252154 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-klv5b" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.254416 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566995-zptlb"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.254782 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567006-jlkgl"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.255087 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-jq2mp"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.255497 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-jq2mp" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.256098 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566995-zptlb" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.257418 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567006-jlkgl" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.257946 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-sjbmt"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.258144 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4ncn"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.258524 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4ncn" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.260080 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rq8f8"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.260678 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rq8f8" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.261684 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-n67zh"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.262572 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-n67zh" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.262755 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n2qlf"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.263891 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-spxhw"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.265217 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ln9qd"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.267344 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-h46dt"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.267836 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ljzk2"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.268879 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-g25w2"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.270555 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.276086 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9sgfn"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.277452 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-sl2tf"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.278674 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wfm5h"] Mar 20 
15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.279771 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c3e8f05-0ca8-4324-b68d-4febc2443832-serving-cert\") pod \"openshift-config-operator-7777fb866f-g25w2\" (UID: \"9c3e8f05-0ca8-4324-b68d-4febc2443832\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g25w2" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.279805 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/849b44ce-b317-4a93-a453-4f36844fbff8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-cwvxv\" (UID: \"849b44ce-b317-4a93-a453-4f36844fbff8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cwvxv" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.279826 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc1adb9e-7682-482c-8162-a9bf3acdfa5b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-zv9zm\" (UID: \"bc1adb9e-7682-482c-8162-a9bf3acdfa5b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zv9zm" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.279857 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68436d48-9e60-4450-a505-e75eaabb5e20-config\") pod \"machine-approver-56656f9798-6m957\" (UID: \"68436d48-9e60-4450-a505-e75eaabb5e20\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6m957" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.279881 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2fa6319a-1080-4ab8-80f3-38bde93c4b47-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-57vjj\" (UID: \"2fa6319a-1080-4ab8-80f3-38bde93c4b47\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-57vjj" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.279912 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/68436d48-9e60-4450-a505-e75eaabb5e20-auth-proxy-config\") pod \"machine-approver-56656f9798-6m957\" (UID: \"68436d48-9e60-4450-a505-e75eaabb5e20\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6m957" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.279932 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmrp6\" (UniqueName: \"kubernetes.io/projected/defe5875-5311-4ccc-9360-a0e55e2ccdb9-kube-api-access-jmrp6\") pod \"console-operator-58897d9998-btkk5\" (UID: \"defe5875-5311-4ccc-9360-a0e55e2ccdb9\") " pod="openshift-console-operator/console-operator-58897d9998-btkk5" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.279950 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2fa6319a-1080-4ab8-80f3-38bde93c4b47-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-57vjj\" (UID: \"2fa6319a-1080-4ab8-80f3-38bde93c4b47\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-57vjj" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.279984 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9940c61f-5e5d-4ffb-99e2-0ced2ccc225d-service-ca-bundle\") pod \"authentication-operator-69f744f599-ljzk2\" (UID: \"9940c61f-5e5d-4ffb-99e2-0ced2ccc225d\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-ljzk2" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.280001 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2tfd\" (UniqueName: \"kubernetes.io/projected/9940c61f-5e5d-4ffb-99e2-0ced2ccc225d-kube-api-access-q2tfd\") pod \"authentication-operator-69f744f599-ljzk2\" (UID: \"9940c61f-5e5d-4ffb-99e2-0ced2ccc225d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ljzk2" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.280023 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d120d68e-0458-4755-9ca7-3a34c000842d-etcd-client\") pod \"etcd-operator-b45778765-tdtkz\" (UID: \"d120d68e-0458-4755-9ca7-3a34c000842d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tdtkz" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.280040 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/defe5875-5311-4ccc-9360-a0e55e2ccdb9-config\") pod \"console-operator-58897d9998-btkk5\" (UID: \"defe5875-5311-4ccc-9360-a0e55e2ccdb9\") " pod="openshift-console-operator/console-operator-58897d9998-btkk5" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.280062 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xscgn\" (UniqueName: \"kubernetes.io/projected/9c3e8f05-0ca8-4324-b68d-4febc2443832-kube-api-access-xscgn\") pod \"openshift-config-operator-7777fb866f-g25w2\" (UID: \"9c3e8f05-0ca8-4324-b68d-4febc2443832\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g25w2" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.280082 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt68l\" (UniqueName: 
\"kubernetes.io/projected/bc1adb9e-7682-482c-8162-a9bf3acdfa5b-kube-api-access-kt68l\") pod \"openshift-controller-manager-operator-756b6f6bc6-zv9zm\" (UID: \"bc1adb9e-7682-482c-8162-a9bf3acdfa5b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zv9zm" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.280100 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d120d68e-0458-4755-9ca7-3a34c000842d-etcd-ca\") pod \"etcd-operator-b45778765-tdtkz\" (UID: \"d120d68e-0458-4755-9ca7-3a34c000842d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tdtkz" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.280133 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d120d68e-0458-4755-9ca7-3a34c000842d-etcd-service-ca\") pod \"etcd-operator-b45778765-tdtkz\" (UID: \"d120d68e-0458-4755-9ca7-3a34c000842d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tdtkz" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.280149 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbhv4\" (UniqueName: \"kubernetes.io/projected/849b44ce-b317-4a93-a453-4f36844fbff8-kube-api-access-kbhv4\") pod \"control-plane-machine-set-operator-78cbb6b69f-cwvxv\" (UID: \"849b44ce-b317-4a93-a453-4f36844fbff8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cwvxv" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.280167 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9940c61f-5e5d-4ffb-99e2-0ced2ccc225d-serving-cert\") pod \"authentication-operator-69f744f599-ljzk2\" (UID: \"9940c61f-5e5d-4ffb-99e2-0ced2ccc225d\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-ljzk2" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.280186 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5276l\" (UniqueName: \"kubernetes.io/projected/d120d68e-0458-4755-9ca7-3a34c000842d-kube-api-access-5276l\") pod \"etcd-operator-b45778765-tdtkz\" (UID: \"d120d68e-0458-4755-9ca7-3a34c000842d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tdtkz" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.280207 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/573b21fc-852a-45a1-b9f3-690a8bbda54f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-ldh2w\" (UID: \"573b21fc-852a-45a1-b9f3-690a8bbda54f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldh2w" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.280231 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc1adb9e-7682-482c-8162-a9bf3acdfa5b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-zv9zm\" (UID: \"bc1adb9e-7682-482c-8162-a9bf3acdfa5b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zv9zm" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.280249 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9966z\" (UniqueName: \"kubernetes.io/projected/e52093ea-9241-43c1-ae08-d9e87beed327-kube-api-access-9966z\") pod \"downloads-7954f5f757-qjjfz\" (UID: \"e52093ea-9241-43c1-ae08-d9e87beed327\") " pod="openshift-console/downloads-7954f5f757-qjjfz" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.280267 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9c3e8f05-0ca8-4324-b68d-4febc2443832-available-featuregates\") pod \"openshift-config-operator-7777fb866f-g25w2\" (UID: \"9c3e8f05-0ca8-4324-b68d-4febc2443832\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g25w2" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.280286 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d120d68e-0458-4755-9ca7-3a34c000842d-config\") pod \"etcd-operator-b45778765-tdtkz\" (UID: \"d120d68e-0458-4755-9ca7-3a34c000842d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tdtkz" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.280314 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9940c61f-5e5d-4ffb-99e2-0ced2ccc225d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ljzk2\" (UID: \"9940c61f-5e5d-4ffb-99e2-0ced2ccc225d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ljzk2" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.280330 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/defe5875-5311-4ccc-9360-a0e55e2ccdb9-serving-cert\") pod \"console-operator-58897d9998-btkk5\" (UID: \"defe5875-5311-4ccc-9360-a0e55e2ccdb9\") " pod="openshift-console-operator/console-operator-58897d9998-btkk5" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.280347 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/68436d48-9e60-4450-a505-e75eaabb5e20-machine-approver-tls\") pod \"machine-approver-56656f9798-6m957\" (UID: \"68436d48-9e60-4450-a505-e75eaabb5e20\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6m957" 
Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.280364 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d120d68e-0458-4755-9ca7-3a34c000842d-serving-cert\") pod \"etcd-operator-b45778765-tdtkz\" (UID: \"d120d68e-0458-4755-9ca7-3a34c000842d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tdtkz" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.280380 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vlbz\" (UniqueName: \"kubernetes.io/projected/573b21fc-852a-45a1-b9f3-690a8bbda54f-kube-api-access-9vlbz\") pod \"cluster-samples-operator-665b6dd947-ldh2w\" (UID: \"573b21fc-852a-45a1-b9f3-690a8bbda54f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldh2w" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.280399 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9940c61f-5e5d-4ffb-99e2-0ced2ccc225d-config\") pod \"authentication-operator-69f744f599-ljzk2\" (UID: \"9940c61f-5e5d-4ffb-99e2-0ced2ccc225d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ljzk2" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.280414 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fa6319a-1080-4ab8-80f3-38bde93c4b47-config\") pod \"kube-apiserver-operator-766d6c64bb-57vjj\" (UID: \"2fa6319a-1080-4ab8-80f3-38bde93c4b47\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-57vjj" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.280431 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccddk\" (UniqueName: 
\"kubernetes.io/projected/68436d48-9e60-4450-a505-e75eaabb5e20-kube-api-access-ccddk\") pod \"machine-approver-56656f9798-6m957\" (UID: \"68436d48-9e60-4450-a505-e75eaabb5e20\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6m957" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.280446 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/defe5875-5311-4ccc-9360-a0e55e2ccdb9-trusted-ca\") pod \"console-operator-58897d9998-btkk5\" (UID: \"defe5875-5311-4ccc-9360-a0e55e2ccdb9\") " pod="openshift-console-operator/console-operator-58897d9998-btkk5" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.280463 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7k82\" (UniqueName: \"kubernetes.io/projected/54c0c83a-90b3-4075-9171-8af4ceb55ad2-kube-api-access-v7k82\") pod \"migrator-59844c95c7-k68xk\" (UID: \"54c0c83a-90b3-4075-9171-8af4ceb55ad2\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-k68xk" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.281514 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/68436d48-9e60-4450-a505-e75eaabb5e20-auth-proxy-config\") pod \"machine-approver-56656f9798-6m957\" (UID: \"68436d48-9e60-4450-a505-e75eaabb5e20\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6m957" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.281585 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9940c61f-5e5d-4ffb-99e2-0ced2ccc225d-service-ca-bundle\") pod \"authentication-operator-69f744f599-ljzk2\" (UID: \"9940c61f-5e5d-4ffb-99e2-0ced2ccc225d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ljzk2" Mar 20 15:26:37 crc 
kubenswrapper[4779]: I0320 15:26:37.281812 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldh2w"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.281840 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68436d48-9e60-4450-a505-e75eaabb5e20-config\") pod \"machine-approver-56656f9798-6m957\" (UID: \"68436d48-9e60-4450-a505-e75eaabb5e20\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6m957" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.281956 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9940c61f-5e5d-4ffb-99e2-0ced2ccc225d-config\") pod \"authentication-operator-69f744f599-ljzk2\" (UID: \"9940c61f-5e5d-4ffb-99e2-0ced2ccc225d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ljzk2" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.282696 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9940c61f-5e5d-4ffb-99e2-0ced2ccc225d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ljzk2\" (UID: \"9940c61f-5e5d-4ffb-99e2-0ced2ccc225d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ljzk2" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.282983 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9c3e8f05-0ca8-4324-b68d-4febc2443832-available-featuregates\") pod \"openshift-config-operator-7777fb866f-g25w2\" (UID: \"9c3e8f05-0ca8-4324-b68d-4febc2443832\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g25w2" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.283488 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9940c61f-5e5d-4ffb-99e2-0ced2ccc225d-serving-cert\") pod \"authentication-operator-69f744f599-ljzk2\" (UID: \"9940c61f-5e5d-4ffb-99e2-0ced2ccc225d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ljzk2" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.283502 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc1adb9e-7682-482c-8162-a9bf3acdfa5b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-zv9zm\" (UID: \"bc1adb9e-7682-482c-8162-a9bf3acdfa5b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zv9zm" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.284005 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/68436d48-9e60-4450-a505-e75eaabb5e20-machine-approver-tls\") pod \"machine-approver-56656f9798-6m957\" (UID: \"68436d48-9e60-4450-a505-e75eaabb5e20\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6m957" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.284349 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-qjjfz"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.284886 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc1adb9e-7682-482c-8162-a9bf3acdfa5b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-zv9zm\" (UID: \"bc1adb9e-7682-482c-8162-a9bf3acdfa5b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zv9zm" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.284988 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/defe5875-5311-4ccc-9360-a0e55e2ccdb9-config\") pod \"console-operator-58897d9998-btkk5\" (UID: \"defe5875-5311-4ccc-9360-a0e55e2ccdb9\") " pod="openshift-console-operator/console-operator-58897d9998-btkk5" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.285010 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/defe5875-5311-4ccc-9360-a0e55e2ccdb9-serving-cert\") pod \"console-operator-58897d9998-btkk5\" (UID: \"defe5875-5311-4ccc-9360-a0e55e2ccdb9\") " pod="openshift-console-operator/console-operator-58897d9998-btkk5" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.285252 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/defe5875-5311-4ccc-9360-a0e55e2ccdb9-trusted-ca\") pod \"console-operator-58897d9998-btkk5\" (UID: \"defe5875-5311-4ccc-9360-a0e55e2ccdb9\") " pod="openshift-console-operator/console-operator-58897d9998-btkk5" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.285353 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qkwpf"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.286424 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-kq4bv"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.287261 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c3e8f05-0ca8-4324-b68d-4febc2443832-serving-cert\") pod \"openshift-config-operator-7777fb866f-g25w2\" (UID: \"9c3e8f05-0ca8-4324-b68d-4febc2443832\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g25w2" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.287442 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-multus/multus-admission-controller-857f4d67dd-6ghwq"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.288474 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-btkk5"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.289457 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-57vjj"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.290415 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qswvl"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.290798 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.291469 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-k68xk"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.292431 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-jkl6g"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.293614 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4ncn"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.293622 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jkl6g" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.294538 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566995-zptlb"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.295584 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m2xxf"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.296788 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tx4fd"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.304407 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mflh2"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.306921 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-h48bp"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.310890 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rq8f8"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.313322 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7h5q2"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.314791 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cwvxv"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.317695 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-vhpkh"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.318160 4779 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.319228 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-klv5b"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.320524 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zv9zm"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.321609 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dzmcg"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.322646 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-5rp8w"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.323706 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jkl6g"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.324825 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-tdtkz"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.325843 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567006-jlkgl"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.327166 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-jq2mp"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.328439 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-n67zh"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.329494 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-7fhq9"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.331238 4779 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.331253 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-7fhq9" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.331394 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-6pz9l"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.332049 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-6pz9l" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.332404 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-7fhq9"] Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.351499 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.381502 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d120d68e-0458-4755-9ca7-3a34c000842d-etcd-ca\") pod \"etcd-operator-b45778765-tdtkz\" (UID: \"d120d68e-0458-4755-9ca7-3a34c000842d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tdtkz" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.381561 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d120d68e-0458-4755-9ca7-3a34c000842d-etcd-service-ca\") pod \"etcd-operator-b45778765-tdtkz\" (UID: \"d120d68e-0458-4755-9ca7-3a34c000842d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tdtkz" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.381584 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-kbhv4\" (UniqueName: \"kubernetes.io/projected/849b44ce-b317-4a93-a453-4f36844fbff8-kube-api-access-kbhv4\") pod \"control-plane-machine-set-operator-78cbb6b69f-cwvxv\" (UID: \"849b44ce-b317-4a93-a453-4f36844fbff8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cwvxv" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.381604 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/573b21fc-852a-45a1-b9f3-690a8bbda54f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-ldh2w\" (UID: \"573b21fc-852a-45a1-b9f3-690a8bbda54f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldh2w" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.381626 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5276l\" (UniqueName: \"kubernetes.io/projected/d120d68e-0458-4755-9ca7-3a34c000842d-kube-api-access-5276l\") pod \"etcd-operator-b45778765-tdtkz\" (UID: \"d120d68e-0458-4755-9ca7-3a34c000842d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tdtkz" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.381655 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d120d68e-0458-4755-9ca7-3a34c000842d-config\") pod \"etcd-operator-b45778765-tdtkz\" (UID: \"d120d68e-0458-4755-9ca7-3a34c000842d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tdtkz" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.381686 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d120d68e-0458-4755-9ca7-3a34c000842d-serving-cert\") pod \"etcd-operator-b45778765-tdtkz\" (UID: \"d120d68e-0458-4755-9ca7-3a34c000842d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tdtkz" Mar 20 15:26:37 crc 
kubenswrapper[4779]: I0320 15:26:37.381704 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fa6319a-1080-4ab8-80f3-38bde93c4b47-config\") pod \"kube-apiserver-operator-766d6c64bb-57vjj\" (UID: \"2fa6319a-1080-4ab8-80f3-38bde93c4b47\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-57vjj" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.381719 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vlbz\" (UniqueName: \"kubernetes.io/projected/573b21fc-852a-45a1-b9f3-690a8bbda54f-kube-api-access-9vlbz\") pod \"cluster-samples-operator-665b6dd947-ldh2w\" (UID: \"573b21fc-852a-45a1-b9f3-690a8bbda54f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldh2w" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.381740 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7k82\" (UniqueName: \"kubernetes.io/projected/54c0c83a-90b3-4075-9171-8af4ceb55ad2-kube-api-access-v7k82\") pod \"migrator-59844c95c7-k68xk\" (UID: \"54c0c83a-90b3-4075-9171-8af4ceb55ad2\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-k68xk" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.381760 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/849b44ce-b317-4a93-a453-4f36844fbff8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-cwvxv\" (UID: \"849b44ce-b317-4a93-a453-4f36844fbff8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cwvxv" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.381787 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2fa6319a-1080-4ab8-80f3-38bde93c4b47-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-57vjj\" (UID: \"2fa6319a-1080-4ab8-80f3-38bde93c4b47\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-57vjj" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.381843 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2fa6319a-1080-4ab8-80f3-38bde93c4b47-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-57vjj\" (UID: \"2fa6319a-1080-4ab8-80f3-38bde93c4b47\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-57vjj" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.381877 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d120d68e-0458-4755-9ca7-3a34c000842d-etcd-client\") pod \"etcd-operator-b45778765-tdtkz\" (UID: \"d120d68e-0458-4755-9ca7-3a34c000842d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tdtkz" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.383258 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d120d68e-0458-4755-9ca7-3a34c000842d-etcd-ca\") pod \"etcd-operator-b45778765-tdtkz\" (UID: \"d120d68e-0458-4755-9ca7-3a34c000842d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tdtkz" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.383592 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d120d68e-0458-4755-9ca7-3a34c000842d-etcd-service-ca\") pod \"etcd-operator-b45778765-tdtkz\" (UID: \"d120d68e-0458-4755-9ca7-3a34c000842d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tdtkz" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.384590 4779 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d120d68e-0458-4755-9ca7-3a34c000842d-config\") pod \"etcd-operator-b45778765-tdtkz\" (UID: \"d120d68e-0458-4755-9ca7-3a34c000842d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tdtkz" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.385028 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d120d68e-0458-4755-9ca7-3a34c000842d-serving-cert\") pod \"etcd-operator-b45778765-tdtkz\" (UID: \"d120d68e-0458-4755-9ca7-3a34c000842d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tdtkz" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.385667 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/573b21fc-852a-45a1-b9f3-690a8bbda54f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-ldh2w\" (UID: \"573b21fc-852a-45a1-b9f3-690a8bbda54f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldh2w" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.386131 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d120d68e-0458-4755-9ca7-3a34c000842d-etcd-client\") pod \"etcd-operator-b45778765-tdtkz\" (UID: \"d120d68e-0458-4755-9ca7-3a34c000842d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tdtkz" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.391568 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.411046 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.431328 4779 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress"/"service-ca-bundle" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.450652 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.471060 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.490394 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.511335 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.531411 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.551176 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.572089 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.591480 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.610848 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.631999 4779 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.650979 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.655749 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fa6319a-1080-4ab8-80f3-38bde93c4b47-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-57vjj\" (UID: \"2fa6319a-1080-4ab8-80f3-38bde93c4b47\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-57vjj" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.670701 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.674284 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fa6319a-1080-4ab8-80f3-38bde93c4b47-config\") pod \"kube-apiserver-operator-766d6c64bb-57vjj\" (UID: \"2fa6319a-1080-4ab8-80f3-38bde93c4b47\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-57vjj" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.691150 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.696529 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/849b44ce-b317-4a93-a453-4f36844fbff8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-cwvxv\" (UID: \"849b44ce-b317-4a93-a453-4f36844fbff8\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cwvxv" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.710730 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.731020 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.750802 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.770915 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.791078 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.815812 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.830469 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.864906 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggwqn\" (UniqueName: \"kubernetes.io/projected/57d12985-7d5e-4c20-9b2c-9790d454fc4b-kube-api-access-ggwqn\") pod \"route-controller-manager-6576b87f9c-h46dt\" (UID: \"57d12985-7d5e-4c20-9b2c-9790d454fc4b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h46dt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.871427 4779 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.890916 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.910845 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.931283 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.947719 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h46dt" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.951086 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 20 15:26:37 crc kubenswrapper[4779]: I0320 15:26:37.986856 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v22ch\" (UniqueName: \"kubernetes.io/projected/f6f3c10f-738c-46df-a158-0f59855e0e7d-kube-api-access-v22ch\") pod \"openshift-apiserver-operator-796bbdcf4f-n2qlf\" (UID: \"f6f3c10f-738c-46df-a158-0f59855e0e7d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n2qlf" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.005760 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z26zp\" (UniqueName: \"kubernetes.io/projected/e4e22379-e251-4cf3-a9d5-f1c3c026eb79-kube-api-access-z26zp\") pod \"apiserver-76f77b778f-sjbmt\" (UID: \"e4e22379-e251-4cf3-a9d5-f1c3c026eb79\") " pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" 
Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.012694 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.031525 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.051098 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.088855 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79j2t\" (UniqueName: \"kubernetes.io/projected/443dd34a-3cde-4e9b-8014-a8dacc68727b-kube-api-access-79j2t\") pod \"apiserver-7bbb656c7d-ln9qd\" (UID: \"443dd34a-3cde-4e9b-8014-a8dacc68727b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ln9qd" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.090992 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.101355 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-h46dt"] Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.123438 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdflw\" (UniqueName: \"kubernetes.io/projected/3987175b-df07-4970-9c38-fd7cc25a2586-kube-api-access-jdflw\") pod \"machine-api-operator-5694c8668f-6mhgz\" (UID: \"3987175b-df07-4970-9c38-fd7cc25a2586\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6mhgz" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.145300 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5gszb\" (UniqueName: \"kubernetes.io/projected/b46569bb-4450-4b94-8615-e8c4a3afd495-kube-api-access-5gszb\") pod \"controller-manager-879f6c89f-spxhw\" (UID: \"b46569bb-4450-4b94-8615-e8c4a3afd495\") " pod="openshift-controller-manager/controller-manager-879f6c89f-spxhw" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.151075 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.170803 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.224689 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-6mhgz" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.224722 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.224721 4779 request.go:700] Waited for 1.0255488s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator/secrets?fieldSelector=metadata.name%3Dkube-storage-version-migrator-sa-dockercfg-5xfcg&limit=500&resourceVersion=0 Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.226653 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.226692 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-spxhw" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.226696 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.233991 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ln9qd" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.256227 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n2qlf" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.256276 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.273218 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.309427 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.317415 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.331736 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.351386 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.375457 4779 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.391078 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.410659 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.431078 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.450903 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.470599 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.491176 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.511661 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.531649 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.550211 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.571647 4779 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.590727 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.611483 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.614928 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-spxhw"] Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.615963 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-sjbmt"] Mar 20 15:26:38 crc kubenswrapper[4779]: W0320 15:26:38.621438 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb46569bb_4450_4b94_8615_e8c4a3afd495.slice/crio-3c1485930d0e83e60eb1bc9d28794ee620e0f74c568d5b0145438aae016e9fca WatchSource:0}: Error finding container 3c1485930d0e83e60eb1bc9d28794ee620e0f74c568d5b0145438aae016e9fca: Status 404 returned error can't find the container with id 3c1485930d0e83e60eb1bc9d28794ee620e0f74c568d5b0145438aae016e9fca Mar 20 15:26:38 crc kubenswrapper[4779]: W0320 15:26:38.624135 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4e22379_e251_4cf3_a9d5_f1c3c026eb79.slice/crio-cdd479220762df79cfbb09023ca6bc1df15cd34bd8f67a8c5ff8ee56fbdfd34b WatchSource:0}: Error finding container cdd479220762df79cfbb09023ca6bc1df15cd34bd8f67a8c5ff8ee56fbdfd34b: Status 404 returned error can't find the container with id cdd479220762df79cfbb09023ca6bc1df15cd34bd8f67a8c5ff8ee56fbdfd34b Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.630811 4779 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.653456 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n2qlf"] Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.655676 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.657238 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ln9qd"] Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.660716 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-6mhgz"] Mar 20 15:26:38 crc kubenswrapper[4779]: W0320 15:26:38.662655 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6f3c10f_738c_46df_a158_0f59855e0e7d.slice/crio-ae9c46ed0b937b05d6f85033b6fe3d062b3455bab1d943a34a544e6e9805046a WatchSource:0}: Error finding container ae9c46ed0b937b05d6f85033b6fe3d062b3455bab1d943a34a544e6e9805046a: Status 404 returned error can't find the container with id ae9c46ed0b937b05d6f85033b6fe3d062b3455bab1d943a34a544e6e9805046a Mar 20 15:26:38 crc kubenswrapper[4779]: W0320 15:26:38.668309 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3987175b_df07_4970_9c38_fd7cc25a2586.slice/crio-49b5cfc59293e371a4336579c9e896a8653ec4013138066c62ead2e165af86db WatchSource:0}: Error finding container 49b5cfc59293e371a4336579c9e896a8653ec4013138066c62ead2e165af86db: Status 404 returned error can't find the container with id 49b5cfc59293e371a4336579c9e896a8653ec4013138066c62ead2e165af86db Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.671020 4779 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.675010 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-6mhgz" event={"ID":"3987175b-df07-4970-9c38-fd7cc25a2586","Type":"ContainerStarted","Data":"49b5cfc59293e371a4336579c9e896a8653ec4013138066c62ead2e165af86db"} Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.675804 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h46dt" event={"ID":"57d12985-7d5e-4c20-9b2c-9790d454fc4b","Type":"ContainerStarted","Data":"6989c6453c60f53174b0fc78917974b2d9bd4a20f72076deec02528dfb51a6ac"} Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.675826 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h46dt" event={"ID":"57d12985-7d5e-4c20-9b2c-9790d454fc4b","Type":"ContainerStarted","Data":"58e74ba6a9adfca929c256366d235cb0cceebf741602481e9465f8f701b06c42"} Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.676545 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h46dt" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.677225 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-spxhw" event={"ID":"b46569bb-4450-4b94-8615-e8c4a3afd495","Type":"ContainerStarted","Data":"3c1485930d0e83e60eb1bc9d28794ee620e0f74c568d5b0145438aae016e9fca"} Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.677899 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" 
event={"ID":"e4e22379-e251-4cf3-a9d5-f1c3c026eb79","Type":"ContainerStarted","Data":"cdd479220762df79cfbb09023ca6bc1df15cd34bd8f67a8c5ff8ee56fbdfd34b"} Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.677979 4779 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-h46dt container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.678005 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h46dt" podUID="57d12985-7d5e-4c20-9b2c-9790d454fc4b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.678705 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n2qlf" event={"ID":"f6f3c10f-738c-46df-a158-0f59855e0e7d","Type":"ContainerStarted","Data":"ae9c46ed0b937b05d6f85033b6fe3d062b3455bab1d943a34a544e6e9805046a"} Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.679686 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ln9qd" event={"ID":"443dd34a-3cde-4e9b-8014-a8dacc68727b","Type":"ContainerStarted","Data":"6bd8a616e3df2de9537ef676ca6feb1ee948f3f69aa2bab576a8fc10e3422433"} Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.690672 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.711384 4779 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.731302 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.751052 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.771060 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.791794 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.810544 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.830977 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.851802 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.873376 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.890846 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.911866 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.930571 4779 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.951015 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 15:26:38 crc kubenswrapper[4779]: I0320 15:26:38.971090 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.011148 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xscgn\" (UniqueName: \"kubernetes.io/projected/9c3e8f05-0ca8-4324-b68d-4febc2443832-kube-api-access-xscgn\") pod \"openshift-config-operator-7777fb866f-g25w2\" (UID: \"9c3e8f05-0ca8-4324-b68d-4febc2443832\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g25w2" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.023824 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt68l\" (UniqueName: \"kubernetes.io/projected/bc1adb9e-7682-482c-8162-a9bf3acdfa5b-kube-api-access-kt68l\") pod \"openshift-controller-manager-operator-756b6f6bc6-zv9zm\" (UID: \"bc1adb9e-7682-482c-8162-a9bf3acdfa5b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zv9zm" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.044354 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmrp6\" (UniqueName: \"kubernetes.io/projected/defe5875-5311-4ccc-9360-a0e55e2ccdb9-kube-api-access-jmrp6\") pod \"console-operator-58897d9998-btkk5\" (UID: \"defe5875-5311-4ccc-9360-a0e55e2ccdb9\") " pod="openshift-console-operator/console-operator-58897d9998-btkk5" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.065924 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9966z\" (UniqueName: 
\"kubernetes.io/projected/e52093ea-9241-43c1-ae08-d9e87beed327-kube-api-access-9966z\") pod \"downloads-7954f5f757-qjjfz\" (UID: \"e52093ea-9241-43c1-ae08-d9e87beed327\") " pod="openshift-console/downloads-7954f5f757-qjjfz" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.087980 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2tfd\" (UniqueName: \"kubernetes.io/projected/9940c61f-5e5d-4ffb-99e2-0ced2ccc225d-kube-api-access-q2tfd\") pod \"authentication-operator-69f744f599-ljzk2\" (UID: \"9940c61f-5e5d-4ffb-99e2-0ced2ccc225d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ljzk2" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.104054 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccddk\" (UniqueName: \"kubernetes.io/projected/68436d48-9e60-4450-a505-e75eaabb5e20-kube-api-access-ccddk\") pod \"machine-approver-56656f9798-6m957\" (UID: \"68436d48-9e60-4450-a505-e75eaabb5e20\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6m957" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.111497 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.131282 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.154540 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.164682 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-ljzk2" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.173500 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.176274 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-qjjfz" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.190964 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.210980 4779 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.226375 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-btkk5" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.229437 4779 request.go:700] Waited for 1.897900091s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.231138 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.242354 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g25w2" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.252153 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.253060 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zv9zm" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.287623 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6m957" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.294950 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.295183 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 15:26:39 crc kubenswrapper[4779]: W0320 15:26:39.326486 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68436d48_9e60_4450_a505_e75eaabb5e20.slice/crio-07ad5ea1358a9020c0295721dc2f9456b0150696ad12f12a145c983695337638 WatchSource:0}: Error finding container 07ad5ea1358a9020c0295721dc2f9456b0150696ad12f12a145c983695337638: Status 404 returned error can't find the container with id 07ad5ea1358a9020c0295721dc2f9456b0150696ad12f12a145c983695337638 Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.355706 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5276l\" (UniqueName: \"kubernetes.io/projected/d120d68e-0458-4755-9ca7-3a34c000842d-kube-api-access-5276l\") pod \"etcd-operator-b45778765-tdtkz\" (UID: 
\"d120d68e-0458-4755-9ca7-3a34c000842d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tdtkz" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.364259 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbhv4\" (UniqueName: \"kubernetes.io/projected/849b44ce-b317-4a93-a453-4f36844fbff8-kube-api-access-kbhv4\") pod \"control-plane-machine-set-operator-78cbb6b69f-cwvxv\" (UID: \"849b44ce-b317-4a93-a453-4f36844fbff8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cwvxv" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.379262 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-tdtkz" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.392237 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vlbz\" (UniqueName: \"kubernetes.io/projected/573b21fc-852a-45a1-b9f3-690a8bbda54f-kube-api-access-9vlbz\") pod \"cluster-samples-operator-665b6dd947-ldh2w\" (UID: \"573b21fc-852a-45a1-b9f3-690a8bbda54f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldh2w" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.423861 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2fa6319a-1080-4ab8-80f3-38bde93c4b47-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-57vjj\" (UID: \"2fa6319a-1080-4ab8-80f3-38bde93c4b47\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-57vjj" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.433334 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-57vjj" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.434323 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7k82\" (UniqueName: \"kubernetes.io/projected/54c0c83a-90b3-4075-9171-8af4ceb55ad2-kube-api-access-v7k82\") pod \"migrator-59844c95c7-k68xk\" (UID: \"54c0c83a-90b3-4075-9171-8af4ceb55ad2\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-k68xk" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.444347 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cwvxv" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.450179 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60aeffe2-ff1d-4132-bb4e-997a7f5ea106-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-dzmcg\" (UID: \"60aeffe2-ff1d-4132-bb4e-997a7f5ea106\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dzmcg" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.450385 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/424710af-f3c1-4cd2-9072-ae3cd248895d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.450420 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb6v4\" (UniqueName: \"kubernetes.io/projected/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-kube-api-access-pb6v4\") pod \"console-f9d7485db-sl2tf\" (UID: 
\"163fdaa3-a29a-44bb-9da0-97b18da1c2ba\") " pod="openshift-console/console-f9d7485db-sl2tf" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.450461 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b91d4316-5211-4517-8193-b8b6100f21fc-stats-auth\") pod \"router-default-5444994796-t8tpv\" (UID: \"b91d4316-5211-4517-8193-b8b6100f21fc\") " pod="openshift-ingress/router-default-5444994796-t8tpv" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.450487 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/424710af-f3c1-4cd2-9072-ae3cd248895d-registry-certificates\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.450509 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.450532 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-console-oauth-config\") pod \"console-f9d7485db-sl2tf\" (UID: \"163fdaa3-a29a-44bb-9da0-97b18da1c2ba\") " pod="openshift-console/console-f9d7485db-sl2tf" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.450553 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/723d98da-617e-496c-94fe-5b49f9e8ac13-audit-policies\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.450577 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/75539f3c-ad23-445a-9491-12a010875fb9-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-m2xxf\" (UID: \"75539f3c-ad23-445a-9491-12a010875fb9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m2xxf" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.450599 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.450634 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbrpb\" (UniqueName: \"kubernetes.io/projected/75539f3c-ad23-445a-9491-12a010875fb9-kube-api-access-hbrpb\") pod \"cluster-image-registry-operator-dc59b4c8b-m2xxf\" (UID: \"75539f3c-ad23-445a-9491-12a010875fb9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m2xxf" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.450654 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97a8b40a-4c2d-452f-b106-d76600e17bac-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-wfm5h\" (UID: 
\"97a8b40a-4c2d-452f-b106-d76600e17bac\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wfm5h" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.450690 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b843a2c1-e2a0-4759-ab43-6d59cd85d1ca-metrics-tls\") pod \"ingress-operator-5b745b69d9-kq4bv\" (UID: \"b843a2c1-e2a0-4759-ab43-6d59cd85d1ca\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kq4bv" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.450709 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97a8b40a-4c2d-452f-b106-d76600e17bac-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-wfm5h\" (UID: \"97a8b40a-4c2d-452f-b106-d76600e17bac\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wfm5h" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.450731 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc040c1d-5241-4b62-87f5-1254db8c5a5b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qswvl\" (UID: \"fc040c1d-5241-4b62-87f5-1254db8c5a5b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qswvl" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.450749 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-service-ca\") pod \"console-f9d7485db-sl2tf\" (UID: \"163fdaa3-a29a-44bb-9da0-97b18da1c2ba\") " pod="openshift-console/console-f9d7485db-sl2tf" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.450781 4779 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/75539f3c-ad23-445a-9491-12a010875fb9-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-m2xxf\" (UID: \"75539f3c-ad23-445a-9491-12a010875fb9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m2xxf" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.450799 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-oauth-serving-cert\") pod \"console-f9d7485db-sl2tf\" (UID: \"163fdaa3-a29a-44bb-9da0-97b18da1c2ba\") " pod="openshift-console/console-f9d7485db-sl2tf" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.450820 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.450856 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.450877 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gbfl\" (UniqueName: 
\"kubernetes.io/projected/b843a2c1-e2a0-4759-ab43-6d59cd85d1ca-kube-api-access-8gbfl\") pod \"ingress-operator-5b745b69d9-kq4bv\" (UID: \"b843a2c1-e2a0-4759-ab43-6d59cd85d1ca\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kq4bv"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.450898 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56964\" (UniqueName: \"kubernetes.io/projected/97a8b40a-4c2d-452f-b106-d76600e17bac-kube-api-access-56964\") pod \"kube-storage-version-migrator-operator-b67b599dd-wfm5h\" (UID: \"97a8b40a-4c2d-452f-b106-d76600e17bac\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wfm5h"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.450928 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b91d4316-5211-4517-8193-b8b6100f21fc-metrics-certs\") pod \"router-default-5444994796-t8tpv\" (UID: \"b91d4316-5211-4517-8193-b8b6100f21fc\") " pod="openshift-ingress/router-default-5444994796-t8tpv"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.450947 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-trusted-ca-bundle\") pod \"console-f9d7485db-sl2tf\" (UID: \"163fdaa3-a29a-44bb-9da0-97b18da1c2ba\") " pod="openshift-console/console-f9d7485db-sl2tf"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.450968 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn8gd\" (UniqueName: \"kubernetes.io/projected/723d98da-617e-496c-94fe-5b49f9e8ac13-kube-api-access-bn8gd\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.450988 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-console-config\") pod \"console-f9d7485db-sl2tf\" (UID: \"163fdaa3-a29a-44bb-9da0-97b18da1c2ba\") " pod="openshift-console/console-f9d7485db-sl2tf"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.451010 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/723d98da-617e-496c-94fe-5b49f9e8ac13-audit-dir\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.451030 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.451064 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zlmw\" (UniqueName: \"kubernetes.io/projected/741b4ee7-0567-4349-8068-f10a9cd8ee68-kube-api-access-7zlmw\") pod \"dns-operator-744455d44c-9sgfn\" (UID: \"741b4ee7-0567-4349-8068-f10a9cd8ee68\") " pod="openshift-dns-operator/dns-operator-744455d44c-9sgfn"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.451084 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/424710af-f3c1-4cd2-9072-ae3cd248895d-registry-tls\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.451124 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b458c9c5-3878-42e1-995b-713f56d36b25-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qkwpf\" (UID: \"b458c9c5-3878-42e1-995b-713f56d36b25\") " pod="openshift-marketplace/marketplace-operator-79b997595-qkwpf"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.451146 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b91d4316-5211-4517-8193-b8b6100f21fc-service-ca-bundle\") pod \"router-default-5444994796-t8tpv\" (UID: \"b91d4316-5211-4517-8193-b8b6100f21fc\") " pod="openshift-ingress/router-default-5444994796-t8tpv"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.451167 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hchf8\" (UniqueName: \"kubernetes.io/projected/b458c9c5-3878-42e1-995b-713f56d36b25-kube-api-access-hchf8\") pod \"marketplace-operator-79b997595-qkwpf\" (UID: \"b458c9c5-3878-42e1-995b-713f56d36b25\") " pod="openshift-marketplace/marketplace-operator-79b997595-qkwpf"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.451198 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fc040c1d-5241-4b62-87f5-1254db8c5a5b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qswvl\" (UID: \"fc040c1d-5241-4b62-87f5-1254db8c5a5b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qswvl"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.451282 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/424710af-f3c1-4cd2-9072-ae3cd248895d-trusted-ca\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.451303 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn8bx\" (UniqueName: \"kubernetes.io/projected/424710af-f3c1-4cd2-9072-ae3cd248895d-kube-api-access-cn8bx\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.451324 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.451361 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.451380 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/741b4ee7-0567-4349-8068-f10a9cd8ee68-metrics-tls\") pod \"dns-operator-744455d44c-9sgfn\" (UID: \"741b4ee7-0567-4349-8068-f10a9cd8ee68\") " pod="openshift-dns-operator/dns-operator-744455d44c-9sgfn"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.451403 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/75539f3c-ad23-445a-9491-12a010875fb9-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-m2xxf\" (UID: \"75539f3c-ad23-445a-9491-12a010875fb9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m2xxf"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.451458 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.451478 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60aeffe2-ff1d-4132-bb4e-997a7f5ea106-config\") pod \"kube-controller-manager-operator-78b949d7b-dzmcg\" (UID: \"60aeffe2-ff1d-4132-bb4e-997a7f5ea106\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dzmcg"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.451528 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.451552 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b91d4316-5211-4517-8193-b8b6100f21fc-default-certificate\") pod \"router-default-5444994796-t8tpv\" (UID: \"b91d4316-5211-4517-8193-b8b6100f21fc\") " pod="openshift-ingress/router-default-5444994796-t8tpv"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.451574 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b458c9c5-3878-42e1-995b-713f56d36b25-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qkwpf\" (UID: \"b458c9c5-3878-42e1-995b-713f56d36b25\") " pod="openshift-marketplace/marketplace-operator-79b997595-qkwpf"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.451599 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/424710af-f3c1-4cd2-9072-ae3cd248895d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.451620 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nmqc\" (UniqueName: \"kubernetes.io/projected/b91d4316-5211-4517-8193-b8b6100f21fc-kube-api-access-6nmqc\") pod \"router-default-5444994796-t8tpv\" (UID: \"b91d4316-5211-4517-8193-b8b6100f21fc\") " pod="openshift-ingress/router-default-5444994796-t8tpv"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.451644 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b843a2c1-e2a0-4759-ab43-6d59cd85d1ca-bound-sa-token\") pod \"ingress-operator-5b745b69d9-kq4bv\" (UID: \"b843a2c1-e2a0-4759-ab43-6d59cd85d1ca\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kq4bv"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.451666 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.451705 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.451727 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b843a2c1-e2a0-4759-ab43-6d59cd85d1ca-trusted-ca\") pod \"ingress-operator-5b745b69d9-kq4bv\" (UID: \"b843a2c1-e2a0-4759-ab43-6d59cd85d1ca\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kq4bv"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.451750 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc040c1d-5241-4b62-87f5-1254db8c5a5b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qswvl\" (UID: \"fc040c1d-5241-4b62-87f5-1254db8c5a5b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qswvl"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.451772 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/60aeffe2-ff1d-4132-bb4e-997a7f5ea106-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-dzmcg\" (UID: \"60aeffe2-ff1d-4132-bb4e-997a7f5ea106\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dzmcg"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.451792 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/424710af-f3c1-4cd2-9072-ae3cd248895d-bound-sa-token\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.451811 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-console-serving-cert\") pod \"console-f9d7485db-sl2tf\" (UID: \"163fdaa3-a29a-44bb-9da0-97b18da1c2ba\") " pod="openshift-console/console-f9d7485db-sl2tf"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.451860 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp"
Mar 20 15:26:39 crc kubenswrapper[4779]: E0320 15:26:39.454054 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:39.954038144 +0000 UTC m=+216.916553944 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.478880 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-k68xk"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.553637 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:26:39 crc kubenswrapper[4779]: E0320 15:26:39.553744 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:40.0537256 +0000 UTC m=+217.016241400 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.554028 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-trusted-ca-bundle\") pod \"console-f9d7485db-sl2tf\" (UID: \"163fdaa3-a29a-44bb-9da0-97b18da1c2ba\") " pod="openshift-console/console-f9d7485db-sl2tf"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.554058 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d2d540f-2f23-476c-95ec-31e2e5385cc0-metrics-tls\") pod \"dns-default-n67zh\" (UID: \"7d2d540f-2f23-476c-95ec-31e2e5385cc0\") " pod="openshift-dns/dns-default-n67zh"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.554078 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb7nc\" (UniqueName: \"kubernetes.io/projected/241498e1-0730-4ae5-afd1-4b99b66fbcf5-kube-api-access-xb7nc\") pod \"package-server-manager-789f6589d5-rq8f8\" (UID: \"241498e1-0730-4ae5-afd1-4b99b66fbcf5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rq8f8"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.554101 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn8gd\" (UniqueName: \"kubernetes.io/projected/723d98da-617e-496c-94fe-5b49f9e8ac13-kube-api-access-bn8gd\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.554137 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-console-config\") pod \"console-f9d7485db-sl2tf\" (UID: \"163fdaa3-a29a-44bb-9da0-97b18da1c2ba\") " pod="openshift-console/console-f9d7485db-sl2tf"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.554156 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d2d540f-2f23-476c-95ec-31e2e5385cc0-config-volume\") pod \"dns-default-n67zh\" (UID: \"7d2d540f-2f23-476c-95ec-31e2e5385cc0\") " pod="openshift-dns/dns-default-n67zh"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.554175 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d27a4a21-019f-4ba4-9960-7e5c96b271d0-srv-cert\") pod \"olm-operator-6b444d44fb-g4ncn\" (UID: \"d27a4a21-019f-4ba4-9960-7e5c96b271d0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4ncn"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.554198 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/723d98da-617e-496c-94fe-5b49f9e8ac13-audit-dir\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.554217 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.554237 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/241498e1-0730-4ae5-afd1-4b99b66fbcf5-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rq8f8\" (UID: \"241498e1-0730-4ae5-afd1-4b99b66fbcf5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rq8f8"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.554261 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2ffc1d7-11f7-4557-a771-99f2a90cdeac-cert\") pod \"ingress-canary-jkl6g\" (UID: \"c2ffc1d7-11f7-4557-a771-99f2a90cdeac\") " pod="openshift-ingress-canary/ingress-canary-jkl6g"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.554466 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/723d98da-617e-496c-94fe-5b49f9e8ac13-audit-dir\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.555245 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-console-config\") pod \"console-f9d7485db-sl2tf\" (UID: \"163fdaa3-a29a-44bb-9da0-97b18da1c2ba\") " pod="openshift-console/console-f9d7485db-sl2tf"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.555727 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zlmw\" (UniqueName: \"kubernetes.io/projected/741b4ee7-0567-4349-8068-f10a9cd8ee68-kube-api-access-7zlmw\") pod \"dns-operator-744455d44c-9sgfn\" (UID: \"741b4ee7-0567-4349-8068-f10a9cd8ee68\") " pod="openshift-dns-operator/dns-operator-744455d44c-9sgfn"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.555763 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/424710af-f3c1-4cd2-9072-ae3cd248895d-registry-tls\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.555788 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b458c9c5-3878-42e1-995b-713f56d36b25-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qkwpf\" (UID: \"b458c9c5-3878-42e1-995b-713f56d36b25\") " pod="openshift-marketplace/marketplace-operator-79b997595-qkwpf"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.555811 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/782d0d60-1af7-40ea-bf2f-845507e0b054-auth-proxy-config\") pod \"machine-config-operator-74547568cd-5rp8w\" (UID: \"782d0d60-1af7-40ea-bf2f-845507e0b054\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5rp8w"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.555835 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b91d4316-5211-4517-8193-b8b6100f21fc-service-ca-bundle\") pod \"router-default-5444994796-t8tpv\" (UID: \"b91d4316-5211-4517-8193-b8b6100f21fc\") " pod="openshift-ingress/router-default-5444994796-t8tpv"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.555862 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hchf8\" (UniqueName: \"kubernetes.io/projected/b458c9c5-3878-42e1-995b-713f56d36b25-kube-api-access-hchf8\") pod \"marketplace-operator-79b997595-qkwpf\" (UID: \"b458c9c5-3878-42e1-995b-713f56d36b25\") " pod="openshift-marketplace/marketplace-operator-79b997595-qkwpf"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.555896 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fc040c1d-5241-4b62-87f5-1254db8c5a5b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qswvl\" (UID: \"fc040c1d-5241-4b62-87f5-1254db8c5a5b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qswvl"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.555995 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/782d0d60-1af7-40ea-bf2f-845507e0b054-images\") pod \"machine-config-operator-74547568cd-5rp8w\" (UID: \"782d0d60-1af7-40ea-bf2f-845507e0b054\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5rp8w"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.556024 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.556047 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1db7e800-1b43-403b-8550-b8e19b4b2f8a-node-bootstrap-token\") pod \"machine-config-server-6pz9l\" (UID: \"1db7e800-1b43-403b-8550-b8e19b4b2f8a\") " pod="openshift-machine-config-operator/machine-config-server-6pz9l"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.556083 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/424710af-f3c1-4cd2-9072-ae3cd248895d-trusted-ca\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.556134 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cn8bx\" (UniqueName: \"kubernetes.io/projected/424710af-f3c1-4cd2-9072-ae3cd248895d-kube-api-access-cn8bx\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.556166 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vhlq\" (UniqueName: \"kubernetes.io/projected/7d2d540f-2f23-476c-95ec-31e2e5385cc0-kube-api-access-6vhlq\") pod \"dns-default-n67zh\" (UID: \"7d2d540f-2f23-476c-95ec-31e2e5385cc0\") " pod="openshift-dns/dns-default-n67zh"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.556196 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.556221 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/741b4ee7-0567-4349-8068-f10a9cd8ee68-metrics-tls\") pod \"dns-operator-744455d44c-9sgfn\" (UID: \"741b4ee7-0567-4349-8068-f10a9cd8ee68\") " pod="openshift-dns-operator/dns-operator-744455d44c-9sgfn"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.556246 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f465ee1c-98d0-414f-b53c-4203e5bf29af-csi-data-dir\") pod \"csi-hostpathplugin-7fhq9\" (UID: \"f465ee1c-98d0-414f-b53c-4203e5bf29af\") " pod="hostpath-provisioner/csi-hostpathplugin-7fhq9"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.556283 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/75539f3c-ad23-445a-9491-12a010875fb9-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-m2xxf\" (UID: \"75539f3c-ad23-445a-9491-12a010875fb9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m2xxf"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.556352 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/89237cb2-9dc2-4900-9568-0ee404923e24-proxy-tls\") pod \"machine-config-controller-84d6567774-vhpkh\" (UID: \"89237cb2-9dc2-4900-9568-0ee404923e24\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vhpkh"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.556387 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f465ee1c-98d0-414f-b53c-4203e5bf29af-registration-dir\") pod \"csi-hostpathplugin-7fhq9\" (UID: \"f465ee1c-98d0-414f-b53c-4203e5bf29af\") " pod="hostpath-provisioner/csi-hostpathplugin-7fhq9"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.556433 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.556459 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60aeffe2-ff1d-4132-bb4e-997a7f5ea106-config\") pod \"kube-controller-manager-operator-78b949d7b-dzmcg\" (UID: \"60aeffe2-ff1d-4132-bb4e-997a7f5ea106\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dzmcg"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.556481 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4e8fbeee-8833-4bc8-9a9a-eaf0f5fb28ad-profile-collector-cert\") pod \"catalog-operator-68c6474976-7h5q2\" (UID: \"4e8fbeee-8833-4bc8-9a9a-eaf0f5fb28ad\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7h5q2"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.556519 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4stb\" (UniqueName: \"kubernetes.io/projected/4e8fbeee-8833-4bc8-9a9a-eaf0f5fb28ad-kube-api-access-b4stb\") pod \"catalog-operator-68c6474976-7h5q2\" (UID: \"4e8fbeee-8833-4bc8-9a9a-eaf0f5fb28ad\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7h5q2"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.556541 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9aa1b944-c1b5-4f49-9615-9d373dea2b29-apiservice-cert\") pod \"packageserver-d55dfcdfc-mflh2\" (UID: \"9aa1b944-c1b5-4f49-9615-9d373dea2b29\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mflh2"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.556573 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.556599 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b91d4316-5211-4517-8193-b8b6100f21fc-default-certificate\") pod \"router-default-5444994796-t8tpv\" (UID: \"b91d4316-5211-4517-8193-b8b6100f21fc\") " pod="openshift-ingress/router-default-5444994796-t8tpv"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.556622 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b458c9c5-3878-42e1-995b-713f56d36b25-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qkwpf\" (UID: \"b458c9c5-3878-42e1-995b-713f56d36b25\") " pod="openshift-marketplace/marketplace-operator-79b997595-qkwpf"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.556647 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/424710af-f3c1-4cd2-9072-ae3cd248895d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.556670 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nmqc\" (UniqueName: \"kubernetes.io/projected/b91d4316-5211-4517-8193-b8b6100f21fc-kube-api-access-6nmqc\") pod \"router-default-5444994796-t8tpv\" (UID: \"b91d4316-5211-4517-8193-b8b6100f21fc\") " pod="openshift-ingress/router-default-5444994796-t8tpv"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.556721 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b843a2c1-e2a0-4759-ab43-6d59cd85d1ca-bound-sa-token\") pod \"ingress-operator-5b745b69d9-kq4bv\" (UID: \"b843a2c1-e2a0-4759-ab43-6d59cd85d1ca\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kq4bv"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.556753 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.556799 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmvv8\" (UniqueName: \"kubernetes.io/projected/84ee4d33-4d04-496b-b95c-61db87d00cdc-kube-api-access-nmvv8\") pod \"collect-profiles-29566995-zptlb\" (UID: \"84ee4d33-4d04-496b-b95c-61db87d00cdc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566995-zptlb"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.556825 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcbf85cd-f480-4b6c-bbcc-300d6d895cda-config\") pod \"service-ca-operator-777779d784-klv5b\" (UID: \"bcbf85cd-f480-4b6c-bbcc-300d6d895cda\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-klv5b"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.556850 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f465ee1c-98d0-414f-b53c-4203e5bf29af-plugins-dir\") pod \"csi-hostpathplugin-7fhq9\" (UID: \"f465ee1c-98d0-414f-b53c-4203e5bf29af\") " pod="hostpath-provisioner/csi-hostpathplugin-7fhq9"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.556875 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/dc48f23d-d4c1-4392-b853-42ef3219bcbe-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6ghwq\" (UID: \"dc48f23d-d4c1-4392-b853-42ef3219bcbe\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6ghwq"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.556900 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d27a4a21-019f-4ba4-9960-7e5c96b271d0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-g4ncn\" (UID: \"d27a4a21-019f-4ba4-9960-7e5c96b271d0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4ncn"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.556930 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.556953 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b843a2c1-e2a0-4759-ab43-6d59cd85d1ca-trusted-ca\") pod \"ingress-operator-5b745b69d9-kq4bv\" (UID: \"b843a2c1-e2a0-4759-ab43-6d59cd85d1ca\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kq4bv"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.556953 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b91d4316-5211-4517-8193-b8b6100f21fc-service-ca-bundle\") pod \"router-default-5444994796-t8tpv\" (UID: \"b91d4316-5211-4517-8193-b8b6100f21fc\") " pod="openshift-ingress/router-default-5444994796-t8tpv"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.556980 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc040c1d-5241-4b62-87f5-1254db8c5a5b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qswvl\" (UID: \"fc040c1d-5241-4b62-87f5-1254db8c5a5b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qswvl"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.556995 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-trusted-ca-bundle\") pod \"console-f9d7485db-sl2tf\" (UID: \"163fdaa3-a29a-44bb-9da0-97b18da1c2ba\") " pod="openshift-console/console-f9d7485db-sl2tf"
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.557005 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f465ee1c-98d0-414f-b53c-4203e5bf29af-socket-dir\") pod
\"csi-hostpathplugin-7fhq9\" (UID: \"f465ee1c-98d0-414f-b53c-4203e5bf29af\") " pod="hostpath-provisioner/csi-hostpathplugin-7fhq9" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.557096 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfk9k\" (UniqueName: \"kubernetes.io/projected/d221be2b-add2-48dd-a34a-05b1faa2fe53-kube-api-access-nfk9k\") pod \"service-ca-9c57cc56f-jq2mp\" (UID: \"d221be2b-add2-48dd-a34a-05b1faa2fe53\") " pod="openshift-service-ca/service-ca-9c57cc56f-jq2mp" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.557151 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/424710af-f3c1-4cd2-9072-ae3cd248895d-bound-sa-token\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.557173 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/60aeffe2-ff1d-4132-bb4e-997a7f5ea106-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-dzmcg\" (UID: \"60aeffe2-ff1d-4132-bb4e-997a7f5ea106\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dzmcg" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.557197 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-console-serving-cert\") pod \"console-f9d7485db-sl2tf\" (UID: \"163fdaa3-a29a-44bb-9da0-97b18da1c2ba\") " pod="openshift-console/console-f9d7485db-sl2tf" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.557220 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-rh9j5\" (UniqueName: \"kubernetes.io/projected/782d0d60-1af7-40ea-bf2f-845507e0b054-kube-api-access-rh9j5\") pod \"machine-config-operator-74547568cd-5rp8w\" (UID: \"782d0d60-1af7-40ea-bf2f-845507e0b054\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5rp8w" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.557273 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/782d0d60-1af7-40ea-bf2f-845507e0b054-proxy-tls\") pod \"machine-config-operator-74547568cd-5rp8w\" (UID: \"782d0d60-1af7-40ea-bf2f-845507e0b054\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5rp8w" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.557302 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czknv\" (UniqueName: \"kubernetes.io/projected/dc48f23d-d4c1-4392-b853-42ef3219bcbe-kube-api-access-czknv\") pod \"multus-admission-controller-857f4d67dd-6ghwq\" (UID: \"dc48f23d-d4c1-4392-b853-42ef3219bcbe\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6ghwq" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.557327 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9aa1b944-c1b5-4f49-9615-9d373dea2b29-webhook-cert\") pod \"packageserver-d55dfcdfc-mflh2\" (UID: \"9aa1b944-c1b5-4f49-9615-9d373dea2b29\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mflh2" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.557348 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f465ee1c-98d0-414f-b53c-4203e5bf29af-mountpoint-dir\") pod \"csi-hostpathplugin-7fhq9\" (UID: \"f465ee1c-98d0-414f-b53c-4203e5bf29af\") " 
pod="hostpath-provisioner/csi-hostpathplugin-7fhq9" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.557373 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.557394 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/9aa1b944-c1b5-4f49-9615-9d373dea2b29-tmpfs\") pod \"packageserver-d55dfcdfc-mflh2\" (UID: \"9aa1b944-c1b5-4f49-9615-9d373dea2b29\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mflh2" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.557424 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60aeffe2-ff1d-4132-bb4e-997a7f5ea106-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-dzmcg\" (UID: \"60aeffe2-ff1d-4132-bb4e-997a7f5ea106\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dzmcg" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.557424 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b458c9c5-3878-42e1-995b-713f56d36b25-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qkwpf\" (UID: \"b458c9c5-3878-42e1-995b-713f56d36b25\") " pod="openshift-marketplace/marketplace-operator-79b997595-qkwpf" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.557446 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqxrq\" (UniqueName: 
\"kubernetes.io/projected/89237cb2-9dc2-4900-9568-0ee404923e24-kube-api-access-bqxrq\") pod \"machine-config-controller-84d6567774-vhpkh\" (UID: \"89237cb2-9dc2-4900-9568-0ee404923e24\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vhpkh" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.557469 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/424710af-f3c1-4cd2-9072-ae3cd248895d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.557508 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/89237cb2-9dc2-4900-9568-0ee404923e24-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-vhpkh\" (UID: \"89237cb2-9dc2-4900-9568-0ee404923e24\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vhpkh" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.557531 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb6v4\" (UniqueName: \"kubernetes.io/projected/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-kube-api-access-pb6v4\") pod \"console-f9d7485db-sl2tf\" (UID: \"163fdaa3-a29a-44bb-9da0-97b18da1c2ba\") " pod="openshift-console/console-f9d7485db-sl2tf" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.557555 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgvgh\" (UniqueName: \"kubernetes.io/projected/c2ffc1d7-11f7-4557-a771-99f2a90cdeac-kube-api-access-vgvgh\") pod \"ingress-canary-jkl6g\" (UID: \"c2ffc1d7-11f7-4557-a771-99f2a90cdeac\") " pod="openshift-ingress-canary/ingress-canary-jkl6g" 
Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.557577 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/84ee4d33-4d04-496b-b95c-61db87d00cdc-secret-volume\") pod \"collect-profiles-29566995-zptlb\" (UID: \"84ee4d33-4d04-496b-b95c-61db87d00cdc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566995-zptlb" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.557596 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b91d4316-5211-4517-8193-b8b6100f21fc-stats-auth\") pod \"router-default-5444994796-t8tpv\" (UID: \"b91d4316-5211-4517-8193-b8b6100f21fc\") " pod="openshift-ingress/router-default-5444994796-t8tpv" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.557641 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/424710af-f3c1-4cd2-9072-ae3cd248895d-registry-certificates\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.557659 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.557676 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j5kn\" (UniqueName: 
\"kubernetes.io/projected/c3cf60a6-49cd-43e9-982b-4673db42fde6-kube-api-access-4j5kn\") pod \"auto-csr-approver-29567006-jlkgl\" (UID: \"c3cf60a6-49cd-43e9-982b-4673db42fde6\") " pod="openshift-infra/auto-csr-approver-29567006-jlkgl" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.557695 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9j9k\" (UniqueName: \"kubernetes.io/projected/bcbf85cd-f480-4b6c-bbcc-300d6d895cda-kube-api-access-n9j9k\") pod \"service-ca-operator-777779d784-klv5b\" (UID: \"bcbf85cd-f480-4b6c-bbcc-300d6d895cda\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-klv5b" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.557725 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-console-oauth-config\") pod \"console-f9d7485db-sl2tf\" (UID: \"163fdaa3-a29a-44bb-9da0-97b18da1c2ba\") " pod="openshift-console/console-f9d7485db-sl2tf" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.557744 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/723d98da-617e-496c-94fe-5b49f9e8ac13-audit-policies\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.557763 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/84ee4d33-4d04-496b-b95c-61db87d00cdc-config-volume\") pod \"collect-profiles-29566995-zptlb\" (UID: \"84ee4d33-4d04-496b-b95c-61db87d00cdc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566995-zptlb" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.557783 
4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcbf85cd-f480-4b6c-bbcc-300d6d895cda-serving-cert\") pod \"service-ca-operator-777779d784-klv5b\" (UID: \"bcbf85cd-f480-4b6c-bbcc-300d6d895cda\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-klv5b" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.557802 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/75539f3c-ad23-445a-9491-12a010875fb9-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-m2xxf\" (UID: \"75539f3c-ad23-445a-9491-12a010875fb9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m2xxf" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.557828 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.557874 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbrpb\" (UniqueName: \"kubernetes.io/projected/75539f3c-ad23-445a-9491-12a010875fb9-kube-api-access-hbrpb\") pod \"cluster-image-registry-operator-dc59b4c8b-m2xxf\" (UID: \"75539f3c-ad23-445a-9491-12a010875fb9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m2xxf" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.557897 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97a8b40a-4c2d-452f-b106-d76600e17bac-config\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-wfm5h\" (UID: \"97a8b40a-4c2d-452f-b106-d76600e17bac\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wfm5h" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.557938 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b843a2c1-e2a0-4759-ab43-6d59cd85d1ca-metrics-tls\") pod \"ingress-operator-5b745b69d9-kq4bv\" (UID: \"b843a2c1-e2a0-4759-ab43-6d59cd85d1ca\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kq4bv" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.557961 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97a8b40a-4c2d-452f-b106-d76600e17bac-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-wfm5h\" (UID: \"97a8b40a-4c2d-452f-b106-d76600e17bac\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wfm5h" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.557982 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc040c1d-5241-4b62-87f5-1254db8c5a5b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qswvl\" (UID: \"fc040c1d-5241-4b62-87f5-1254db8c5a5b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qswvl" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.558018 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2992b\" (UniqueName: \"kubernetes.io/projected/1db7e800-1b43-403b-8550-b8e19b4b2f8a-kube-api-access-2992b\") pod \"machine-config-server-6pz9l\" (UID: \"1db7e800-1b43-403b-8550-b8e19b4b2f8a\") " pod="openshift-machine-config-operator/machine-config-server-6pz9l" Mar 20 15:26:39 
crc kubenswrapper[4779]: I0320 15:26:39.558043 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-service-ca\") pod \"console-f9d7485db-sl2tf\" (UID: \"163fdaa3-a29a-44bb-9da0-97b18da1c2ba\") " pod="openshift-console/console-f9d7485db-sl2tf" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.558063 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1db7e800-1b43-403b-8550-b8e19b4b2f8a-certs\") pod \"machine-config-server-6pz9l\" (UID: \"1db7e800-1b43-403b-8550-b8e19b4b2f8a\") " pod="openshift-machine-config-operator/machine-config-server-6pz9l" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.558083 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7tjh\" (UniqueName: \"kubernetes.io/projected/9aa1b944-c1b5-4f49-9615-9d373dea2b29-kube-api-access-j7tjh\") pod \"packageserver-d55dfcdfc-mflh2\" (UID: \"9aa1b944-c1b5-4f49-9615-9d373dea2b29\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mflh2" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.558149 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d221be2b-add2-48dd-a34a-05b1faa2fe53-signing-cabundle\") pod \"service-ca-9c57cc56f-jq2mp\" (UID: \"d221be2b-add2-48dd-a34a-05b1faa2fe53\") " pod="openshift-service-ca/service-ca-9c57cc56f-jq2mp" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.558177 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/75539f3c-ad23-445a-9491-12a010875fb9-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-m2xxf\" (UID: 
\"75539f3c-ad23-445a-9491-12a010875fb9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m2xxf" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.558185 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.558201 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-oauth-serving-cert\") pod \"console-f9d7485db-sl2tf\" (UID: \"163fdaa3-a29a-44bb-9da0-97b18da1c2ba\") " pod="openshift-console/console-f9d7485db-sl2tf" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.558228 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtjq8\" (UniqueName: \"kubernetes.io/projected/d27a4a21-019f-4ba4-9960-7e5c96b271d0-kube-api-access-mtjq8\") pod \"olm-operator-6b444d44fb-g4ncn\" (UID: \"d27a4a21-019f-4ba4-9960-7e5c96b271d0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4ncn" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.558245 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/424710af-f3c1-4cd2-9072-ae3cd248895d-trusted-ca\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.558258 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.558296 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d221be2b-add2-48dd-a34a-05b1faa2fe53-signing-key\") pod \"service-ca-9c57cc56f-jq2mp\" (UID: \"d221be2b-add2-48dd-a34a-05b1faa2fe53\") " pod="openshift-service-ca/service-ca-9c57cc56f-jq2mp" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.558331 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.558374 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gbfl\" (UniqueName: \"kubernetes.io/projected/b843a2c1-e2a0-4759-ab43-6d59cd85d1ca-kube-api-access-8gbfl\") pod \"ingress-operator-5b745b69d9-kq4bv\" (UID: \"b843a2c1-e2a0-4759-ab43-6d59cd85d1ca\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kq4bv" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.558398 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56964\" (UniqueName: \"kubernetes.io/projected/97a8b40a-4c2d-452f-b106-d76600e17bac-kube-api-access-56964\") pod \"kube-storage-version-migrator-operator-b67b599dd-wfm5h\" (UID: \"97a8b40a-4c2d-452f-b106-d76600e17bac\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wfm5h" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.558421 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rvsx\" (UniqueName: \"kubernetes.io/projected/f465ee1c-98d0-414f-b53c-4203e5bf29af-kube-api-access-8rvsx\") pod \"csi-hostpathplugin-7fhq9\" (UID: \"f465ee1c-98d0-414f-b53c-4203e5bf29af\") " pod="hostpath-provisioner/csi-hostpathplugin-7fhq9" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.558455 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b91d4316-5211-4517-8193-b8b6100f21fc-metrics-certs\") pod \"router-default-5444994796-t8tpv\" (UID: \"b91d4316-5211-4517-8193-b8b6100f21fc\") " pod="openshift-ingress/router-default-5444994796-t8tpv" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.558477 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4e8fbeee-8833-4bc8-9a9a-eaf0f5fb28ad-srv-cert\") pod \"catalog-operator-68c6474976-7h5q2\" (UID: \"4e8fbeee-8833-4bc8-9a9a-eaf0f5fb28ad\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7h5q2" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.558748 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60aeffe2-ff1d-4132-bb4e-997a7f5ea106-config\") pod \"kube-controller-manager-operator-78b949d7b-dzmcg\" (UID: \"60aeffe2-ff1d-4132-bb4e-997a7f5ea106\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dzmcg" Mar 20 15:26:39 crc kubenswrapper[4779]: E0320 15:26:39.559226 4779 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:40.059145406 +0000 UTC m=+217.021661276 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.562046 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/424710af-f3c1-4cd2-9072-ae3cd248895d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.564356 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b458c9c5-3878-42e1-995b-713f56d36b25-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qkwpf\" (UID: \"b458c9c5-3878-42e1-995b-713f56d36b25\") " pod="openshift-marketplace/marketplace-operator-79b997595-qkwpf" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.565829 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/424710af-f3c1-4cd2-9072-ae3cd248895d-registry-tls\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 
15:26:39.566457 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97a8b40a-4c2d-452f-b106-d76600e17bac-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-wfm5h\" (UID: \"97a8b40a-4c2d-452f-b106-d76600e17bac\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wfm5h" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.567347 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc040c1d-5241-4b62-87f5-1254db8c5a5b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qswvl\" (UID: \"fc040c1d-5241-4b62-87f5-1254db8c5a5b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qswvl" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.569066 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/75539f3c-ad23-445a-9491-12a010875fb9-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-m2xxf\" (UID: \"75539f3c-ad23-445a-9491-12a010875fb9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m2xxf" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.572580 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-service-ca\") pod \"console-f9d7485db-sl2tf\" (UID: \"163fdaa3-a29a-44bb-9da0-97b18da1c2ba\") " pod="openshift-console/console-f9d7485db-sl2tf" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.575347 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/723d98da-617e-496c-94fe-5b49f9e8ac13-audit-policies\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.575595 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.575891 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-g25w2"] Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.576699 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-oauth-serving-cert\") pod \"console-f9d7485db-sl2tf\" (UID: \"163fdaa3-a29a-44bb-9da0-97b18da1c2ba\") " pod="openshift-console/console-f9d7485db-sl2tf" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.578216 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b843a2c1-e2a0-4759-ab43-6d59cd85d1ca-metrics-tls\") pod \"ingress-operator-5b745b69d9-kq4bv\" (UID: \"b843a2c1-e2a0-4759-ab43-6d59cd85d1ca\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kq4bv" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.578627 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97a8b40a-4c2d-452f-b106-d76600e17bac-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-wfm5h\" (UID: \"97a8b40a-4c2d-452f-b106-d76600e17bac\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wfm5h" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 
15:26:39.579164 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-console-oauth-config\") pod \"console-f9d7485db-sl2tf\" (UID: \"163fdaa3-a29a-44bb-9da0-97b18da1c2ba\") " pod="openshift-console/console-f9d7485db-sl2tf" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.579175 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.580190 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.580575 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.580890 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b843a2c1-e2a0-4759-ab43-6d59cd85d1ca-trusted-ca\") pod \"ingress-operator-5b745b69d9-kq4bv\" (UID: \"b843a2c1-e2a0-4759-ab43-6d59cd85d1ca\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kq4bv" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.581882 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b91d4316-5211-4517-8193-b8b6100f21fc-stats-auth\") pod \"router-default-5444994796-t8tpv\" (UID: \"b91d4316-5211-4517-8193-b8b6100f21fc\") " pod="openshift-ingress/router-default-5444994796-t8tpv" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.582305 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/424710af-f3c1-4cd2-9072-ae3cd248895d-registry-certificates\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.583461 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.583558 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60aeffe2-ff1d-4132-bb4e-997a7f5ea106-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-dzmcg\" (UID: \"60aeffe2-ff1d-4132-bb4e-997a7f5ea106\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dzmcg" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.583616 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.584135 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.584348 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.584475 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/741b4ee7-0567-4349-8068-f10a9cd8ee68-metrics-tls\") pod \"dns-operator-744455d44c-9sgfn\" (UID: \"741b4ee7-0567-4349-8068-f10a9cd8ee68\") " pod="openshift-dns-operator/dns-operator-744455d44c-9sgfn" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.584878 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" Mar 20 15:26:39 crc 
kubenswrapper[4779]: I0320 15:26:39.587890 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/75539f3c-ad23-445a-9491-12a010875fb9-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-m2xxf\" (UID: \"75539f3c-ad23-445a-9491-12a010875fb9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m2xxf" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.588593 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/424710af-f3c1-4cd2-9072-ae3cd248895d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.593748 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zlmw\" (UniqueName: \"kubernetes.io/projected/741b4ee7-0567-4349-8068-f10a9cd8ee68-kube-api-access-7zlmw\") pod \"dns-operator-744455d44c-9sgfn\" (UID: \"741b4ee7-0567-4349-8068-f10a9cd8ee68\") " pod="openshift-dns-operator/dns-operator-744455d44c-9sgfn" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.600149 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.600806 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-console-serving-cert\") pod 
\"console-f9d7485db-sl2tf\" (UID: \"163fdaa3-a29a-44bb-9da0-97b18da1c2ba\") " pod="openshift-console/console-f9d7485db-sl2tf" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.603996 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc040c1d-5241-4b62-87f5-1254db8c5a5b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qswvl\" (UID: \"fc040c1d-5241-4b62-87f5-1254db8c5a5b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qswvl" Mar 20 15:26:39 crc kubenswrapper[4779]: W0320 15:26:39.612143 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c3e8f05_0ca8_4324_b68d_4febc2443832.slice/crio-454d28ff83773942b4421268366fbdd83270683c0a7d78f2942ef407fc7e1bc9 WatchSource:0}: Error finding container 454d28ff83773942b4421268366fbdd83270683c0a7d78f2942ef407fc7e1bc9: Status 404 returned error can't find the container with id 454d28ff83773942b4421268366fbdd83270683c0a7d78f2942ef407fc7e1bc9 Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.612344 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b91d4316-5211-4517-8193-b8b6100f21fc-metrics-certs\") pod \"router-default-5444994796-t8tpv\" (UID: \"b91d4316-5211-4517-8193-b8b6100f21fc\") " pod="openshift-ingress/router-default-5444994796-t8tpv" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.614755 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b91d4316-5211-4517-8193-b8b6100f21fc-default-certificate\") pod \"router-default-5444994796-t8tpv\" (UID: \"b91d4316-5211-4517-8193-b8b6100f21fc\") " pod="openshift-ingress/router-default-5444994796-t8tpv" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.617268 4779 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-bn8gd\" (UniqueName: \"kubernetes.io/projected/723d98da-617e-496c-94fe-5b49f9e8ac13-kube-api-access-bn8gd\") pod \"oauth-openshift-558db77b4-h48bp\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.635672 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ljzk2"] Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.636404 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hchf8\" (UniqueName: \"kubernetes.io/projected/b458c9c5-3878-42e1-995b-713f56d36b25-kube-api-access-hchf8\") pod \"marketplace-operator-79b997595-qkwpf\" (UID: \"b458c9c5-3878-42e1-995b-713f56d36b25\") " pod="openshift-marketplace/marketplace-operator-79b997595-qkwpf" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.644054 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-qjjfz"] Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.655164 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fc040c1d-5241-4b62-87f5-1254db8c5a5b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qswvl\" (UID: \"fc040c1d-5241-4b62-87f5-1254db8c5a5b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qswvl" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.655442 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldh2w" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.659673 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.659865 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgvgh\" (UniqueName: \"kubernetes.io/projected/c2ffc1d7-11f7-4557-a771-99f2a90cdeac-kube-api-access-vgvgh\") pod \"ingress-canary-jkl6g\" (UID: \"c2ffc1d7-11f7-4557-a771-99f2a90cdeac\") " pod="openshift-ingress-canary/ingress-canary-jkl6g" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.659893 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/84ee4d33-4d04-496b-b95c-61db87d00cdc-secret-volume\") pod \"collect-profiles-29566995-zptlb\" (UID: \"84ee4d33-4d04-496b-b95c-61db87d00cdc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566995-zptlb" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.659915 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j5kn\" (UniqueName: \"kubernetes.io/projected/c3cf60a6-49cd-43e9-982b-4673db42fde6-kube-api-access-4j5kn\") pod \"auto-csr-approver-29567006-jlkgl\" (UID: \"c3cf60a6-49cd-43e9-982b-4673db42fde6\") " pod="openshift-infra/auto-csr-approver-29567006-jlkgl" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.659938 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9j9k\" (UniqueName: 
\"kubernetes.io/projected/bcbf85cd-f480-4b6c-bbcc-300d6d895cda-kube-api-access-n9j9k\") pod \"service-ca-operator-777779d784-klv5b\" (UID: \"bcbf85cd-f480-4b6c-bbcc-300d6d895cda\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-klv5b" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.659960 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/84ee4d33-4d04-496b-b95c-61db87d00cdc-config-volume\") pod \"collect-profiles-29566995-zptlb\" (UID: \"84ee4d33-4d04-496b-b95c-61db87d00cdc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566995-zptlb" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.659984 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcbf85cd-f480-4b6c-bbcc-300d6d895cda-serving-cert\") pod \"service-ca-operator-777779d784-klv5b\" (UID: \"bcbf85cd-f480-4b6c-bbcc-300d6d895cda\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-klv5b" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.660019 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2992b\" (UniqueName: \"kubernetes.io/projected/1db7e800-1b43-403b-8550-b8e19b4b2f8a-kube-api-access-2992b\") pod \"machine-config-server-6pz9l\" (UID: \"1db7e800-1b43-403b-8550-b8e19b4b2f8a\") " pod="openshift-machine-config-operator/machine-config-server-6pz9l" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.660037 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1db7e800-1b43-403b-8550-b8e19b4b2f8a-certs\") pod \"machine-config-server-6pz9l\" (UID: \"1db7e800-1b43-403b-8550-b8e19b4b2f8a\") " pod="openshift-machine-config-operator/machine-config-server-6pz9l" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.660075 4779 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7tjh\" (UniqueName: \"kubernetes.io/projected/9aa1b944-c1b5-4f49-9615-9d373dea2b29-kube-api-access-j7tjh\") pod \"packageserver-d55dfcdfc-mflh2\" (UID: \"9aa1b944-c1b5-4f49-9615-9d373dea2b29\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mflh2" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.660096 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d221be2b-add2-48dd-a34a-05b1faa2fe53-signing-cabundle\") pod \"service-ca-9c57cc56f-jq2mp\" (UID: \"d221be2b-add2-48dd-a34a-05b1faa2fe53\") " pod="openshift-service-ca/service-ca-9c57cc56f-jq2mp" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.660161 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtjq8\" (UniqueName: \"kubernetes.io/projected/d27a4a21-019f-4ba4-9960-7e5c96b271d0-kube-api-access-mtjq8\") pod \"olm-operator-6b444d44fb-g4ncn\" (UID: \"d27a4a21-019f-4ba4-9960-7e5c96b271d0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4ncn" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.660180 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d221be2b-add2-48dd-a34a-05b1faa2fe53-signing-key\") pod \"service-ca-9c57cc56f-jq2mp\" (UID: \"d221be2b-add2-48dd-a34a-05b1faa2fe53\") " pod="openshift-service-ca/service-ca-9c57cc56f-jq2mp" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.660215 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rvsx\" (UniqueName: \"kubernetes.io/projected/f465ee1c-98d0-414f-b53c-4203e5bf29af-kube-api-access-8rvsx\") pod \"csi-hostpathplugin-7fhq9\" (UID: \"f465ee1c-98d0-414f-b53c-4203e5bf29af\") " pod="hostpath-provisioner/csi-hostpathplugin-7fhq9" Mar 20 
15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.660245 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4e8fbeee-8833-4bc8-9a9a-eaf0f5fb28ad-srv-cert\") pod \"catalog-operator-68c6474976-7h5q2\" (UID: \"4e8fbeee-8833-4bc8-9a9a-eaf0f5fb28ad\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7h5q2" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.660266 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d2d540f-2f23-476c-95ec-31e2e5385cc0-metrics-tls\") pod \"dns-default-n67zh\" (UID: \"7d2d540f-2f23-476c-95ec-31e2e5385cc0\") " pod="openshift-dns/dns-default-n67zh" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.660281 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb7nc\" (UniqueName: \"kubernetes.io/projected/241498e1-0730-4ae5-afd1-4b99b66fbcf5-kube-api-access-xb7nc\") pod \"package-server-manager-789f6589d5-rq8f8\" (UID: \"241498e1-0730-4ae5-afd1-4b99b66fbcf5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rq8f8" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.660298 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d2d540f-2f23-476c-95ec-31e2e5385cc0-config-volume\") pod \"dns-default-n67zh\" (UID: \"7d2d540f-2f23-476c-95ec-31e2e5385cc0\") " pod="openshift-dns/dns-default-n67zh" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.660317 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d27a4a21-019f-4ba4-9960-7e5c96b271d0-srv-cert\") pod \"olm-operator-6b444d44fb-g4ncn\" (UID: \"d27a4a21-019f-4ba4-9960-7e5c96b271d0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4ncn" Mar 20 
15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.660335 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/241498e1-0730-4ae5-afd1-4b99b66fbcf5-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rq8f8\" (UID: \"241498e1-0730-4ae5-afd1-4b99b66fbcf5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rq8f8" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.660354 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2ffc1d7-11f7-4557-a771-99f2a90cdeac-cert\") pod \"ingress-canary-jkl6g\" (UID: \"c2ffc1d7-11f7-4557-a771-99f2a90cdeac\") " pod="openshift-ingress-canary/ingress-canary-jkl6g" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.660375 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/782d0d60-1af7-40ea-bf2f-845507e0b054-auth-proxy-config\") pod \"machine-config-operator-74547568cd-5rp8w\" (UID: \"782d0d60-1af7-40ea-bf2f-845507e0b054\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5rp8w" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.660407 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/782d0d60-1af7-40ea-bf2f-845507e0b054-images\") pod \"machine-config-operator-74547568cd-5rp8w\" (UID: \"782d0d60-1af7-40ea-bf2f-845507e0b054\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5rp8w" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.660428 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1db7e800-1b43-403b-8550-b8e19b4b2f8a-node-bootstrap-token\") pod 
\"machine-config-server-6pz9l\" (UID: \"1db7e800-1b43-403b-8550-b8e19b4b2f8a\") " pod="openshift-machine-config-operator/machine-config-server-6pz9l" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.660658 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vhlq\" (UniqueName: \"kubernetes.io/projected/7d2d540f-2f23-476c-95ec-31e2e5385cc0-kube-api-access-6vhlq\") pod \"dns-default-n67zh\" (UID: \"7d2d540f-2f23-476c-95ec-31e2e5385cc0\") " pod="openshift-dns/dns-default-n67zh" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.660678 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f465ee1c-98d0-414f-b53c-4203e5bf29af-csi-data-dir\") pod \"csi-hostpathplugin-7fhq9\" (UID: \"f465ee1c-98d0-414f-b53c-4203e5bf29af\") " pod="hostpath-provisioner/csi-hostpathplugin-7fhq9" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.660710 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/89237cb2-9dc2-4900-9568-0ee404923e24-proxy-tls\") pod \"machine-config-controller-84d6567774-vhpkh\" (UID: \"89237cb2-9dc2-4900-9568-0ee404923e24\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vhpkh" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.660726 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f465ee1c-98d0-414f-b53c-4203e5bf29af-registration-dir\") pod \"csi-hostpathplugin-7fhq9\" (UID: \"f465ee1c-98d0-414f-b53c-4203e5bf29af\") " pod="hostpath-provisioner/csi-hostpathplugin-7fhq9" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.660747 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/4e8fbeee-8833-4bc8-9a9a-eaf0f5fb28ad-profile-collector-cert\") pod \"catalog-operator-68c6474976-7h5q2\" (UID: \"4e8fbeee-8833-4bc8-9a9a-eaf0f5fb28ad\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7h5q2" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.660775 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4stb\" (UniqueName: \"kubernetes.io/projected/4e8fbeee-8833-4bc8-9a9a-eaf0f5fb28ad-kube-api-access-b4stb\") pod \"catalog-operator-68c6474976-7h5q2\" (UID: \"4e8fbeee-8833-4bc8-9a9a-eaf0f5fb28ad\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7h5q2" Mar 20 15:26:39 crc kubenswrapper[4779]: E0320 15:26:39.660917 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:40.160896985 +0000 UTC m=+217.123412775 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.665793 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d27a4a21-019f-4ba4-9960-7e5c96b271d0-srv-cert\") pod \"olm-operator-6b444d44fb-g4ncn\" (UID: \"d27a4a21-019f-4ba4-9960-7e5c96b271d0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4ncn" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.666270 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9aa1b944-c1b5-4f49-9615-9d373dea2b29-apiservice-cert\") pod \"packageserver-d55dfcdfc-mflh2\" (UID: \"9aa1b944-c1b5-4f49-9615-9d373dea2b29\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mflh2" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.666349 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.666404 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmvv8\" (UniqueName: \"kubernetes.io/projected/84ee4d33-4d04-496b-b95c-61db87d00cdc-kube-api-access-nmvv8\") pod 
\"collect-profiles-29566995-zptlb\" (UID: \"84ee4d33-4d04-496b-b95c-61db87d00cdc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566995-zptlb" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.666422 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcbf85cd-f480-4b6c-bbcc-300d6d895cda-config\") pod \"service-ca-operator-777779d784-klv5b\" (UID: \"bcbf85cd-f480-4b6c-bbcc-300d6d895cda\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-klv5b" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.666450 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f465ee1c-98d0-414f-b53c-4203e5bf29af-plugins-dir\") pod \"csi-hostpathplugin-7fhq9\" (UID: \"f465ee1c-98d0-414f-b53c-4203e5bf29af\") " pod="hostpath-provisioner/csi-hostpathplugin-7fhq9" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.666474 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcbf85cd-f480-4b6c-bbcc-300d6d895cda-serving-cert\") pod \"service-ca-operator-777779d784-klv5b\" (UID: \"bcbf85cd-f480-4b6c-bbcc-300d6d895cda\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-klv5b" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.666478 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/dc48f23d-d4c1-4392-b853-42ef3219bcbe-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6ghwq\" (UID: \"dc48f23d-d4c1-4392-b853-42ef3219bcbe\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6ghwq" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.666524 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/d27a4a21-019f-4ba4-9960-7e5c96b271d0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-g4ncn\" (UID: \"d27a4a21-019f-4ba4-9960-7e5c96b271d0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4ncn" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.666575 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f465ee1c-98d0-414f-b53c-4203e5bf29af-socket-dir\") pod \"csi-hostpathplugin-7fhq9\" (UID: \"f465ee1c-98d0-414f-b53c-4203e5bf29af\") " pod="hostpath-provisioner/csi-hostpathplugin-7fhq9" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.666602 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfk9k\" (UniqueName: \"kubernetes.io/projected/d221be2b-add2-48dd-a34a-05b1faa2fe53-kube-api-access-nfk9k\") pod \"service-ca-9c57cc56f-jq2mp\" (UID: \"d221be2b-add2-48dd-a34a-05b1faa2fe53\") " pod="openshift-service-ca/service-ca-9c57cc56f-jq2mp" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.666641 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh9j5\" (UniqueName: \"kubernetes.io/projected/782d0d60-1af7-40ea-bf2f-845507e0b054-kube-api-access-rh9j5\") pod \"machine-config-operator-74547568cd-5rp8w\" (UID: \"782d0d60-1af7-40ea-bf2f-845507e0b054\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5rp8w" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.666464 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/84ee4d33-4d04-496b-b95c-61db87d00cdc-config-volume\") pod \"collect-profiles-29566995-zptlb\" (UID: \"84ee4d33-4d04-496b-b95c-61db87d00cdc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566995-zptlb" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.666676 4779 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/782d0d60-1af7-40ea-bf2f-845507e0b054-proxy-tls\") pod \"machine-config-operator-74547568cd-5rp8w\" (UID: \"782d0d60-1af7-40ea-bf2f-845507e0b054\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5rp8w" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.666708 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czknv\" (UniqueName: \"kubernetes.io/projected/dc48f23d-d4c1-4392-b853-42ef3219bcbe-kube-api-access-czknv\") pod \"multus-admission-controller-857f4d67dd-6ghwq\" (UID: \"dc48f23d-d4c1-4392-b853-42ef3219bcbe\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6ghwq" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.666727 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9aa1b944-c1b5-4f49-9615-9d373dea2b29-webhook-cert\") pod \"packageserver-d55dfcdfc-mflh2\" (UID: \"9aa1b944-c1b5-4f49-9615-9d373dea2b29\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mflh2" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.666746 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f465ee1c-98d0-414f-b53c-4203e5bf29af-mountpoint-dir\") pod \"csi-hostpathplugin-7fhq9\" (UID: \"f465ee1c-98d0-414f-b53c-4203e5bf29af\") " pod="hostpath-provisioner/csi-hostpathplugin-7fhq9" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.666767 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/9aa1b944-c1b5-4f49-9615-9d373dea2b29-tmpfs\") pod \"packageserver-d55dfcdfc-mflh2\" (UID: \"9aa1b944-c1b5-4f49-9615-9d373dea2b29\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mflh2" Mar 20 15:26:39 crc 
kubenswrapper[4779]: I0320 15:26:39.666809 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqxrq\" (UniqueName: \"kubernetes.io/projected/89237cb2-9dc2-4900-9568-0ee404923e24-kube-api-access-bqxrq\") pod \"machine-config-controller-84d6567774-vhpkh\" (UID: \"89237cb2-9dc2-4900-9568-0ee404923e24\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vhpkh" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.666831 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/89237cb2-9dc2-4900-9568-0ee404923e24-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-vhpkh\" (UID: \"89237cb2-9dc2-4900-9568-0ee404923e24\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vhpkh" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.667920 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/89237cb2-9dc2-4900-9568-0ee404923e24-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-vhpkh\" (UID: \"89237cb2-9dc2-4900-9568-0ee404923e24\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vhpkh" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.669486 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/dc48f23d-d4c1-4392-b853-42ef3219bcbe-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6ghwq\" (UID: \"dc48f23d-d4c1-4392-b853-42ef3219bcbe\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6ghwq" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.669819 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/f465ee1c-98d0-414f-b53c-4203e5bf29af-registration-dir\") pod \"csi-hostpathplugin-7fhq9\" (UID: \"f465ee1c-98d0-414f-b53c-4203e5bf29af\") " pod="hostpath-provisioner/csi-hostpathplugin-7fhq9" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.671734 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d27a4a21-019f-4ba4-9960-7e5c96b271d0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-g4ncn\" (UID: \"d27a4a21-019f-4ba4-9960-7e5c96b271d0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4ncn" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.671815 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f465ee1c-98d0-414f-b53c-4203e5bf29af-socket-dir\") pod \"csi-hostpathplugin-7fhq9\" (UID: \"f465ee1c-98d0-414f-b53c-4203e5bf29af\") " pod="hostpath-provisioner/csi-hostpathplugin-7fhq9" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.673535 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/89237cb2-9dc2-4900-9568-0ee404923e24-proxy-tls\") pod \"machine-config-controller-84d6567774-vhpkh\" (UID: \"89237cb2-9dc2-4900-9568-0ee404923e24\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vhpkh" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.677217 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/782d0d60-1af7-40ea-bf2f-845507e0b054-images\") pod \"machine-config-operator-74547568cd-5rp8w\" (UID: \"782d0d60-1af7-40ea-bf2f-845507e0b054\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5rp8w" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.678008 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d221be2b-add2-48dd-a34a-05b1faa2fe53-signing-cabundle\") pod \"service-ca-9c57cc56f-jq2mp\" (UID: \"d221be2b-add2-48dd-a34a-05b1faa2fe53\") " pod="openshift-service-ca/service-ca-9c57cc56f-jq2mp" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.678366 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.679632 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/241498e1-0730-4ae5-afd1-4b99b66fbcf5-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rq8f8\" (UID: \"241498e1-0730-4ae5-afd1-4b99b66fbcf5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rq8f8" Mar 20 15:26:39 crc kubenswrapper[4779]: E0320 15:26:39.679985 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:40.179969694 +0000 UTC m=+217.142485494 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.680642 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f465ee1c-98d0-414f-b53c-4203e5bf29af-csi-data-dir\") pod \"csi-hostpathplugin-7fhq9\" (UID: \"f465ee1c-98d0-414f-b53c-4203e5bf29af\") " pod="hostpath-provisioner/csi-hostpathplugin-7fhq9" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.680797 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcbf85cd-f480-4b6c-bbcc-300d6d895cda-config\") pod \"service-ca-operator-777779d784-klv5b\" (UID: \"bcbf85cd-f480-4b6c-bbcc-300d6d895cda\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-klv5b" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.681300 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/84ee4d33-4d04-496b-b95c-61db87d00cdc-secret-volume\") pod \"collect-profiles-29566995-zptlb\" (UID: \"84ee4d33-4d04-496b-b95c-61db87d00cdc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566995-zptlb" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.681349 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f465ee1c-98d0-414f-b53c-4203e5bf29af-plugins-dir\") pod \"csi-hostpathplugin-7fhq9\" (UID: \"f465ee1c-98d0-414f-b53c-4203e5bf29af\") " 
pod="hostpath-provisioner/csi-hostpathplugin-7fhq9" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.681375 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f465ee1c-98d0-414f-b53c-4203e5bf29af-mountpoint-dir\") pod \"csi-hostpathplugin-7fhq9\" (UID: \"f465ee1c-98d0-414f-b53c-4203e5bf29af\") " pod="hostpath-provisioner/csi-hostpathplugin-7fhq9" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.681414 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1db7e800-1b43-403b-8550-b8e19b4b2f8a-node-bootstrap-token\") pod \"machine-config-server-6pz9l\" (UID: \"1db7e800-1b43-403b-8550-b8e19b4b2f8a\") " pod="openshift-machine-config-operator/machine-config-server-6pz9l" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.681438 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/782d0d60-1af7-40ea-bf2f-845507e0b054-auth-proxy-config\") pod \"machine-config-operator-74547568cd-5rp8w\" (UID: \"782d0d60-1af7-40ea-bf2f-845507e0b054\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5rp8w" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.681818 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/9aa1b944-c1b5-4f49-9615-9d373dea2b29-tmpfs\") pod \"packageserver-d55dfcdfc-mflh2\" (UID: \"9aa1b944-c1b5-4f49-9615-9d373dea2b29\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mflh2" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.682029 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9sgfn" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.682083 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d2d540f-2f23-476c-95ec-31e2e5385cc0-config-volume\") pod \"dns-default-n67zh\" (UID: \"7d2d540f-2f23-476c-95ec-31e2e5385cc0\") " pod="openshift-dns/dns-default-n67zh" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.690418 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d2d540f-2f23-476c-95ec-31e2e5385cc0-metrics-tls\") pod \"dns-default-n67zh\" (UID: \"7d2d540f-2f23-476c-95ec-31e2e5385cc0\") " pod="openshift-dns/dns-default-n67zh" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.690591 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2ffc1d7-11f7-4557-a771-99f2a90cdeac-cert\") pod \"ingress-canary-jkl6g\" (UID: \"c2ffc1d7-11f7-4557-a771-99f2a90cdeac\") " pod="openshift-ingress-canary/ingress-canary-jkl6g" Mar 20 15:26:39 crc kubenswrapper[4779]: W0320 15:26:39.690690 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9940c61f_5e5d_4ffb_99e2_0ced2ccc225d.slice/crio-fe4bdb8c96774466055a0c1581fb020939f4a1564ef069c51a2d85ec5e5cdc54 WatchSource:0}: Error finding container fe4bdb8c96774466055a0c1581fb020939f4a1564ef069c51a2d85ec5e5cdc54: Status 404 returned error can't find the container with id fe4bdb8c96774466055a0c1581fb020939f4a1564ef069c51a2d85ec5e5cdc54 Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.690928 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4e8fbeee-8833-4bc8-9a9a-eaf0f5fb28ad-profile-collector-cert\") pod \"catalog-operator-68c6474976-7h5q2\" 
(UID: \"4e8fbeee-8833-4bc8-9a9a-eaf0f5fb28ad\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7h5q2" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.690937 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn8bx\" (UniqueName: \"kubernetes.io/projected/424710af-f3c1-4cd2-9072-ae3cd248895d-kube-api-access-cn8bx\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.691323 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1db7e800-1b43-403b-8550-b8e19b4b2f8a-certs\") pod \"machine-config-server-6pz9l\" (UID: \"1db7e800-1b43-403b-8550-b8e19b4b2f8a\") " pod="openshift-machine-config-operator/machine-config-server-6pz9l" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.691525 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9aa1b944-c1b5-4f49-9615-9d373dea2b29-apiservice-cert\") pod \"packageserver-d55dfcdfc-mflh2\" (UID: \"9aa1b944-c1b5-4f49-9615-9d373dea2b29\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mflh2" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.692876 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4e8fbeee-8833-4bc8-9a9a-eaf0f5fb28ad-srv-cert\") pod \"catalog-operator-68c6474976-7h5q2\" (UID: \"4e8fbeee-8833-4bc8-9a9a-eaf0f5fb28ad\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7h5q2" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.693762 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6m957" 
event={"ID":"68436d48-9e60-4450-a505-e75eaabb5e20","Type":"ContainerStarted","Data":"6a388ebf24e61f899f396cfeb89c8e71d9a53abca65c36359e774570b24eff8c"} Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.693805 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6m957" event={"ID":"68436d48-9e60-4450-a505-e75eaabb5e20","Type":"ContainerStarted","Data":"07ad5ea1358a9020c0295721dc2f9456b0150696ad12f12a145c983695337638"} Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.694046 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/782d0d60-1af7-40ea-bf2f-845507e0b054-proxy-tls\") pod \"machine-config-operator-74547568cd-5rp8w\" (UID: \"782d0d60-1af7-40ea-bf2f-845507e0b054\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5rp8w" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.694561 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qjjfz" event={"ID":"e52093ea-9241-43c1-ae08-d9e87beed327","Type":"ContainerStarted","Data":"06afc3a63e77b7d9ef4fefe378820c04b0ebdf1d033f27dd568f49599ed5c052"} Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.694675 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9aa1b944-c1b5-4f49-9615-9d373dea2b29-webhook-cert\") pod \"packageserver-d55dfcdfc-mflh2\" (UID: \"9aa1b944-c1b5-4f49-9615-9d373dea2b29\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mflh2" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.694712 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d221be2b-add2-48dd-a34a-05b1faa2fe53-signing-key\") pod \"service-ca-9c57cc56f-jq2mp\" (UID: \"d221be2b-add2-48dd-a34a-05b1faa2fe53\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-jq2mp" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.696279 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n2qlf" event={"ID":"f6f3c10f-738c-46df-a158-0f59855e0e7d","Type":"ContainerStarted","Data":"9f32146b39e52ebaac9945c8b3bde7c25aa7e77f6f14141c1ea53abe35cc9d78"} Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.703688 4779 generic.go:334] "Generic (PLEG): container finished" podID="443dd34a-3cde-4e9b-8014-a8dacc68727b" containerID="6ab3e7c9337542b76b380fb2fcb3b3d14fba834afe6bffe618b4d7fe8dd861af" exitCode=0 Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.703751 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ln9qd" event={"ID":"443dd34a-3cde-4e9b-8014-a8dacc68727b","Type":"ContainerDied","Data":"6ab3e7c9337542b76b380fb2fcb3b3d14fba834afe6bffe618b4d7fe8dd861af"} Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.707846 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g25w2" event={"ID":"9c3e8f05-0ca8-4324-b68d-4febc2443832","Type":"ContainerStarted","Data":"454d28ff83773942b4421268366fbdd83270683c0a7d78f2942ef407fc7e1bc9"} Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.718337 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-6mhgz" event={"ID":"3987175b-df07-4970-9c38-fd7cc25a2586","Type":"ContainerStarted","Data":"abaf1167be2e7192aeb1780310b53c02fba99e01acbf8d1420c73d87f284ad2b"} Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.718417 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-6mhgz" 
event={"ID":"3987175b-df07-4970-9c38-fd7cc25a2586","Type":"ContainerStarted","Data":"7fd8a3869f48da534d9e5737670ca91ca08b4ece20f51f246d471a4da1bb2224"} Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.718722 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/424710af-f3c1-4cd2-9072-ae3cd248895d-bound-sa-token\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.724524 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-spxhw" event={"ID":"b46569bb-4450-4b94-8615-e8c4a3afd495","Type":"ContainerStarted","Data":"33b8d0f5c77fa808e6324ad1886105c853ba85569313c6c6ffcc6bf1c9660eeb"} Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.727456 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-spxhw" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.729988 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/60aeffe2-ff1d-4132-bb4e-997a7f5ea106-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-dzmcg\" (UID: \"60aeffe2-ff1d-4132-bb4e-997a7f5ea106\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dzmcg" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.730121 4779 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-spxhw container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.730158 4779 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-spxhw" podUID="b46569bb-4450-4b94-8615-e8c4a3afd495" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.732071 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-tdtkz"] Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.734578 4779 generic.go:334] "Generic (PLEG): container finished" podID="e4e22379-e251-4cf3-a9d5-f1c3c026eb79" containerID="f935f031c9f2b17acbc993b7260fd52783d409ab37ab9afc0fcbf3b2192b21ad" exitCode=0 Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.736967 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" event={"ID":"e4e22379-e251-4cf3-a9d5-f1c3c026eb79","Type":"ContainerDied","Data":"f935f031c9f2b17acbc993b7260fd52783d409ab37ab9afc0fcbf3b2192b21ad"} Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.741470 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h46dt" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.752937 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nmqc\" (UniqueName: \"kubernetes.io/projected/b91d4316-5211-4517-8193-b8b6100f21fc-kube-api-access-6nmqc\") pod \"router-default-5444994796-t8tpv\" (UID: \"b91d4316-5211-4517-8193-b8b6100f21fc\") " pod="openshift-ingress/router-default-5444994796-t8tpv" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.758408 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qswvl" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.765448 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cwvxv"] Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.768883 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gbfl\" (UniqueName: \"kubernetes.io/projected/b843a2c1-e2a0-4759-ab43-6d59cd85d1ca-kube-api-access-8gbfl\") pod \"ingress-operator-5b745b69d9-kq4bv\" (UID: \"b843a2c1-e2a0-4759-ab43-6d59cd85d1ca\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kq4bv" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.769266 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:39 crc kubenswrapper[4779]: E0320 15:26:39.769551 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:40.269528636 +0000 UTC m=+217.232044436 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.770035 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:39 crc kubenswrapper[4779]: E0320 15:26:39.770410 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:40.270403437 +0000 UTC m=+217.232919237 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.787203 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qkwpf" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.789311 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-57vjj"] Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.805075 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/75539f3c-ad23-445a-9491-12a010875fb9-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-m2xxf\" (UID: \"75539f3c-ad23-445a-9491-12a010875fb9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m2xxf" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.809080 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbrpb\" (UniqueName: \"kubernetes.io/projected/75539f3c-ad23-445a-9491-12a010875fb9-kube-api-access-hbrpb\") pod \"cluster-image-registry-operator-dc59b4c8b-m2xxf\" (UID: \"75539f3c-ad23-445a-9491-12a010875fb9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m2xxf" Mar 20 15:26:39 crc kubenswrapper[4779]: W0320 15:26:39.813041 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd120d68e_0458_4755_9ca7_3a34c000842d.slice/crio-c10eba96d226434d44f9eb8b0c573d58189a4225e821b4f88028cfccc449cd4a WatchSource:0}: Error finding container c10eba96d226434d44f9eb8b0c573d58189a4225e821b4f88028cfccc449cd4a: Status 404 returned error can't find the container with id c10eba96d226434d44f9eb8b0c573d58189a4225e821b4f88028cfccc449cd4a Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.830568 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56964\" (UniqueName: \"kubernetes.io/projected/97a8b40a-4c2d-452f-b106-d76600e17bac-kube-api-access-56964\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-wfm5h\" (UID: \"97a8b40a-4c2d-452f-b106-d76600e17bac\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wfm5h" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.841135 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zv9zm"] Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.841165 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-btkk5"] Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.850009 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b843a2c1-e2a0-4759-ab43-6d59cd85d1ca-bound-sa-token\") pod \"ingress-operator-5b745b69d9-kq4bv\" (UID: \"b843a2c1-e2a0-4759-ab43-6d59cd85d1ca\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kq4bv" Mar 20 15:26:39 crc kubenswrapper[4779]: W0320 15:26:39.865332 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddefe5875_5311_4ccc_9360_a0e55e2ccdb9.slice/crio-47cad05b5540f5fb59d5ede0572ca4c861563a6804e2a03fda878f2b2f4c2713 WatchSource:0}: Error finding container 47cad05b5540f5fb59d5ede0572ca4c861563a6804e2a03fda878f2b2f4c2713: Status 404 returned error can't find the container with id 47cad05b5540f5fb59d5ede0572ca4c861563a6804e2a03fda878f2b2f4c2713 Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.868858 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb6v4\" (UniqueName: \"kubernetes.io/projected/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-kube-api-access-pb6v4\") pod \"console-f9d7485db-sl2tf\" (UID: \"163fdaa3-a29a-44bb-9da0-97b18da1c2ba\") " pod="openshift-console/console-f9d7485db-sl2tf" Mar 20 15:26:39 crc 
kubenswrapper[4779]: I0320 15:26:39.871812 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:39 crc kubenswrapper[4779]: E0320 15:26:39.871968 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:40.37193281 +0000 UTC m=+217.334448600 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:39 crc kubenswrapper[4779]: W0320 15:26:39.881782 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc1adb9e_7682_482c_8162_a9bf3acdfa5b.slice/crio-d17ee462366928da7c8c2e3e1530b4e00de98a1bfc66460098f9f08918b3ce74 WatchSource:0}: Error finding container d17ee462366928da7c8c2e3e1530b4e00de98a1bfc66460098f9f08918b3ce74: Status 404 returned error can't find the container with id d17ee462366928da7c8c2e3e1530b4e00de98a1bfc66460098f9f08918b3ce74 Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.883200 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:39 crc kubenswrapper[4779]: E0320 15:26:39.884343 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:40.384329722 +0000 UTC m=+217.346845522 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.890855 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9j9k\" (UniqueName: \"kubernetes.io/projected/bcbf85cd-f480-4b6c-bbcc-300d6d895cda-kube-api-access-n9j9k\") pod \"service-ca-operator-777779d784-klv5b\" (UID: \"bcbf85cd-f480-4b6c-bbcc-300d6d895cda\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-klv5b" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.900942 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-k68xk"] Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.907912 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j5kn\" (UniqueName: \"kubernetes.io/projected/c3cf60a6-49cd-43e9-982b-4673db42fde6-kube-api-access-4j5kn\") pod \"auto-csr-approver-29567006-jlkgl\" (UID: 
\"c3cf60a6-49cd-43e9-982b-4673db42fde6\") " pod="openshift-infra/auto-csr-approver-29567006-jlkgl" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.930177 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2992b\" (UniqueName: \"kubernetes.io/projected/1db7e800-1b43-403b-8550-b8e19b4b2f8a-kube-api-access-2992b\") pod \"machine-config-server-6pz9l\" (UID: \"1db7e800-1b43-403b-8550-b8e19b4b2f8a\") " pod="openshift-machine-config-operator/machine-config-server-6pz9l" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.939376 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-6pz9l" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.944387 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-sl2tf" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.946784 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfk9k\" (UniqueName: \"kubernetes.io/projected/d221be2b-add2-48dd-a34a-05b1faa2fe53-kube-api-access-nfk9k\") pod \"service-ca-9c57cc56f-jq2mp\" (UID: \"d221be2b-add2-48dd-a34a-05b1faa2fe53\") " pod="openshift-service-ca/service-ca-9c57cc56f-jq2mp" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.969241 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh9j5\" (UniqueName: \"kubernetes.io/projected/782d0d60-1af7-40ea-bf2f-845507e0b054-kube-api-access-rh9j5\") pod \"machine-config-operator-74547568cd-5rp8w\" (UID: \"782d0d60-1af7-40ea-bf2f-845507e0b054\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5rp8w" Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.984591 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:39 crc kubenswrapper[4779]: E0320 15:26:39.985013 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:40.484998901 +0000 UTC m=+217.447514701 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.988163 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldh2w"] Mar 20 15:26:39 crc kubenswrapper[4779]: I0320 15:26:39.991939 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgvgh\" (UniqueName: \"kubernetes.io/projected/c2ffc1d7-11f7-4557-a771-99f2a90cdeac-kube-api-access-vgvgh\") pod \"ingress-canary-jkl6g\" (UID: \"c2ffc1d7-11f7-4557-a771-99f2a90cdeac\") " pod="openshift-ingress-canary/ingress-canary-jkl6g" Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.007124 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rvsx\" (UniqueName: \"kubernetes.io/projected/f465ee1c-98d0-414f-b53c-4203e5bf29af-kube-api-access-8rvsx\") pod \"csi-hostpathplugin-7fhq9\" (UID: \"f465ee1c-98d0-414f-b53c-4203e5bf29af\") " 
pod="hostpath-provisioner/csi-hostpathplugin-7fhq9" Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.017122 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-t8tpv" Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.026998 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dzmcg" Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.027369 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-h48bp"] Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.039920 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7tjh\" (UniqueName: \"kubernetes.io/projected/9aa1b944-c1b5-4f49-9615-9d373dea2b29-kube-api-access-j7tjh\") pod \"packageserver-d55dfcdfc-mflh2\" (UID: \"9aa1b944-c1b5-4f49-9615-9d373dea2b29\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mflh2" Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.048330 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kq4bv" Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.050283 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtjq8\" (UniqueName: \"kubernetes.io/projected/d27a4a21-019f-4ba4-9960-7e5c96b271d0-kube-api-access-mtjq8\") pod \"olm-operator-6b444d44fb-g4ncn\" (UID: \"d27a4a21-019f-4ba4-9960-7e5c96b271d0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4ncn" Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.062339 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m2xxf" Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.062495 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9sgfn"] Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.069314 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wfm5h" Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.071651 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4stb\" (UniqueName: \"kubernetes.io/projected/4e8fbeee-8833-4bc8-9a9a-eaf0f5fb28ad-kube-api-access-b4stb\") pod \"catalog-operator-68c6474976-7h5q2\" (UID: \"4e8fbeee-8833-4bc8-9a9a-eaf0f5fb28ad\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7h5q2" Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.086156 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.088430 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmvv8\" (UniqueName: \"kubernetes.io/projected/84ee4d33-4d04-496b-b95c-61db87d00cdc-kube-api-access-nmvv8\") pod \"collect-profiles-29566995-zptlb\" (UID: \"84ee4d33-4d04-496b-b95c-61db87d00cdc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566995-zptlb" Mar 20 15:26:40 crc kubenswrapper[4779]: E0320 15:26:40.095093 4779 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:40.595060859 +0000 UTC m=+217.557576729 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.108909 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5rp8w" Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.114283 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mflh2" Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.118687 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vhlq\" (UniqueName: \"kubernetes.io/projected/7d2d540f-2f23-476c-95ec-31e2e5385cc0-kube-api-access-6vhlq\") pod \"dns-default-n67zh\" (UID: \"7d2d540f-2f23-476c-95ec-31e2e5385cc0\") " pod="openshift-dns/dns-default-n67zh" Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.121252 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7h5q2" Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.129339 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-klv5b" Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.135069 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-jq2mp" Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.141586 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czknv\" (UniqueName: \"kubernetes.io/projected/dc48f23d-d4c1-4392-b853-42ef3219bcbe-kube-api-access-czknv\") pod \"multus-admission-controller-857f4d67dd-6ghwq\" (UID: \"dc48f23d-d4c1-4392-b853-42ef3219bcbe\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6ghwq" Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.141828 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566995-zptlb" Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.158177 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqxrq\" (UniqueName: \"kubernetes.io/projected/89237cb2-9dc2-4900-9568-0ee404923e24-kube-api-access-bqxrq\") pod \"machine-config-controller-84d6567774-vhpkh\" (UID: \"89237cb2-9dc2-4900-9568-0ee404923e24\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vhpkh" Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.169866 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb7nc\" (UniqueName: \"kubernetes.io/projected/241498e1-0730-4ae5-afd1-4b99b66fbcf5-kube-api-access-xb7nc\") pod \"package-server-manager-789f6589d5-rq8f8\" (UID: \"241498e1-0730-4ae5-afd1-4b99b66fbcf5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rq8f8" Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.187351 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4ncn" Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.188063 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.195756 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567006-jlkgl" Mar 20 15:26:40 crc kubenswrapper[4779]: E0320 15:26:40.195953 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:40.695927354 +0000 UTC m=+217.658443154 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.196425 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rq8f8" Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.201798 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-n67zh" Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.207900 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jkl6g" Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.215914 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-7fhq9" Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.290949 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:40 crc kubenswrapper[4779]: E0320 15:26:40.291522 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:40.791511137 +0000 UTC m=+217.754026937 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.390518 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vhpkh" Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.391741 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:40 crc kubenswrapper[4779]: E0320 15:26:40.392063 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:40.892048334 +0000 UTC m=+217.854564134 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.398224 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-6ghwq" Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.492788 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:40 crc kubenswrapper[4779]: E0320 15:26:40.493190 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:40.993176166 +0000 UTC m=+217.955691966 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.542469 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qswvl"] Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.596544 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:40 crc 
kubenswrapper[4779]: E0320 15:26:40.596915 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:41.096893475 +0000 UTC m=+218.059409275 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.597038 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:40 crc kubenswrapper[4779]: E0320 15:26:40.597385 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:41.097377116 +0000 UTC m=+218.059892916 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.622159 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qkwpf"] Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.665996 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-sl2tf"] Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.707889 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:40 crc kubenswrapper[4779]: E0320 15:26:40.708183 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:41.208140461 +0000 UTC m=+218.170656261 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.717009 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:40 crc kubenswrapper[4779]: E0320 15:26:40.717364 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:41.217351723 +0000 UTC m=+218.179867523 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.743472 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dzmcg"] Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.765428 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6m957" event={"ID":"68436d48-9e60-4450-a505-e75eaabb5e20","Type":"ContainerStarted","Data":"60a84c303a01792bdf3aaa696ac1f63ef9f19309fad85f697d59ea7aa6f21607"} Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.767186 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9sgfn" event={"ID":"741b4ee7-0567-4349-8068-f10a9cd8ee68","Type":"ContainerStarted","Data":"3e9d015aeca2aaf3ae58b1778d113e2851328337d227ca42cc4df397684cb9dc"} Mar 20 15:26:40 crc kubenswrapper[4779]: W0320 15:26:40.787916 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb458c9c5_3878_42e1_995b_713f56d36b25.slice/crio-bc800091d5b4f91c56b3c3976690879cbbe3c1470ec61046392b065384467807 WatchSource:0}: Error finding container bc800091d5b4f91c56b3c3976690879cbbe3c1470ec61046392b065384467807: Status 404 returned error can't find the container with id bc800091d5b4f91c56b3c3976690879cbbe3c1470ec61046392b065384467807 Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.799133 4779 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cwvxv" event={"ID":"849b44ce-b317-4a93-a453-4f36844fbff8","Type":"ContainerStarted","Data":"0b6a4d0e48bb06a433ebeea0346e70192dd61f56237eab0c290c074df5aaae60"} Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.799187 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cwvxv" event={"ID":"849b44ce-b317-4a93-a453-4f36844fbff8","Type":"ContainerStarted","Data":"c6efbc0457dd1bbff22595c66c4bd3bcab955cd93f86f378616805c55958a3d7"} Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.810201 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" event={"ID":"e4e22379-e251-4cf3-a9d5-f1c3c026eb79","Type":"ContainerStarted","Data":"f86834a8d03f9aee4ed966dd7551841f9d0f7ac20d34fb5d489e3c2b1d89af47"} Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.818366 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:40 crc kubenswrapper[4779]: E0320 15:26:40.819617 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:41.319595583 +0000 UTC m=+218.282111383 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.864380 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qjjfz" event={"ID":"e52093ea-9241-43c1-ae08-d9e87beed327","Type":"ContainerStarted","Data":"59e604ea232cb658ff226eb7e611f4912682956ec71d34aafa0bbd0349879ef5"} Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.864935 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-qjjfz" Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.865775 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-spxhw" podStartSLOduration=171.865760813 podStartE2EDuration="2m51.865760813s" podCreationTimestamp="2026-03-20 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:40.864500042 +0000 UTC m=+217.827015842" watchObservedRunningTime="2026-03-20 15:26:40.865760813 +0000 UTC m=+217.828276613" Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.871747 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zv9zm" event={"ID":"bc1adb9e-7682-482c-8162-a9bf3acdfa5b","Type":"ContainerStarted","Data":"d17ee462366928da7c8c2e3e1530b4e00de98a1bfc66460098f9f08918b3ce74"} Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.871875 4779 
patch_prober.go:28] interesting pod/downloads-7954f5f757-qjjfz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.871910 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qjjfz" podUID="e52093ea-9241-43c1-ae08-d9e87beed327" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.875711 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-kq4bv"] Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.918042 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n2qlf" podStartSLOduration=172.918022797 podStartE2EDuration="2m52.918022797s" podCreationTimestamp="2026-03-20 15:23:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:40.89309356 +0000 UTC m=+217.855609360" watchObservedRunningTime="2026-03-20 15:26:40.918022797 +0000 UTC m=+217.880538597" Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.918689 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mflh2"] Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.919333 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: 
\"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:40 crc kubenswrapper[4779]: E0320 15:26:40.919760 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:41.41974444 +0000 UTC m=+218.382260230 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.929478 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-k68xk" event={"ID":"54c0c83a-90b3-4075-9171-8af4ceb55ad2","Type":"ContainerStarted","Data":"796b2845a7f37fa9176d3195eaae0ba4b51d50d8fde07be473362938e4cfb8f8"} Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.940502 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-57vjj" event={"ID":"2fa6319a-1080-4ab8-80f3-38bde93c4b47","Type":"ContainerStarted","Data":"bc8e51d027f3d9e389b13843464a356a844224b4f6896f64a86f43bf36995e2c"} Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.944037 4779 generic.go:334] "Generic (PLEG): container finished" podID="9c3e8f05-0ca8-4324-b68d-4febc2443832" containerID="fa458ccd889af9afd42384557192d8f4efe2a969a86db40d233f03c306114c5d" exitCode=0 Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.944080 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-g25w2" event={"ID":"9c3e8f05-0ca8-4324-b68d-4febc2443832","Type":"ContainerDied","Data":"fa458ccd889af9afd42384557192d8f4efe2a969a86db40d233f03c306114c5d"} Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.949039 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-tdtkz" event={"ID":"d120d68e-0458-4755-9ca7-3a34c000842d","Type":"ContainerStarted","Data":"c10eba96d226434d44f9eb8b0c573d58189a4225e821b4f88028cfccc449cd4a"} Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.961245 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-ljzk2" event={"ID":"9940c61f-5e5d-4ffb-99e2-0ced2ccc225d","Type":"ContainerStarted","Data":"7b0c6fbffa0c1ac2b9e0e4e99911c645de181b82b8983ae0202b6fd71b23e247"} Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.961327 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-ljzk2" event={"ID":"9940c61f-5e5d-4ffb-99e2-0ced2ccc225d","Type":"ContainerStarted","Data":"fe4bdb8c96774466055a0c1581fb020939f4a1564ef069c51a2d85ec5e5cdc54"} Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.970037 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-btkk5" event={"ID":"defe5875-5311-4ccc-9360-a0e55e2ccdb9","Type":"ContainerStarted","Data":"47cad05b5540f5fb59d5ede0572ca4c861563a6804e2a03fda878f2b2f4c2713"} Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.970710 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-btkk5" Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.972003 4779 patch_prober.go:28] interesting pod/console-operator-58897d9998-btkk5 container/console-operator namespace/openshift-console-operator: 
Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.972063 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-btkk5" podUID="defe5875-5311-4ccc-9360-a0e55e2ccdb9" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.974526 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-6pz9l" event={"ID":"1db7e800-1b43-403b-8550-b8e19b4b2f8a","Type":"ContainerStarted","Data":"916d1615de5ce8cb54d7ad4731da72834761dde7f36d9f860b8384d6bf477f5a"} Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.985778 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldh2w" event={"ID":"573b21fc-852a-45a1-b9f3-690a8bbda54f","Type":"ContainerStarted","Data":"e916f4f9c091ac297edd8df2f5c64c19a972d5f1f8b93caf655a047c8bf3e524"} Mar 20 15:26:40 crc kubenswrapper[4779]: I0320 15:26:40.991574 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" event={"ID":"723d98da-617e-496c-94fe-5b49f9e8ac13","Type":"ContainerStarted","Data":"2458700043c23792a2782debf64a8858cacebfd1b0961698aabc24797f9bbbc8"} Mar 20 15:26:41 crc kubenswrapper[4779]: I0320 15:26:41.009648 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-t8tpv" event={"ID":"b91d4316-5211-4517-8193-b8b6100f21fc","Type":"ContainerStarted","Data":"5d9b6bafcb270a51b25a18a9df43030d30f3993a228670ab302fbc5670e953d4"} Mar 20 15:26:41 crc kubenswrapper[4779]: I0320 15:26:41.022056 4779 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:41 crc kubenswrapper[4779]: E0320 15:26:41.028668 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:41.528649978 +0000 UTC m=+218.491165768 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:41 crc kubenswrapper[4779]: I0320 15:26:41.041336 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qswvl" event={"ID":"fc040c1d-5241-4b62-87f5-1254db8c5a5b","Type":"ContainerStarted","Data":"3207aab1fe3e8fe3444cde4661c892948263165b0a8c6b8f803ab6a4b78eebe3"} Mar 20 15:26:41 crc kubenswrapper[4779]: I0320 15:26:41.044830 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ln9qd" event={"ID":"443dd34a-3cde-4e9b-8014-a8dacc68727b","Type":"ContainerStarted","Data":"7e93b83337f3f7b5f242181530c960db3fc22dd9c03f70b0e2b5a12f29d80dc9"} Mar 20 15:26:41 crc kubenswrapper[4779]: I0320 15:26:41.055031 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-api/machine-api-operator-5694c8668f-6mhgz" podStartSLOduration=172.055008771 podStartE2EDuration="2m52.055008771s" podCreationTimestamp="2026-03-20 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:41.053936473 +0000 UTC m=+218.016452283" watchObservedRunningTime="2026-03-20 15:26:41.055008771 +0000 UTC m=+218.017524571" Mar 20 15:26:41 crc kubenswrapper[4779]: I0320 15:26:41.066032 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-spxhw" Mar 20 15:26:41 crc kubenswrapper[4779]: I0320 15:26:41.128574 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:41 crc kubenswrapper[4779]: E0320 15:26:41.132195 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:41.63217952 +0000 UTC m=+218.594695420 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:41 crc kubenswrapper[4779]: I0320 15:26:41.179179 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m2xxf"] Mar 20 15:26:41 crc kubenswrapper[4779]: I0320 15:26:41.230786 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7h5q2"] Mar 20 15:26:41 crc kubenswrapper[4779]: I0320 15:26:41.232288 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:41 crc kubenswrapper[4779]: E0320 15:26:41.235310 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:41.735292682 +0000 UTC m=+218.697808482 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:41 crc kubenswrapper[4779]: I0320 15:26:41.262458 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567006-jlkgl"] Mar 20 15:26:41 crc kubenswrapper[4779]: W0320 15:26:41.303282 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9aa1b944_c1b5_4f49_9615_9d373dea2b29.slice/crio-cbd5659518bc18e4bc8eece7f4b9d4bcb935f8b2de1310e94899a5a8d0b9837c WatchSource:0}: Error finding container cbd5659518bc18e4bc8eece7f4b9d4bcb935f8b2de1310e94899a5a8d0b9837c: Status 404 returned error can't find the container with id cbd5659518bc18e4bc8eece7f4b9d4bcb935f8b2de1310e94899a5a8d0b9837c Mar 20 15:26:41 crc kubenswrapper[4779]: I0320 15:26:41.336878 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:41 crc kubenswrapper[4779]: E0320 15:26:41.337175 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:41.837160783 +0000 UTC m=+218.799676583 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:41 crc kubenswrapper[4779]: I0320 15:26:41.394818 4779 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 15:26:41 crc kubenswrapper[4779]: I0320 15:26:41.404156 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wfm5h"] Mar 20 15:26:41 crc kubenswrapper[4779]: I0320 15:26:41.430426 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rq8f8"] Mar 20 15:26:41 crc kubenswrapper[4779]: I0320 15:26:41.437966 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:41 crc kubenswrapper[4779]: E0320 15:26:41.438307 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:41.938292575 +0000 UTC m=+218.900808375 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:41 crc kubenswrapper[4779]: I0320 15:26:41.438396 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:41 crc kubenswrapper[4779]: E0320 15:26:41.438685 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:41.938676675 +0000 UTC m=+218.901192475 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:41 crc kubenswrapper[4779]: I0320 15:26:41.472958 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-5rp8w"] Mar 20 15:26:41 crc kubenswrapper[4779]: I0320 15:26:41.476687 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-jq2mp"] Mar 20 15:26:41 crc kubenswrapper[4779]: I0320 15:26:41.499715 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-n67zh"] Mar 20 15:26:41 crc kubenswrapper[4779]: I0320 15:26:41.523320 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-klv5b"] Mar 20 15:26:41 crc kubenswrapper[4779]: I0320 15:26:41.540127 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:41 crc kubenswrapper[4779]: E0320 15:26:41.540569 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:42.040549256 +0000 UTC m=+219.003065056 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:41 crc kubenswrapper[4779]: I0320 15:26:41.643237 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:41 crc kubenswrapper[4779]: E0320 15:26:41.643571 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:42.143554635 +0000 UTC m=+219.106070435 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:41 crc kubenswrapper[4779]: W0320 15:26:41.656745 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97a8b40a_4c2d_452f_b106_d76600e17bac.slice/crio-f0a1200948b9cc3e70ef4ab4402aaa247a793b625f2f4e506b9ee3ebe058beda WatchSource:0}: Error finding container f0a1200948b9cc3e70ef4ab4402aaa247a793b625f2f4e506b9ee3ebe058beda: Status 404 returned error can't find the container with id f0a1200948b9cc3e70ef4ab4402aaa247a793b625f2f4e506b9ee3ebe058beda Mar 20 15:26:41 crc kubenswrapper[4779]: W0320 15:26:41.686034 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd221be2b_add2_48dd_a34a_05b1faa2fe53.slice/crio-8711ad622fedf1f934d691c898d21cfd5dae8855ec3a4b405041b2accdd2f4f3 WatchSource:0}: Error finding container 8711ad622fedf1f934d691c898d21cfd5dae8855ec3a4b405041b2accdd2f4f3: Status 404 returned error can't find the container with id 8711ad622fedf1f934d691c898d21cfd5dae8855ec3a4b405041b2accdd2f4f3 Mar 20 15:26:41 crc kubenswrapper[4779]: I0320 15:26:41.716876 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h46dt" podStartSLOduration=172.716860248 podStartE2EDuration="2m52.716860248s" podCreationTimestamp="2026-03-20 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 15:26:41.715996137 +0000 UTC m=+218.678511937" watchObservedRunningTime="2026-03-20 15:26:41.716860248 +0000 UTC m=+218.679376048" Mar 20 15:26:41 crc kubenswrapper[4779]: I0320 15:26:41.727947 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566995-zptlb"] Mar 20 15:26:41 crc kubenswrapper[4779]: I0320 15:26:41.745259 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:41 crc kubenswrapper[4779]: E0320 15:26:41.745371 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:42.245352555 +0000 UTC m=+219.207868355 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:41 crc kubenswrapper[4779]: I0320 15:26:41.745691 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:41 crc kubenswrapper[4779]: E0320 15:26:41.746043 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:42.246035791 +0000 UTC m=+219.208551591 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:41 crc kubenswrapper[4779]: W0320 15:26:41.763956 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod782d0d60_1af7_40ea_bf2f_845507e0b054.slice/crio-5a41bac6c47c5c33159fa007627687964cd719380b191962f2b8d19960dbf3e9 WatchSource:0}: Error finding container 5a41bac6c47c5c33159fa007627687964cd719380b191962f2b8d19960dbf3e9: Status 404 returned error can't find the container with id 5a41bac6c47c5c33159fa007627687964cd719380b191962f2b8d19960dbf3e9 Mar 20 15:26:41 crc kubenswrapper[4779]: W0320 15:26:41.789767 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84ee4d33_4d04_496b_b95c_61db87d00cdc.slice/crio-9b747139aab949268a4d56f7231b6bb82ee5cef1b732d463d532b31144fae5a5 WatchSource:0}: Error finding container 9b747139aab949268a4d56f7231b6bb82ee5cef1b732d463d532b31144fae5a5: Status 404 returned error can't find the container with id 9b747139aab949268a4d56f7231b6bb82ee5cef1b732d463d532b31144fae5a5 Mar 20 15:26:41 crc kubenswrapper[4779]: I0320 15:26:41.848195 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:41 crc kubenswrapper[4779]: E0320 15:26:41.848462 4779 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:42.348445916 +0000 UTC m=+219.310961716 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:41 crc kubenswrapper[4779]: I0320 15:26:41.879763 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-7fhq9"] Mar 20 15:26:41 crc kubenswrapper[4779]: I0320 15:26:41.942361 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4ncn"] Mar 20 15:26:41 crc kubenswrapper[4779]: I0320 15:26:41.945177 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cwvxv" podStartSLOduration=172.945161187 podStartE2EDuration="2m52.945161187s" podCreationTimestamp="2026-03-20 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:41.923873572 +0000 UTC m=+218.886389372" watchObservedRunningTime="2026-03-20 15:26:41.945161187 +0000 UTC m=+218.907676987" Mar 20 15:26:41 crc kubenswrapper[4779]: I0320 15:26:41.950272 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:41 crc kubenswrapper[4779]: E0320 15:26:41.950557 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:42.450546422 +0000 UTC m=+219.413062222 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:41 crc kubenswrapper[4779]: I0320 15:26:41.959834 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jkl6g"] Mar 20 15:26:41 crc kubenswrapper[4779]: I0320 15:26:41.972794 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-ljzk2" podStartSLOduration=173.972776941 podStartE2EDuration="2m53.972776941s" podCreationTimestamp="2026-03-20 15:23:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:41.969822107 +0000 UTC m=+218.932337907" watchObservedRunningTime="2026-03-20 15:26:41.972776941 +0000 UTC m=+218.935292741" Mar 20 15:26:42 crc kubenswrapper[4779]: W0320 15:26:42.005268 4779 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2ffc1d7_11f7_4557_a771_99f2a90cdeac.slice/crio-d3fdfa9847978062c640275df3f223f40f02dfcb01ba7e8e1bd438a8477ed572 WatchSource:0}: Error finding container d3fdfa9847978062c640275df3f223f40f02dfcb01ba7e8e1bd438a8477ed572: Status 404 returned error can't find the container with id d3fdfa9847978062c640275df3f223f40f02dfcb01ba7e8e1bd438a8477ed572 Mar 20 15:26:42 crc kubenswrapper[4779]: W0320 15:26:42.011640 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf465ee1c_98d0_414f_b53c_4203e5bf29af.slice/crio-7e10c3a2a3b627e4342fef8041bf2219c94394132badd5aa7385e9f10671a132 WatchSource:0}: Error finding container 7e10c3a2a3b627e4342fef8041bf2219c94394132badd5aa7385e9f10671a132: Status 404 returned error can't find the container with id 7e10c3a2a3b627e4342fef8041bf2219c94394132badd5aa7385e9f10671a132 Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.017936 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-qjjfz" podStartSLOduration=173.017920376 podStartE2EDuration="2m53.017920376s" podCreationTimestamp="2026-03-20 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:42.0061203 +0000 UTC m=+218.968636100" watchObservedRunningTime="2026-03-20 15:26:42.017920376 +0000 UTC m=+218.980436176" Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.019098 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-vhpkh"] Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.051276 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:42 crc kubenswrapper[4779]: E0320 15:26:42.051397 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:42.551368287 +0000 UTC m=+219.513884087 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.051528 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:42 crc kubenswrapper[4779]: E0320 15:26:42.051821 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:42.551809709 +0000 UTC m=+219.514325509 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.053584 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6ghwq"] Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.082146 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rq8f8" event={"ID":"241498e1-0730-4ae5-afd1-4b99b66fbcf5","Type":"ContainerStarted","Data":"a5ab4016e3429b690e7026a9eeba8bdcc264cbedf4569d398911e7e0338e1f0e"} Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.107284 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7fhq9" event={"ID":"f465ee1c-98d0-414f-b53c-4203e5bf29af","Type":"ContainerStarted","Data":"7e10c3a2a3b627e4342fef8041bf2219c94394132badd5aa7385e9f10671a132"} Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.109353 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qkwpf" event={"ID":"b458c9c5-3878-42e1-995b-713f56d36b25","Type":"ContainerStarted","Data":"bc800091d5b4f91c56b3c3976690879cbbe3c1470ec61046392b065384467807"} Mar 20 15:26:42 crc kubenswrapper[4779]: W0320 15:26:42.114697 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd27a4a21_019f_4ba4_9960_7e5c96b271d0.slice/crio-5ebac45998d565240b356e63517050e8bebda9fc72d22b40e4fdebc4d0dbfc81 WatchSource:0}: Error finding 
container 5ebac45998d565240b356e63517050e8bebda9fc72d22b40e4fdebc4d0dbfc81: Status 404 returned error can't find the container with id 5ebac45998d565240b356e63517050e8bebda9fc72d22b40e4fdebc4d0dbfc81 Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.135022 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567006-jlkgl" event={"ID":"c3cf60a6-49cd-43e9-982b-4673db42fde6","Type":"ContainerStarted","Data":"b48aee491b3ed31df43eadf5a3fd0bcb550d8b60c3785c3cbf3a99fa6234ef26"} Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.135195 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6m957" podStartSLOduration=174.135177694 podStartE2EDuration="2m54.135177694s" podCreationTimestamp="2026-03-20 15:23:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:42.094808259 +0000 UTC m=+219.057324059" watchObservedRunningTime="2026-03-20 15:26:42.135177694 +0000 UTC m=+219.097693494" Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.152689 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:42 crc kubenswrapper[4779]: E0320 15:26:42.153019 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:42.653004472 +0000 UTC m=+219.615520262 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.163744 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mflh2" event={"ID":"9aa1b944-c1b5-4f49-9615-9d373dea2b29","Type":"ContainerStarted","Data":"cbd5659518bc18e4bc8eece7f4b9d4bcb935f8b2de1310e94899a5a8d0b9837c"} Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.170619 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wfm5h" event={"ID":"97a8b40a-4c2d-452f-b106-d76600e17bac","Type":"ContainerStarted","Data":"f0a1200948b9cc3e70ef4ab4402aaa247a793b625f2f4e506b9ee3ebe058beda"} Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.173535 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-57vjj" event={"ID":"2fa6319a-1080-4ab8-80f3-38bde93c4b47","Type":"ContainerStarted","Data":"b2b5df0e9d987cde9568eb52fbde2df2f40553688ca8bf12449b0a58138bb0ef"} Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.179297 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-klv5b" event={"ID":"bcbf85cd-f480-4b6c-bbcc-300d6d895cda","Type":"ContainerStarted","Data":"ceb172660c07a28463cea3a9086442cc8278ffa3cea0b8852d69e28cc500b043"} Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.182571 4779 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ln9qd" podStartSLOduration=173.182550675 podStartE2EDuration="2m53.182550675s" podCreationTimestamp="2026-03-20 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:42.17954842 +0000 UTC m=+219.142064220" watchObservedRunningTime="2026-03-20 15:26:42.182550675 +0000 UTC m=+219.145066475" Mar 20 15:26:42 crc kubenswrapper[4779]: W0320 15:26:42.186714 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc48f23d_d4c1_4392_b853_42ef3219bcbe.slice/crio-8efc3204e58d36590904603fbf0d389e2b54640f0fbaee1dceb755ba912ec95a WatchSource:0}: Error finding container 8efc3204e58d36590904603fbf0d389e2b54640f0fbaee1dceb755ba912ec95a: Status 404 returned error can't find the container with id 8efc3204e58d36590904603fbf0d389e2b54640f0fbaee1dceb755ba912ec95a Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.186903 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-tdtkz" event={"ID":"d120d68e-0458-4755-9ca7-3a34c000842d","Type":"ContainerStarted","Data":"2b20c1f80c91f785c08becae74ac514293f63aaefcaba43e31e65ea7fad42284"} Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.206975 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-btkk5" podStartSLOduration=173.206959998 podStartE2EDuration="2m53.206959998s" podCreationTimestamp="2026-03-20 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:42.206042516 +0000 UTC m=+219.168558316" watchObservedRunningTime="2026-03-20 15:26:42.206959998 +0000 UTC m=+219.169475798" Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.208547 4779 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" event={"ID":"723d98da-617e-496c-94fe-5b49f9e8ac13","Type":"ContainerStarted","Data":"3dfaa164a290c7848c2d94a71ab71b9315d019c44d58949c18d984bb18c687f4"} Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.209217 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.224261 4779 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-h48bp container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.24:6443/healthz\": dial tcp 10.217.0.24:6443: connect: connection refused" start-of-body= Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.224347 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" podUID="723d98da-617e-496c-94fe-5b49f9e8ac13" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.24:6443/healthz\": dial tcp 10.217.0.24:6443: connect: connection refused" Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.232388 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566995-zptlb" event={"ID":"84ee4d33-4d04-496b-b95c-61db87d00cdc","Type":"ContainerStarted","Data":"9b747139aab949268a4d56f7231b6bb82ee5cef1b732d463d532b31144fae5a5"} Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.237785 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-btkk5" event={"ID":"defe5875-5311-4ccc-9360-a0e55e2ccdb9","Type":"ContainerStarted","Data":"ea3877973e7572237ceff9c507701492ff13019eb941ecbcfb36e467e97471df"} Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.238824 4779 patch_prober.go:28] interesting 
pod/console-operator-58897d9998-btkk5 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.238862 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-btkk5" podUID="defe5875-5311-4ccc-9360-a0e55e2ccdb9" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.239606 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kq4bv" event={"ID":"b843a2c1-e2a0-4759-ab43-6d59cd85d1ca","Type":"ContainerStarted","Data":"19e65761af2c8596b1c828b071b0293da82290ddfa2f860efd98be9f803aa8f3"} Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.244074 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7h5q2" event={"ID":"4e8fbeee-8833-4bc8-9a9a-eaf0f5fb28ad","Type":"ContainerStarted","Data":"0ec0c2a7aa3aa5e355b5a602a68e609f65138655cc1611a3b61338fd87d1200a"} Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.249931 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-k68xk" event={"ID":"54c0c83a-90b3-4075-9171-8af4ceb55ad2","Type":"ContainerStarted","Data":"0cb8b5aee525e85215adcd3d2a1d3b8075c8e3118e6e8ea4c5d97fe1f86ebaec"} Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.254429 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:42 crc kubenswrapper[4779]: E0320 15:26:42.256611 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:42.756592066 +0000 UTC m=+219.719107956 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.274938 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-tdtkz" podStartSLOduration=173.274918457 podStartE2EDuration="2m53.274918457s" podCreationTimestamp="2026-03-20 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:42.262354531 +0000 UTC m=+219.224870331" watchObservedRunningTime="2026-03-20 15:26:42.274918457 +0000 UTC m=+219.237434257" Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.286703 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zv9zm" event={"ID":"bc1adb9e-7682-482c-8162-a9bf3acdfa5b","Type":"ContainerStarted","Data":"90496f33aa907d36c4aec72bdb4fb273fceaff54e21a7d3cd06d7d14902b562f"} Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.296813 4779 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-57vjj" podStartSLOduration=173.296787987 podStartE2EDuration="2m53.296787987s" podCreationTimestamp="2026-03-20 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:42.293297949 +0000 UTC m=+219.255813749" watchObservedRunningTime="2026-03-20 15:26:42.296787987 +0000 UTC m=+219.259303807" Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.318008 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5rp8w" event={"ID":"782d0d60-1af7-40ea-bf2f-845507e0b054","Type":"ContainerStarted","Data":"5a41bac6c47c5c33159fa007627687964cd719380b191962f2b8d19960dbf3e9"} Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.337548 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dzmcg" event={"ID":"60aeffe2-ff1d-4132-bb4e-997a7f5ea106","Type":"ContainerStarted","Data":"1176b62fadbac824c5f12681f0412e5dd890235879eafd2e0f92bbcc1c76395c"} Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.339391 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zv9zm" podStartSLOduration=173.339374227 podStartE2EDuration="2m53.339374227s" podCreationTimestamp="2026-03-20 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:42.338432613 +0000 UTC m=+219.300948413" watchObservedRunningTime="2026-03-20 15:26:42.339374227 +0000 UTC m=+219.301890027" Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.352023 4779 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-service-ca/service-ca-9c57cc56f-jq2mp" event={"ID":"d221be2b-add2-48dd-a34a-05b1faa2fe53","Type":"ContainerStarted","Data":"8711ad622fedf1f934d691c898d21cfd5dae8855ec3a4b405041b2accdd2f4f3"} Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.354574 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jkl6g" event={"ID":"c2ffc1d7-11f7-4557-a771-99f2a90cdeac","Type":"ContainerStarted","Data":"d3fdfa9847978062c640275df3f223f40f02dfcb01ba7e8e1bd438a8477ed572"} Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.359277 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:42 crc kubenswrapper[4779]: E0320 15:26:42.360301 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:42.860285603 +0000 UTC m=+219.822801403 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.384956 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-n67zh" event={"ID":"7d2d540f-2f23-476c-95ec-31e2e5385cc0","Type":"ContainerStarted","Data":"13afe986aca5cffac8e3b0684e377da07ad90746e59610446d19be5667eb8358"} Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.392212 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m2xxf" event={"ID":"75539f3c-ad23-445a-9491-12a010875fb9","Type":"ContainerStarted","Data":"f5623805ef2a3229e8457cf6fd42aafc9794557160823172b03d519b5717c248"} Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.394921 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-sl2tf" event={"ID":"163fdaa3-a29a-44bb-9da0-97b18da1c2ba","Type":"ContainerStarted","Data":"30e532bbacc4e9ca0196218949fb082b6b37f767de7a8f801f255653b206668d"} Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.398657 4779 patch_prober.go:28] interesting pod/downloads-7954f5f757-qjjfz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.398702 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qjjfz" podUID="e52093ea-9241-43c1-ae08-d9e87beed327" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.405978 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dzmcg" podStartSLOduration=173.405945701 podStartE2EDuration="2m53.405945701s" podCreationTimestamp="2026-03-20 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:42.399249353 +0000 UTC m=+219.361765153" watchObservedRunningTime="2026-03-20 15:26:42.405945701 +0000 UTC m=+219.368461501" Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.406624 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" podStartSLOduration=174.406620378 podStartE2EDuration="2m54.406620378s" podCreationTimestamp="2026-03-20 15:23:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:42.378924381 +0000 UTC m=+219.341440181" watchObservedRunningTime="2026-03-20 15:26:42.406620378 +0000 UTC m=+219.369136178" Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.460993 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:42 crc kubenswrapper[4779]: E0320 15:26:42.500943 4779 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:43.000926208 +0000 UTC m=+219.963442008 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.564609 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:42 crc kubenswrapper[4779]: E0320 15:26:42.564839 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:43.064811154 +0000 UTC m=+220.027326964 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.565095 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:42 crc kubenswrapper[4779]: E0320 15:26:42.565970 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:43.065960493 +0000 UTC m=+220.028476303 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.666461 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:42 crc kubenswrapper[4779]: E0320 15:26:42.666805 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:43.166789778 +0000 UTC m=+220.129305578 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.768396 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:42 crc kubenswrapper[4779]: E0320 15:26:42.768886 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:43.268833983 +0000 UTC m=+220.231349783 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.869947 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:42 crc kubenswrapper[4779]: E0320 15:26:42.870283 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:43.370265673 +0000 UTC m=+220.332781473 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:42 crc kubenswrapper[4779]: I0320 15:26:42.972779 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:42 crc kubenswrapper[4779]: E0320 15:26:42.973401 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:43.473389745 +0000 UTC m=+220.435905545 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.014430 4779 ???:1] "http: TLS handshake error from 192.168.126.11:54000: no serving certificate available for the kubelet" Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.073772 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:43 crc kubenswrapper[4779]: E0320 15:26:43.074279 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:43.574247371 +0000 UTC m=+220.536763171 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.091265 4779 ???:1] "http: TLS handshake error from 192.168.126.11:54012: no serving certificate available for the kubelet" Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.175683 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:43 crc kubenswrapper[4779]: E0320 15:26:43.176264 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:43.676251095 +0000 UTC m=+220.638766895 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.213727 4779 ???:1] "http: TLS handshake error from 192.168.126.11:54020: no serving certificate available for the kubelet" Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.237284 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ln9qd" Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.237502 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ln9qd" Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.279498 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:43 crc kubenswrapper[4779]: E0320 15:26:43.279637 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:43.779619144 +0000 UTC m=+220.742134944 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.279857 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:43 crc kubenswrapper[4779]: E0320 15:26:43.280216 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:43.780205308 +0000 UTC m=+220.742721108 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.294451 4779 ???:1] "http: TLS handshake error from 192.168.126.11:54024: no serving certificate available for the kubelet" Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.385657 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:43 crc kubenswrapper[4779]: E0320 15:26:43.386242 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:43.886226854 +0000 UTC m=+220.848742654 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.397607 4779 ???:1] "http: TLS handshake error from 192.168.126.11:54036: no serving certificate available for the kubelet" Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.415177 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-n67zh" event={"ID":"7d2d540f-2f23-476c-95ec-31e2e5385cc0","Type":"ContainerStarted","Data":"2da3ee55fe51e2d1ff8bae86048e249d437a8e1ef0430a502b8fd571897d837d"} Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.420822 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566995-zptlb" event={"ID":"84ee4d33-4d04-496b-b95c-61db87d00cdc","Type":"ContainerStarted","Data":"9535441a36321bb8ad0abd81aaff16dab4c8d9811e6c564e93f5d48fd6300955"} Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.430707 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6ghwq" event={"ID":"dc48f23d-d4c1-4392-b853-42ef3219bcbe","Type":"ContainerStarted","Data":"b7cdac1d85c2a3905d4b318375169b12c43470c16092df7182326a92de1f469c"} Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.430744 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6ghwq" event={"ID":"dc48f23d-d4c1-4392-b853-42ef3219bcbe","Type":"ContainerStarted","Data":"8efc3204e58d36590904603fbf0d389e2b54640f0fbaee1dceb755ba912ec95a"} Mar 20 15:26:43 crc kubenswrapper[4779]: 
I0320 15:26:43.456354 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qkwpf" event={"ID":"b458c9c5-3878-42e1-995b-713f56d36b25","Type":"ContainerStarted","Data":"66d01ed73fc5911ed1903d997f9271e89041f1a39ab7e2ab5a8997ae6e091bca"} Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.456795 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qkwpf" Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.460503 4779 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-qkwpf container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/healthz\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.460555 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-qkwpf" podUID="b458c9c5-3878-42e1-995b-713f56d36b25" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.20:8080/healthz\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.462833 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5rp8w" event={"ID":"782d0d60-1af7-40ea-bf2f-845507e0b054","Type":"ContainerStarted","Data":"8107d8a9931bc44c591f60f59c29b452251f2ef583008d228cc3394529ef9eb1"} Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.483291 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-qkwpf" podStartSLOduration=174.483271253 podStartE2EDuration="2m54.483271253s" podCreationTimestamp="2026-03-20 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:43.478541084 +0000 UTC m=+220.441056884" watchObservedRunningTime="2026-03-20 15:26:43.483271253 +0000 UTC m=+220.445787043" Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.484583 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566995-zptlb" podStartSLOduration=175.484577246 podStartE2EDuration="2m55.484577246s" podCreationTimestamp="2026-03-20 15:23:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:43.452439608 +0000 UTC m=+220.414955408" watchObservedRunningTime="2026-03-20 15:26:43.484577246 +0000 UTC m=+220.447093046" Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.487347 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:43 crc kubenswrapper[4779]: E0320 15:26:43.488204 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:43.988187707 +0000 UTC m=+220.950703507 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.489147 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldh2w" event={"ID":"573b21fc-852a-45a1-b9f3-690a8bbda54f","Type":"ContainerStarted","Data":"a412c77c73e6d5b66b99c50ed7746137fde3533136b349720fdba66481a654e5"} Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.489193 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldh2w" event={"ID":"573b21fc-852a-45a1-b9f3-690a8bbda54f","Type":"ContainerStarted","Data":"c83e42e04404422d15f4fde54180366766471b08c8640ce78d86af9045347452"} Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.509433 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qswvl" event={"ID":"fc040c1d-5241-4b62-87f5-1254db8c5a5b","Type":"ContainerStarted","Data":"692a13470c6a82c045b78c4d60f746a10238e4f86f80265cc8e1eadc5cb79042"} Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.512233 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4ncn" event={"ID":"d27a4a21-019f-4ba4-9960-7e5c96b271d0","Type":"ContainerStarted","Data":"5c3950770f5b096bdaaeec89afac3f0b07b38682f1c4d957f7abf36ea05bb46f"} Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.512260 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4ncn" event={"ID":"d27a4a21-019f-4ba4-9960-7e5c96b271d0","Type":"ContainerStarted","Data":"5ebac45998d565240b356e63517050e8bebda9fc72d22b40e4fdebc4d0dbfc81"} Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.512777 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4ncn" Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.519254 4779 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-g4ncn container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.519304 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4ncn" podUID="d27a4a21-019f-4ba4-9960-7e5c96b271d0" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.538127 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldh2w" podStartSLOduration=174.538094831 podStartE2EDuration="2m54.538094831s" podCreationTimestamp="2026-03-20 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:43.537385444 +0000 UTC m=+220.499901244" watchObservedRunningTime="2026-03-20 15:26:43.538094831 +0000 UTC m=+220.500610621" Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.557529 4779 ???:1] "http: TLS handshake error from 192.168.126.11:50296: no serving certificate available for the kubelet" Mar 20 15:26:43 crc 
kubenswrapper[4779]: I0320 15:26:43.558328 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kq4bv" event={"ID":"b843a2c1-e2a0-4759-ab43-6d59cd85d1ca","Type":"ContainerStarted","Data":"07bca3febb10637553e90fb25fe8dfc32505a1bf107e5c1fb8d4f4eed5d1ba22"} Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.558368 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kq4bv" event={"ID":"b843a2c1-e2a0-4759-ab43-6d59cd85d1ca","Type":"ContainerStarted","Data":"069a81f87944f12e36608ea277032071ea46b880d0c790edc2106db086194228"} Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.563376 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-6pz9l" event={"ID":"1db7e800-1b43-403b-8550-b8e19b4b2f8a","Type":"ContainerStarted","Data":"59853051cbb5e5b859cabea8620f8a2ac244d49160f9451efc8aa200793b8ffd"} Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.570298 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m2xxf" event={"ID":"75539f3c-ad23-445a-9491-12a010875fb9","Type":"ContainerStarted","Data":"2025f168319d808e080f409b4dd20adfedd1479a5f6719032fa085986b932e3b"} Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.575406 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wfm5h" event={"ID":"97a8b40a-4c2d-452f-b106-d76600e17bac","Type":"ContainerStarted","Data":"b474371e6430f25ba7a13a2e9ec73e31337193d4eb352394d37332479cb28224"} Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.579714 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-klv5b" 
event={"ID":"bcbf85cd-f480-4b6c-bbcc-300d6d895cda","Type":"ContainerStarted","Data":"879c4f9f8c9cc8238d64fe4bbc524ff22d2d9c4a7173efd6364c35b626df4fcc"} Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.588689 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:43 crc kubenswrapper[4779]: E0320 15:26:43.589182 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:44.089160475 +0000 UTC m=+221.051676275 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.589292 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:43 crc kubenswrapper[4779]: E0320 15:26:43.591475 4779 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:44.091461072 +0000 UTC m=+221.053977082 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.597809 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rq8f8" event={"ID":"241498e1-0730-4ae5-afd1-4b99b66fbcf5","Type":"ContainerStarted","Data":"c8705e763944d9d44a568d3c38ed7c45bf5c22cee83606a3d5cb7ae090f368ec"} Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.625971 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-jq2mp" event={"ID":"d221be2b-add2-48dd-a34a-05b1faa2fe53","Type":"ContainerStarted","Data":"33d45379710784db09d268109cec5c1c8da6b1d2f3de692172602f472701f8ec"} Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.643644 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jkl6g" event={"ID":"c2ffc1d7-11f7-4557-a771-99f2a90cdeac","Type":"ContainerStarted","Data":"f9c2e1521e340ef1d287e511a6bdf720b3af30ea24510a14b1bf0f8de2f09dcb"} Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.647892 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qswvl" podStartSLOduration=174.647876111 
podStartE2EDuration="2m54.647876111s" podCreationTimestamp="2026-03-20 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:43.598088029 +0000 UTC m=+220.560603849" watchObservedRunningTime="2026-03-20 15:26:43.647876111 +0000 UTC m=+220.610391911" Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.650003 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4ncn" podStartSLOduration=174.649994354 podStartE2EDuration="2m54.649994354s" podCreationTimestamp="2026-03-20 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:43.647551672 +0000 UTC m=+220.610067472" watchObservedRunningTime="2026-03-20 15:26:43.649994354 +0000 UTC m=+220.612510154" Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.652679 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mflh2" event={"ID":"9aa1b944-c1b5-4f49-9615-9d373dea2b29","Type":"ContainerStarted","Data":"9d79404e8bc33fd8a314029eb900849f217a56b56e5342a7a4c0240da8876921"} Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.655566 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mflh2" Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.659862 4779 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-mflh2 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:5443/healthz\": dial tcp 10.217.0.28:5443: connect: connection refused" start-of-body= Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.659921 4779 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mflh2" podUID="9aa1b944-c1b5-4f49-9615-9d373dea2b29" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.28:5443/healthz\": dial tcp 10.217.0.28:5443: connect: connection refused" Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.666761 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vhpkh" event={"ID":"89237cb2-9dc2-4900-9568-0ee404923e24","Type":"ContainerStarted","Data":"76f6f2a34d77f4cc45e5c3f654dcbef5b262523737a7e7efea3b60b33f9cf46e"} Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.666822 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vhpkh" event={"ID":"89237cb2-9dc2-4900-9568-0ee404923e24","Type":"ContainerStarted","Data":"2d4de907eac2d8015105787b59cca255461d095a85a7380247368b1fe41ab837"} Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.690896 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" event={"ID":"e4e22379-e251-4cf3-a9d5-f1c3c026eb79","Type":"ContainerStarted","Data":"4900f2655f5e78f540397070e16024bb538621a03d2f1029169f8b20e8ebd2fb"} Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.691867 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:43 crc kubenswrapper[4779]: E0320 15:26:43.693736 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 15:26:44.193718063 +0000 UTC m=+221.156233863 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.703894 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-klv5b" podStartSLOduration=174.703876799 podStartE2EDuration="2m54.703876799s" podCreationTimestamp="2026-03-20 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:43.703236252 +0000 UTC m=+220.665752062" watchObservedRunningTime="2026-03-20 15:26:43.703876799 +0000 UTC m=+220.666392599" Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.706295 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wfm5h" podStartSLOduration=174.706286429 podStartE2EDuration="2m54.706286429s" podCreationTimestamp="2026-03-20 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:43.678359567 +0000 UTC m=+220.640875377" watchObservedRunningTime="2026-03-20 15:26:43.706286429 +0000 UTC m=+220.668802229" Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.706915 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-t8tpv" 
event={"ID":"b91d4316-5211-4517-8193-b8b6100f21fc","Type":"ContainerStarted","Data":"2644251b752bdebf51784423cd21e1e20d6ed0037e0cbc7255d63cc5488b4625"} Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.719666 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dzmcg" event={"ID":"60aeffe2-ff1d-4132-bb4e-997a7f5ea106","Type":"ContainerStarted","Data":"c1f9bb02c7cbbf402c6084315decf66a39968a2cfdf535e7d69976e17a8661a0"} Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.762401 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-sl2tf" event={"ID":"163fdaa3-a29a-44bb-9da0-97b18da1c2ba","Type":"ContainerStarted","Data":"9d453b3b112f7d0d00ad6295298a2d86b52443d309d759dc573ae6dcbaf78dd2"} Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.783711 4779 ???:1] "http: TLS handshake error from 192.168.126.11:50306: no serving certificate available for the kubelet" Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.785623 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-jq2mp" podStartSLOduration=174.785603243 podStartE2EDuration="2m54.785603243s" podCreationTimestamp="2026-03-20 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:43.784125556 +0000 UTC m=+220.746641366" watchObservedRunningTime="2026-03-20 15:26:43.785603243 +0000 UTC m=+220.748119043" Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.786510 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m2xxf" podStartSLOduration=174.786503006 podStartE2EDuration="2m54.786503006s" podCreationTimestamp="2026-03-20 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:43.756366248 +0000 UTC m=+220.718882048" watchObservedRunningTime="2026-03-20 15:26:43.786503006 +0000 UTC m=+220.749018806" Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.803068 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7h5q2" event={"ID":"4e8fbeee-8833-4bc8-9a9a-eaf0f5fb28ad","Type":"ContainerStarted","Data":"03931dd782e3bbe216d6873f777eb61682273092b7d3858f4880ed4f03d2f4de"} Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.804790 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7h5q2" Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.804882 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ln9qd" Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.804939 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:43 crc kubenswrapper[4779]: E0320 15:26:43.805751 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:44.305738359 +0000 UTC m=+221.268254159 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.812381 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-jkl6g" podStartSLOduration=7.812362646 podStartE2EDuration="7.812362646s" podCreationTimestamp="2026-03-20 15:26:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:43.812080289 +0000 UTC m=+220.774596089" watchObservedRunningTime="2026-03-20 15:26:43.812362646 +0000 UTC m=+220.774878446" Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.812511 4779 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-7h5q2 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.812562 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7h5q2" podUID="4e8fbeee-8833-4bc8-9a9a-eaf0f5fb28ad" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.825498 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-k68xk" 
event={"ID":"54c0c83a-90b3-4075-9171-8af4ceb55ad2","Type":"ContainerStarted","Data":"ff91f14afb6c14579c57617780ffec9012ac05e76312d65ce425c7e1db598a46"} Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.842600 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9sgfn" event={"ID":"741b4ee7-0567-4349-8068-f10a9cd8ee68","Type":"ContainerStarted","Data":"70056f2f1dd442d607d44fcdbd6d6b07d39631f34580a428187581e669165e1e"} Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.847900 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-6pz9l" podStartSLOduration=6.847883918 podStartE2EDuration="6.847883918s" podCreationTimestamp="2026-03-20 15:26:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:43.84552169 +0000 UTC m=+220.808037490" watchObservedRunningTime="2026-03-20 15:26:43.847883918 +0000 UTC m=+220.810399718" Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.863236 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g25w2" event={"ID":"9c3e8f05-0ca8-4324-b68d-4febc2443832","Type":"ContainerStarted","Data":"033a2b6a8657458bccc2eec3885d7f4a274ca71d6b4221fc3417e667cea513e9"} Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.864032 4779 patch_prober.go:28] interesting pod/downloads-7954f5f757-qjjfz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.864124 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qjjfz" podUID="e52093ea-9241-43c1-ae08-d9e87beed327" containerName="download-server" 
probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.865023 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g25w2" Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.881333 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ln9qd" Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.889598 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kq4bv" podStartSLOduration=174.889574277 podStartE2EDuration="2m54.889574277s" podCreationTimestamp="2026-03-20 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:43.868721592 +0000 UTC m=+220.831237402" watchObservedRunningTime="2026-03-20 15:26:43.889574277 +0000 UTC m=+220.852090077" Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.906280 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:43 crc kubenswrapper[4779]: E0320 15:26:43.907544 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:44.407529168 +0000 UTC m=+221.370044968 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.951928 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" podStartSLOduration=175.951904094 podStartE2EDuration="2m55.951904094s" podCreationTimestamp="2026-03-20 15:23:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:43.949543214 +0000 UTC m=+220.912059004" watchObservedRunningTime="2026-03-20 15:26:43.951904094 +0000 UTC m=+220.914419894" Mar 20 15:26:43 crc kubenswrapper[4779]: I0320 15:26:43.973295 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vhpkh" podStartSLOduration=174.973279351 podStartE2EDuration="2m54.973279351s" podCreationTimestamp="2026-03-20 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:43.969854305 +0000 UTC m=+220.932370105" watchObservedRunningTime="2026-03-20 15:26:43.973279351 +0000 UTC m=+220.935795141" Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.003082 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-sl2tf" podStartSLOduration=175.003045529 podStartE2EDuration="2m55.003045529s" podCreationTimestamp="2026-03-20 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:44.002500446 +0000 UTC m=+220.965016256" watchObservedRunningTime="2026-03-20 15:26:44.003045529 +0000 UTC m=+220.965561329" Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.008881 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:44 crc kubenswrapper[4779]: E0320 15:26:44.009249 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:44.509234685 +0000 UTC m=+221.471750485 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.020240 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-t8tpv" Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.027736 4779 patch_prober.go:28] interesting pod/router-default-5444994796-t8tpv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 15:26:44 crc kubenswrapper[4779]: [-]has-synced failed: reason withheld Mar 20 15:26:44 crc kubenswrapper[4779]: [+]process-running ok Mar 20 15:26:44 crc kubenswrapper[4779]: healthz check failed Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.027776 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t8tpv" podUID="b91d4316-5211-4517-8193-b8b6100f21fc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.054698 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mflh2" podStartSLOduration=175.054682327 podStartE2EDuration="2m55.054682327s" podCreationTimestamp="2026-03-20 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:44.053175849 +0000 UTC m=+221.015691659" 
watchObservedRunningTime="2026-03-20 15:26:44.054682327 +0000 UTC m=+221.017198127" Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.055977 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-t8tpv" podStartSLOduration=175.05597186 podStartE2EDuration="2m55.05597186s" podCreationTimestamp="2026-03-20 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:44.02732018 +0000 UTC m=+220.989835980" watchObservedRunningTime="2026-03-20 15:26:44.05597186 +0000 UTC m=+221.018487660" Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.101254 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g25w2" podStartSLOduration=175.101233217 podStartE2EDuration="2m55.101233217s" podCreationTimestamp="2026-03-20 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:44.098743555 +0000 UTC m=+221.061259365" watchObservedRunningTime="2026-03-20 15:26:44.101233217 +0000 UTC m=+221.063749017" Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.109457 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:44 crc kubenswrapper[4779]: E0320 15:26:44.109631 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 15:26:44.609613298 +0000 UTC m=+221.572129098 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.109713 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:44 crc kubenswrapper[4779]: E0320 15:26:44.110177 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:44.610169192 +0000 UTC m=+221.572684992 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.140659 4779 ???:1] "http: TLS handshake error from 192.168.126.11:50314: no serving certificate available for the kubelet" Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.183282 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-k68xk" podStartSLOduration=175.18326758 podStartE2EDuration="2m55.18326758s" podCreationTimestamp="2026-03-20 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:44.159647556 +0000 UTC m=+221.122163356" watchObservedRunningTime="2026-03-20 15:26:44.18326758 +0000 UTC m=+221.145783380" Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.206405 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7h5q2" podStartSLOduration=175.20638569 podStartE2EDuration="2m55.20638569s" podCreationTimestamp="2026-03-20 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:44.186279135 +0000 UTC m=+221.148794935" watchObservedRunningTime="2026-03-20 15:26:44.20638569 +0000 UTC m=+221.168901490" Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.214735 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:44 crc kubenswrapper[4779]: E0320 15:26:44.215049 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:44.715033169 +0000 UTC m=+221.677548969 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.230542 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-9sgfn" podStartSLOduration=175.230525208 podStartE2EDuration="2m55.230525208s" podCreationTimestamp="2026-03-20 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:44.208838212 +0000 UTC m=+221.171354012" watchObservedRunningTime="2026-03-20 15:26:44.230525208 +0000 UTC m=+221.193041008" Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.317656 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:44 crc kubenswrapper[4779]: E0320 15:26:44.318121 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:44.818085409 +0000 UTC m=+221.780601209 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.418508 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:44 crc kubenswrapper[4779]: E0320 15:26:44.418723 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:44.918693488 +0000 UTC m=+221.881209308 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.418850 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:44 crc kubenswrapper[4779]: E0320 15:26:44.419459 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:44.919444377 +0000 UTC m=+221.881960177 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.520073 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:44 crc kubenswrapper[4779]: E0320 15:26:44.520295 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:45.020270552 +0000 UTC m=+221.982786352 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.520589 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:44 crc kubenswrapper[4779]: E0320 15:26:44.520911 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:45.020900117 +0000 UTC m=+221.983415917 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.621262 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:44 crc kubenswrapper[4779]: E0320 15:26:44.621633 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:45.121615729 +0000 UTC m=+222.084131539 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.725873 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:44 crc kubenswrapper[4779]: E0320 15:26:44.726771 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:45.226757882 +0000 UTC m=+222.189273682 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.828037 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:44 crc kubenswrapper[4779]: E0320 15:26:44.828373 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:45.328358947 +0000 UTC m=+222.290874747 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.837855 4779 ???:1] "http: TLS handshake error from 192.168.126.11:50316: no serving certificate available for the kubelet" Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.863393 4779 patch_prober.go:28] interesting pod/console-operator-58897d9998-btkk5 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.863451 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-btkk5" podUID="defe5875-5311-4ccc-9360-a0e55e2ccdb9" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.866892 4779 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-h48bp container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.24:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.866947 4779 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" podUID="723d98da-617e-496c-94fe-5b49f9e8ac13" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.24:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.872956 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5rp8w" event={"ID":"782d0d60-1af7-40ea-bf2f-845507e0b054","Type":"ContainerStarted","Data":"9bb5b3622c8818b4f9a635c462673012157d4011802278c632d73712271a900f"} Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.885817 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vhpkh" event={"ID":"89237cb2-9dc2-4900-9568-0ee404923e24","Type":"ContainerStarted","Data":"d87015530d77018c900e8ab5a96011885937959a65cda72ac0febb2fceb2f1d8"} Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.894228 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-n67zh" event={"ID":"7d2d540f-2f23-476c-95ec-31e2e5385cc0","Type":"ContainerStarted","Data":"e580ed56067ecb9c92d6bbd90f34dd9cd58b45c9644f1a8630047c9442efc6f8"} Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.894398 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-n67zh" Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.903563 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9sgfn" event={"ID":"741b4ee7-0567-4349-8068-f10a9cd8ee68","Type":"ContainerStarted","Data":"2113657f3eeaa7d214d612ff78d472e7e21633cba71a37af73db5a6289964364"} Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.918604 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rq8f8" event={"ID":"241498e1-0730-4ae5-afd1-4b99b66fbcf5","Type":"ContainerStarted","Data":"10453bc7ef507ee37f7f76cf1a4a9af824348433bc59998366979a2332241bd1"} Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.919017 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5rp8w" podStartSLOduration=175.918996414 podStartE2EDuration="2m55.918996414s" podCreationTimestamp="2026-03-20 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:44.917918898 +0000 UTC m=+221.880434698" watchObservedRunningTime="2026-03-20 15:26:44.918996414 +0000 UTC m=+221.881512214" Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.919267 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rq8f8" Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.929244 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:44 crc kubenswrapper[4779]: E0320 15:26:44.930269 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:45.430256507 +0000 UTC m=+222.392772307 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.934459 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7fhq9" event={"ID":"f465ee1c-98d0-414f-b53c-4203e5bf29af","Type":"ContainerStarted","Data":"d7af3785b29209455c3c675d6425d03e68975c08e9419f86165f9f9c874705c5"} Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.956233 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-n67zh" podStartSLOduration=8.956217690999999 podStartE2EDuration="8.956217691s" podCreationTimestamp="2026-03-20 15:26:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:44.954809165 +0000 UTC m=+221.917324965" watchObservedRunningTime="2026-03-20 15:26:44.956217691 +0000 UTC m=+221.918733491" Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.960546 4779 generic.go:334] "Generic (PLEG): container finished" podID="84ee4d33-4d04-496b-b95c-61db87d00cdc" containerID="9535441a36321bb8ad0abd81aaff16dab4c8d9811e6c564e93f5d48fd6300955" exitCode=0 Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.960641 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566995-zptlb" event={"ID":"84ee4d33-4d04-496b-b95c-61db87d00cdc","Type":"ContainerDied","Data":"9535441a36321bb8ad0abd81aaff16dab4c8d9811e6c564e93f5d48fd6300955"} Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 
15:26:44.982060 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6ghwq" event={"ID":"dc48f23d-d4c1-4392-b853-42ef3219bcbe","Type":"ContainerStarted","Data":"f11238e57e193b721e340eee9c62536a467e2068b7a0c21658ef5dfa1ac5ad01"} Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.986790 4779 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-qkwpf container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/healthz\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 20 15:26:44 crc kubenswrapper[4779]: I0320 15:26:44.986826 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-qkwpf" podUID="b458c9c5-3878-42e1-995b-713f56d36b25" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.20:8080/healthz\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 20 15:26:45 crc kubenswrapper[4779]: I0320 15:26:45.025660 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4ncn" Mar 20 15:26:45 crc kubenswrapper[4779]: I0320 15:26:45.030156 4779 patch_prober.go:28] interesting pod/router-default-5444994796-t8tpv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 15:26:45 crc kubenswrapper[4779]: [-]has-synced failed: reason withheld Mar 20 15:26:45 crc kubenswrapper[4779]: [+]process-running ok Mar 20 15:26:45 crc kubenswrapper[4779]: healthz check failed Mar 20 15:26:45 crc kubenswrapper[4779]: I0320 15:26:45.030197 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t8tpv" podUID="b91d4316-5211-4517-8193-b8b6100f21fc" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 15:26:45 crc kubenswrapper[4779]: I0320 15:26:45.030815 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:45 crc kubenswrapper[4779]: E0320 15:26:45.032694 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:45.532681233 +0000 UTC m=+222.495197033 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:45 crc kubenswrapper[4779]: I0320 15:26:45.050057 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rq8f8" podStartSLOduration=176.050041699 podStartE2EDuration="2m56.050041699s" podCreationTimestamp="2026-03-20 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:45.015442189 +0000 UTC m=+221.977957989" watchObservedRunningTime="2026-03-20 15:26:45.050041699 +0000 UTC m=+222.012557499" Mar 20 15:26:45 crc kubenswrapper[4779]: I0320 
15:26:45.067791 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7h5q2" Mar 20 15:26:45 crc kubenswrapper[4779]: I0320 15:26:45.085602 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-6ghwq" podStartSLOduration=176.085589912 podStartE2EDuration="2m56.085589912s" podCreationTimestamp="2026-03-20 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:45.084604428 +0000 UTC m=+222.047120228" watchObservedRunningTime="2026-03-20 15:26:45.085589912 +0000 UTC m=+222.048105712" Mar 20 15:26:45 crc kubenswrapper[4779]: I0320 15:26:45.141258 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:45 crc kubenswrapper[4779]: E0320 15:26:45.141587 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:45.64157222 +0000 UTC m=+222.604088020 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:45 crc kubenswrapper[4779]: I0320 15:26:45.242557 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:45 crc kubenswrapper[4779]: E0320 15:26:45.243244 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:45.743225875 +0000 UTC m=+222.705741665 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:45 crc kubenswrapper[4779]: I0320 15:26:45.325144 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" Mar 20 15:26:45 crc kubenswrapper[4779]: I0320 15:26:45.344468 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:45 crc kubenswrapper[4779]: E0320 15:26:45.344760 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:45.844748078 +0000 UTC m=+222.807263878 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:45 crc kubenswrapper[4779]: I0320 15:26:45.446096 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:45 crc kubenswrapper[4779]: E0320 15:26:45.446337 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:45.946309641 +0000 UTC m=+222.908825441 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:45 crc kubenswrapper[4779]: I0320 15:26:45.446512 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:45 crc kubenswrapper[4779]: E0320 15:26:45.446822 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:45.946809943 +0000 UTC m=+222.909325743 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:45 crc kubenswrapper[4779]: I0320 15:26:45.497123 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-spxhw"] Mar 20 15:26:45 crc kubenswrapper[4779]: I0320 15:26:45.497381 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-spxhw" podUID="b46569bb-4450-4b94-8615-e8c4a3afd495" containerName="controller-manager" containerID="cri-o://33b8d0f5c77fa808e6324ad1886105c853ba85569313c6c6ffcc6bf1c9660eeb" gracePeriod=30 Mar 20 15:26:45 crc kubenswrapper[4779]: I0320 15:26:45.537738 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-h46dt"] Mar 20 15:26:45 crc kubenswrapper[4779]: I0320 15:26:45.537932 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h46dt" podUID="57d12985-7d5e-4c20-9b2c-9790d454fc4b" containerName="route-controller-manager" containerID="cri-o://6989c6453c60f53174b0fc78917974b2d9bd4a20f72076deec02528dfb51a6ac" gracePeriod=30 Mar 20 15:26:45 crc kubenswrapper[4779]: I0320 15:26:45.547974 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:45 crc kubenswrapper[4779]: E0320 15:26:45.548230 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:46.048199192 +0000 UTC m=+223.010715002 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:45 crc kubenswrapper[4779]: I0320 15:26:45.548525 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:45 crc kubenswrapper[4779]: E0320 15:26:45.548942 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:46.04893111 +0000 UTC m=+223.011446910 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:45 crc kubenswrapper[4779]: I0320 15:26:45.649580 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:45 crc kubenswrapper[4779]: E0320 15:26:45.649986 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:46.14997167 +0000 UTC m=+223.112487460 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:45 crc kubenswrapper[4779]: I0320 15:26:45.750981 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:45 crc kubenswrapper[4779]: E0320 15:26:45.751478 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:46.251456182 +0000 UTC m=+223.213971982 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:45 crc kubenswrapper[4779]: I0320 15:26:45.834571 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mflh2" Mar 20 15:26:45 crc kubenswrapper[4779]: I0320 15:26:45.853016 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:45 crc kubenswrapper[4779]: E0320 15:26:45.853435 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:46.353416935 +0000 UTC m=+223.315932745 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:45 crc kubenswrapper[4779]: I0320 15:26:45.954844 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:45 crc kubenswrapper[4779]: E0320 15:26:45.955221 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:46.455205333 +0000 UTC m=+223.417721133 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:45 crc kubenswrapper[4779]: I0320 15:26:45.998664 4779 generic.go:334] "Generic (PLEG): container finished" podID="57d12985-7d5e-4c20-9b2c-9790d454fc4b" containerID="6989c6453c60f53174b0fc78917974b2d9bd4a20f72076deec02528dfb51a6ac" exitCode=0 Mar 20 15:26:45 crc kubenswrapper[4779]: I0320 15:26:45.998736 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h46dt" event={"ID":"57d12985-7d5e-4c20-9b2c-9790d454fc4b","Type":"ContainerDied","Data":"6989c6453c60f53174b0fc78917974b2d9bd4a20f72076deec02528dfb51a6ac"} Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.025045 4779 generic.go:334] "Generic (PLEG): container finished" podID="b46569bb-4450-4b94-8615-e8c4a3afd495" containerID="33b8d0f5c77fa808e6324ad1886105c853ba85569313c6c6ffcc6bf1c9660eeb" exitCode=0 Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.025241 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-spxhw" event={"ID":"b46569bb-4450-4b94-8615-e8c4a3afd495","Type":"ContainerDied","Data":"33b8d0f5c77fa808e6324ad1886105c853ba85569313c6c6ffcc6bf1c9660eeb"} Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.027772 4779 patch_prober.go:28] interesting pod/router-default-5444994796-t8tpv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 15:26:46 crc 
kubenswrapper[4779]: [-]has-synced failed: reason withheld Mar 20 15:26:46 crc kubenswrapper[4779]: [+]process-running ok Mar 20 15:26:46 crc kubenswrapper[4779]: healthz check failed Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.027872 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t8tpv" podUID="b91d4316-5211-4517-8193-b8b6100f21fc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.033363 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-qkwpf" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.057543 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:46 crc kubenswrapper[4779]: E0320 15:26:46.057889 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:46.557873514 +0000 UTC m=+223.520389314 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.066178 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-spxhw" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.158334 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b46569bb-4450-4b94-8615-e8c4a3afd495-proxy-ca-bundles\") pod \"b46569bb-4450-4b94-8615-e8c4a3afd495\" (UID: \"b46569bb-4450-4b94-8615-e8c4a3afd495\") " Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.158383 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b46569bb-4450-4b94-8615-e8c4a3afd495-client-ca\") pod \"b46569bb-4450-4b94-8615-e8c4a3afd495\" (UID: \"b46569bb-4450-4b94-8615-e8c4a3afd495\") " Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.158417 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gszb\" (UniqueName: \"kubernetes.io/projected/b46569bb-4450-4b94-8615-e8c4a3afd495-kube-api-access-5gszb\") pod \"b46569bb-4450-4b94-8615-e8c4a3afd495\" (UID: \"b46569bb-4450-4b94-8615-e8c4a3afd495\") " Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.158444 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b46569bb-4450-4b94-8615-e8c4a3afd495-config\") pod 
\"b46569bb-4450-4b94-8615-e8c4a3afd495\" (UID: \"b46569bb-4450-4b94-8615-e8c4a3afd495\") " Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.158461 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b46569bb-4450-4b94-8615-e8c4a3afd495-serving-cert\") pod \"b46569bb-4450-4b94-8615-e8c4a3afd495\" (UID: \"b46569bb-4450-4b94-8615-e8c4a3afd495\") " Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.158947 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.160415 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b46569bb-4450-4b94-8615-e8c4a3afd495-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b46569bb-4450-4b94-8615-e8c4a3afd495" (UID: "b46569bb-4450-4b94-8615-e8c4a3afd495"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.160799 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b46569bb-4450-4b94-8615-e8c4a3afd495-client-ca" (OuterVolumeSpecName: "client-ca") pod "b46569bb-4450-4b94-8615-e8c4a3afd495" (UID: "b46569bb-4450-4b94-8615-e8c4a3afd495"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:26:46 crc kubenswrapper[4779]: E0320 15:26:46.161071 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:46.661056429 +0000 UTC m=+223.623572219 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.164744 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b46569bb-4450-4b94-8615-e8c4a3afd495-config" (OuterVolumeSpecName: "config") pod "b46569bb-4450-4b94-8615-e8c4a3afd495" (UID: "b46569bb-4450-4b94-8615-e8c4a3afd495"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.184292 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b46569bb-4450-4b94-8615-e8c4a3afd495-kube-api-access-5gszb" (OuterVolumeSpecName: "kube-api-access-5gszb") pod "b46569bb-4450-4b94-8615-e8c4a3afd495" (UID: "b46569bb-4450-4b94-8615-e8c4a3afd495"). InnerVolumeSpecName "kube-api-access-5gszb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.184388 4779 ???:1] "http: TLS handshake error from 192.168.126.11:50322: no serving certificate available for the kubelet" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.184459 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b46569bb-4450-4b94-8615-e8c4a3afd495-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b46569bb-4450-4b94-8615-e8c4a3afd495" (UID: "b46569bb-4450-4b94-8615-e8c4a3afd495"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.216316 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g25w2" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.225774 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qtlz2"] Mar 20 15:26:46 crc kubenswrapper[4779]: E0320 15:26:46.226037 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b46569bb-4450-4b94-8615-e8c4a3afd495" containerName="controller-manager" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.226050 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="b46569bb-4450-4b94-8615-e8c4a3afd495" containerName="controller-manager" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.226166 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="b46569bb-4450-4b94-8615-e8c4a3afd495" containerName="controller-manager" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.226951 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qtlz2" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.228153 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qtlz2"] Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.237158 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.266724 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:46 crc kubenswrapper[4779]: E0320 15:26:46.266899 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:46.766875388 +0000 UTC m=+223.729391188 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.266958 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.267215 4779 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b46569bb-4450-4b94-8615-e8c4a3afd495-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.267231 4779 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b46569bb-4450-4b94-8615-e8c4a3afd495-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.267240 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gszb\" (UniqueName: \"kubernetes.io/projected/b46569bb-4450-4b94-8615-e8c4a3afd495-kube-api-access-5gszb\") on node \"crc\" DevicePath \"\"" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.267250 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b46569bb-4450-4b94-8615-e8c4a3afd495-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:26:46 crc 
kubenswrapper[4779]: I0320 15:26:46.267260 4779 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b46569bb-4450-4b94-8615-e8c4a3afd495-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 15:26:46 crc kubenswrapper[4779]: E0320 15:26:46.267501 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:46.767491954 +0000 UTC m=+223.730007764 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.298515 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h46dt" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.370521 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57d12985-7d5e-4c20-9b2c-9790d454fc4b-config\") pod \"57d12985-7d5e-4c20-9b2c-9790d454fc4b\" (UID: \"57d12985-7d5e-4c20-9b2c-9790d454fc4b\") " Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.370585 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/57d12985-7d5e-4c20-9b2c-9790d454fc4b-client-ca\") pod \"57d12985-7d5e-4c20-9b2c-9790d454fc4b\" (UID: \"57d12985-7d5e-4c20-9b2c-9790d454fc4b\") " Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.370650 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggwqn\" (UniqueName: \"kubernetes.io/projected/57d12985-7d5e-4c20-9b2c-9790d454fc4b-kube-api-access-ggwqn\") pod \"57d12985-7d5e-4c20-9b2c-9790d454fc4b\" (UID: \"57d12985-7d5e-4c20-9b2c-9790d454fc4b\") " Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.370694 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57d12985-7d5e-4c20-9b2c-9790d454fc4b-serving-cert\") pod \"57d12985-7d5e-4c20-9b2c-9790d454fc4b\" (UID: \"57d12985-7d5e-4c20-9b2c-9790d454fc4b\") " Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.370811 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.371048 4779 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f20db228-a34a-4734-ae22-53cd86de06ed-catalog-content\") pod \"community-operators-qtlz2\" (UID: \"f20db228-a34a-4734-ae22-53cd86de06ed\") " pod="openshift-marketplace/community-operators-qtlz2" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.371082 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qfpp\" (UniqueName: \"kubernetes.io/projected/f20db228-a34a-4734-ae22-53cd86de06ed-kube-api-access-7qfpp\") pod \"community-operators-qtlz2\" (UID: \"f20db228-a34a-4734-ae22-53cd86de06ed\") " pod="openshift-marketplace/community-operators-qtlz2" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.371153 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f20db228-a34a-4734-ae22-53cd86de06ed-utilities\") pod \"community-operators-qtlz2\" (UID: \"f20db228-a34a-4734-ae22-53cd86de06ed\") " pod="openshift-marketplace/community-operators-qtlz2" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.371428 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57d12985-7d5e-4c20-9b2c-9790d454fc4b-config" (OuterVolumeSpecName: "config") pod "57d12985-7d5e-4c20-9b2c-9790d454fc4b" (UID: "57d12985-7d5e-4c20-9b2c-9790d454fc4b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:26:46 crc kubenswrapper[4779]: E0320 15:26:46.371524 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:46.871509289 +0000 UTC m=+223.834025089 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.371725 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57d12985-7d5e-4c20-9b2c-9790d454fc4b-client-ca" (OuterVolumeSpecName: "client-ca") pod "57d12985-7d5e-4c20-9b2c-9790d454fc4b" (UID: "57d12985-7d5e-4c20-9b2c-9790d454fc4b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.383256 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57d12985-7d5e-4c20-9b2c-9790d454fc4b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "57d12985-7d5e-4c20-9b2c-9790d454fc4b" (UID: "57d12985-7d5e-4c20-9b2c-9790d454fc4b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.385723 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57d12985-7d5e-4c20-9b2c-9790d454fc4b-kube-api-access-ggwqn" (OuterVolumeSpecName: "kube-api-access-ggwqn") pod "57d12985-7d5e-4c20-9b2c-9790d454fc4b" (UID: "57d12985-7d5e-4c20-9b2c-9790d454fc4b"). InnerVolumeSpecName "kube-api-access-ggwqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.396395 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566995-zptlb" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.405245 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mhgrq"] Mar 20 15:26:46 crc kubenswrapper[4779]: E0320 15:26:46.405427 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57d12985-7d5e-4c20-9b2c-9790d454fc4b" containerName="route-controller-manager" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.405438 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="57d12985-7d5e-4c20-9b2c-9790d454fc4b" containerName="route-controller-manager" Mar 20 15:26:46 crc kubenswrapper[4779]: E0320 15:26:46.405455 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84ee4d33-4d04-496b-b95c-61db87d00cdc" containerName="collect-profiles" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.405462 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="84ee4d33-4d04-496b-b95c-61db87d00cdc" containerName="collect-profiles" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.405540 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="84ee4d33-4d04-496b-b95c-61db87d00cdc" containerName="collect-profiles" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.405550 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="57d12985-7d5e-4c20-9b2c-9790d454fc4b" containerName="route-controller-manager" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.406157 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mhgrq" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.410360 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.432640 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mhgrq"] Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.472464 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmvv8\" (UniqueName: \"kubernetes.io/projected/84ee4d33-4d04-496b-b95c-61db87d00cdc-kube-api-access-nmvv8\") pod \"84ee4d33-4d04-496b-b95c-61db87d00cdc\" (UID: \"84ee4d33-4d04-496b-b95c-61db87d00cdc\") " Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.472646 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/84ee4d33-4d04-496b-b95c-61db87d00cdc-secret-volume\") pod \"84ee4d33-4d04-496b-b95c-61db87d00cdc\" (UID: \"84ee4d33-4d04-496b-b95c-61db87d00cdc\") " Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.472681 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/84ee4d33-4d04-496b-b95c-61db87d00cdc-config-volume\") pod \"84ee4d33-4d04-496b-b95c-61db87d00cdc\" (UID: \"84ee4d33-4d04-496b-b95c-61db87d00cdc\") " Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.472806 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhsch\" (UniqueName: \"kubernetes.io/projected/57ad9993-5b81-4487-8f70-37e41aca1678-kube-api-access-zhsch\") pod \"certified-operators-mhgrq\" (UID: \"57ad9993-5b81-4487-8f70-37e41aca1678\") " pod="openshift-marketplace/certified-operators-mhgrq" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 
15:26:46.472864 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f20db228-a34a-4734-ae22-53cd86de06ed-catalog-content\") pod \"community-operators-qtlz2\" (UID: \"f20db228-a34a-4734-ae22-53cd86de06ed\") " pod="openshift-marketplace/community-operators-qtlz2" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.472894 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qfpp\" (UniqueName: \"kubernetes.io/projected/f20db228-a34a-4734-ae22-53cd86de06ed-kube-api-access-7qfpp\") pod \"community-operators-qtlz2\" (UID: \"f20db228-a34a-4734-ae22-53cd86de06ed\") " pod="openshift-marketplace/community-operators-qtlz2" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.472937 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.472962 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f20db228-a34a-4734-ae22-53cd86de06ed-utilities\") pod \"community-operators-qtlz2\" (UID: \"f20db228-a34a-4734-ae22-53cd86de06ed\") " pod="openshift-marketplace/community-operators-qtlz2" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.472995 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57ad9993-5b81-4487-8f70-37e41aca1678-utilities\") pod \"certified-operators-mhgrq\" (UID: \"57ad9993-5b81-4487-8f70-37e41aca1678\") " pod="openshift-marketplace/certified-operators-mhgrq" 
Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.473013 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57ad9993-5b81-4487-8f70-37e41aca1678-catalog-content\") pod \"certified-operators-mhgrq\" (UID: \"57ad9993-5b81-4487-8f70-37e41aca1678\") " pod="openshift-marketplace/certified-operators-mhgrq" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.473069 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57d12985-7d5e-4c20-9b2c-9790d454fc4b-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.473080 4779 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/57d12985-7d5e-4c20-9b2c-9790d454fc4b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.473089 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggwqn\" (UniqueName: \"kubernetes.io/projected/57d12985-7d5e-4c20-9b2c-9790d454fc4b-kube-api-access-ggwqn\") on node \"crc\" DevicePath \"\"" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.473097 4779 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57d12985-7d5e-4c20-9b2c-9790d454fc4b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.475851 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84ee4d33-4d04-496b-b95c-61db87d00cdc-kube-api-access-nmvv8" (OuterVolumeSpecName: "kube-api-access-nmvv8") pod "84ee4d33-4d04-496b-b95c-61db87d00cdc" (UID: "84ee4d33-4d04-496b-b95c-61db87d00cdc"). InnerVolumeSpecName "kube-api-access-nmvv8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.477571 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84ee4d33-4d04-496b-b95c-61db87d00cdc-config-volume" (OuterVolumeSpecName: "config-volume") pod "84ee4d33-4d04-496b-b95c-61db87d00cdc" (UID: "84ee4d33-4d04-496b-b95c-61db87d00cdc"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.478003 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f20db228-a34a-4734-ae22-53cd86de06ed-catalog-content\") pod \"community-operators-qtlz2\" (UID: \"f20db228-a34a-4734-ae22-53cd86de06ed\") " pod="openshift-marketplace/community-operators-qtlz2" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.478019 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f20db228-a34a-4734-ae22-53cd86de06ed-utilities\") pod \"community-operators-qtlz2\" (UID: \"f20db228-a34a-4734-ae22-53cd86de06ed\") " pod="openshift-marketplace/community-operators-qtlz2" Mar 20 15:26:46 crc kubenswrapper[4779]: E0320 15:26:46.478269 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:46.978257302 +0000 UTC m=+223.940773102 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.488477 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84ee4d33-4d04-496b-b95c-61db87d00cdc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "84ee4d33-4d04-496b-b95c-61db87d00cdc" (UID: "84ee4d33-4d04-496b-b95c-61db87d00cdc"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.509820 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qfpp\" (UniqueName: \"kubernetes.io/projected/f20db228-a34a-4734-ae22-53cd86de06ed-kube-api-access-7qfpp\") pod \"community-operators-qtlz2\" (UID: \"f20db228-a34a-4734-ae22-53cd86de06ed\") " pod="openshift-marketplace/community-operators-qtlz2" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.574517 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:46 crc kubenswrapper[4779]: E0320 15:26:46.574666 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 15:26:47.074646925 +0000 UTC m=+224.037162725 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.574749 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhsch\" (UniqueName: \"kubernetes.io/projected/57ad9993-5b81-4487-8f70-37e41aca1678-kube-api-access-zhsch\") pod \"certified-operators-mhgrq\" (UID: \"57ad9993-5b81-4487-8f70-37e41aca1678\") " pod="openshift-marketplace/certified-operators-mhgrq" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.574810 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.574855 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57ad9993-5b81-4487-8f70-37e41aca1678-utilities\") pod \"certified-operators-mhgrq\" (UID: \"57ad9993-5b81-4487-8f70-37e41aca1678\") " pod="openshift-marketplace/certified-operators-mhgrq" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.574874 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/57ad9993-5b81-4487-8f70-37e41aca1678-catalog-content\") pod \"certified-operators-mhgrq\" (UID: \"57ad9993-5b81-4487-8f70-37e41aca1678\") " pod="openshift-marketplace/certified-operators-mhgrq" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.574908 4779 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/84ee4d33-4d04-496b-b95c-61db87d00cdc-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.574919 4779 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/84ee4d33-4d04-496b-b95c-61db87d00cdc-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.574928 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmvv8\" (UniqueName: \"kubernetes.io/projected/84ee4d33-4d04-496b-b95c-61db87d00cdc-kube-api-access-nmvv8\") on node \"crc\" DevicePath \"\"" Mar 20 15:26:46 crc kubenswrapper[4779]: E0320 15:26:46.575254 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:26:47.07523723 +0000 UTC m=+224.037753020 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.575302 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57ad9993-5b81-4487-8f70-37e41aca1678-catalog-content\") pod \"certified-operators-mhgrq\" (UID: \"57ad9993-5b81-4487-8f70-37e41aca1678\") " pod="openshift-marketplace/certified-operators-mhgrq" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.575465 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57ad9993-5b81-4487-8f70-37e41aca1678-utilities\") pod \"certified-operators-mhgrq\" (UID: \"57ad9993-5b81-4487-8f70-37e41aca1678\") " pod="openshift-marketplace/certified-operators-mhgrq" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.585136 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qtlz2" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.610795 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhsch\" (UniqueName: \"kubernetes.io/projected/57ad9993-5b81-4487-8f70-37e41aca1678-kube-api-access-zhsch\") pod \"certified-operators-mhgrq\" (UID: \"57ad9993-5b81-4487-8f70-37e41aca1678\") " pod="openshift-marketplace/certified-operators-mhgrq" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.618565 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-khpjc"] Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.619386 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-khpjc" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.646188 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-khpjc"] Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.676072 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.676304 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l28wf\" (UniqueName: \"kubernetes.io/projected/6039db56-150f-48d0-84a3-511c6a67daad-kube-api-access-l28wf\") pod \"community-operators-khpjc\" (UID: \"6039db56-150f-48d0-84a3-511c6a67daad\") " pod="openshift-marketplace/community-operators-khpjc" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.676382 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6039db56-150f-48d0-84a3-511c6a67daad-utilities\") pod \"community-operators-khpjc\" (UID: \"6039db56-150f-48d0-84a3-511c6a67daad\") " pod="openshift-marketplace/community-operators-khpjc" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.676415 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6039db56-150f-48d0-84a3-511c6a67daad-catalog-content\") pod \"community-operators-khpjc\" (UID: \"6039db56-150f-48d0-84a3-511c6a67daad\") " pod="openshift-marketplace/community-operators-khpjc" Mar 20 15:26:46 crc kubenswrapper[4779]: E0320 15:26:46.676615 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:47.176583698 +0000 UTC m=+224.139099498 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.723850 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mhgrq" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.777997 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l28wf\" (UniqueName: \"kubernetes.io/projected/6039db56-150f-48d0-84a3-511c6a67daad-kube-api-access-l28wf\") pod \"community-operators-khpjc\" (UID: \"6039db56-150f-48d0-84a3-511c6a67daad\") " pod="openshift-marketplace/community-operators-khpjc" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.778316 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6039db56-150f-48d0-84a3-511c6a67daad-utilities\") pod \"community-operators-khpjc\" (UID: \"6039db56-150f-48d0-84a3-511c6a67daad\") " pod="openshift-marketplace/community-operators-khpjc" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.778349 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6039db56-150f-48d0-84a3-511c6a67daad-catalog-content\") pod \"community-operators-khpjc\" (UID: \"6039db56-150f-48d0-84a3-511c6a67daad\") " pod="openshift-marketplace/community-operators-khpjc" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.778377 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:46 crc kubenswrapper[4779]: E0320 15:26:46.778656 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 15:26:47.278645264 +0000 UTC m=+224.241161064 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.778769 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6039db56-150f-48d0-84a3-511c6a67daad-utilities\") pod \"community-operators-khpjc\" (UID: \"6039db56-150f-48d0-84a3-511c6a67daad\") " pod="openshift-marketplace/community-operators-khpjc" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.779060 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6039db56-150f-48d0-84a3-511c6a67daad-catalog-content\") pod \"community-operators-khpjc\" (UID: \"6039db56-150f-48d0-84a3-511c6a67daad\") " pod="openshift-marketplace/community-operators-khpjc" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.795429 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h9gg9"] Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.796960 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l28wf\" (UniqueName: \"kubernetes.io/projected/6039db56-150f-48d0-84a3-511c6a67daad-kube-api-access-l28wf\") pod \"community-operators-khpjc\" (UID: \"6039db56-150f-48d0-84a3-511c6a67daad\") " pod="openshift-marketplace/community-operators-khpjc" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.797537 4779 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h9gg9" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.804786 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h9gg9"] Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.865867 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qtlz2"] Mar 20 15:26:46 crc kubenswrapper[4779]: W0320 15:26:46.876602 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf20db228_a34a_4734_ae22_53cd86de06ed.slice/crio-dc4ff41d4082d44bab2c4bcbb918f9d70361aa6e9326164e67e076c7be0f07a4 WatchSource:0}: Error finding container dc4ff41d4082d44bab2c4bcbb918f9d70361aa6e9326164e67e076c7be0f07a4: Status 404 returned error can't find the container with id dc4ff41d4082d44bab2c4bcbb918f9d70361aa6e9326164e67e076c7be0f07a4 Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.878751 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.878963 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/626d4bed-62b8-47e0-9599-1c6f949a3d22-catalog-content\") pod \"certified-operators-h9gg9\" (UID: \"626d4bed-62b8-47e0-9599-1c6f949a3d22\") " pod="openshift-marketplace/certified-operators-h9gg9" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.879056 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jldv4\" (UniqueName: 
\"kubernetes.io/projected/626d4bed-62b8-47e0-9599-1c6f949a3d22-kube-api-access-jldv4\") pod \"certified-operators-h9gg9\" (UID: \"626d4bed-62b8-47e0-9599-1c6f949a3d22\") " pod="openshift-marketplace/certified-operators-h9gg9" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.879137 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/626d4bed-62b8-47e0-9599-1c6f949a3d22-utilities\") pod \"certified-operators-h9gg9\" (UID: \"626d4bed-62b8-47e0-9599-1c6f949a3d22\") " pod="openshift-marketplace/certified-operators-h9gg9" Mar 20 15:26:46 crc kubenswrapper[4779]: E0320 15:26:46.879239 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:26:47.379214031 +0000 UTC m=+224.341729891 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.931536 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-khpjc" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.979902 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.979954 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/626d4bed-62b8-47e0-9599-1c6f949a3d22-utilities\") pod \"certified-operators-h9gg9\" (UID: \"626d4bed-62b8-47e0-9599-1c6f949a3d22\") " pod="openshift-marketplace/certified-operators-h9gg9" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.980005 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/626d4bed-62b8-47e0-9599-1c6f949a3d22-catalog-content\") pod \"certified-operators-h9gg9\" (UID: \"626d4bed-62b8-47e0-9599-1c6f949a3d22\") " pod="openshift-marketplace/certified-operators-h9gg9" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.980096 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jldv4\" (UniqueName: \"kubernetes.io/projected/626d4bed-62b8-47e0-9599-1c6f949a3d22-kube-api-access-jldv4\") pod \"certified-operators-h9gg9\" (UID: \"626d4bed-62b8-47e0-9599-1c6f949a3d22\") " pod="openshift-marketplace/certified-operators-h9gg9" Mar 20 15:26:46 crc kubenswrapper[4779]: E0320 15:26:46.980372 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 15:26:47.480357594 +0000 UTC m=+224.442873394 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tx4fd" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.980820 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/626d4bed-62b8-47e0-9599-1c6f949a3d22-utilities\") pod \"certified-operators-h9gg9\" (UID: \"626d4bed-62b8-47e0-9599-1c6f949a3d22\") " pod="openshift-marketplace/certified-operators-h9gg9" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.980962 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/626d4bed-62b8-47e0-9599-1c6f949a3d22-catalog-content\") pod \"certified-operators-h9gg9\" (UID: \"626d4bed-62b8-47e0-9599-1c6f949a3d22\") " pod="openshift-marketplace/certified-operators-h9gg9" Mar 20 15:26:46 crc kubenswrapper[4779]: I0320 15:26:46.991577 4779 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.016087 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jldv4\" (UniqueName: \"kubernetes.io/projected/626d4bed-62b8-47e0-9599-1c6f949a3d22-kube-api-access-jldv4\") pod \"certified-operators-h9gg9\" (UID: \"626d4bed-62b8-47e0-9599-1c6f949a3d22\") " pod="openshift-marketplace/certified-operators-h9gg9" Mar 20 15:26:47 crc 
kubenswrapper[4779]: I0320 15:26:47.024801 4779 patch_prober.go:28] interesting pod/router-default-5444994796-t8tpv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 15:26:47 crc kubenswrapper[4779]: [-]has-synced failed: reason withheld Mar 20 15:26:47 crc kubenswrapper[4779]: [+]process-running ok Mar 20 15:26:47 crc kubenswrapper[4779]: healthz check failed Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.024925 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t8tpv" podUID="b91d4316-5211-4517-8193-b8b6100f21fc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.048150 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mhgrq"] Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.050630 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qtlz2" event={"ID":"f20db228-a34a-4734-ae22-53cd86de06ed","Type":"ContainerStarted","Data":"a27a9bf1aaf89b9bcffb19170b25c1f81b75a0dbf842d2da720c188a5afaf9ce"} Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.050674 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qtlz2" event={"ID":"f20db228-a34a-4734-ae22-53cd86de06ed","Type":"ContainerStarted","Data":"dc4ff41d4082d44bab2c4bcbb918f9d70361aa6e9326164e67e076c7be0f07a4"} Mar 20 15:26:47 crc kubenswrapper[4779]: W0320 15:26:47.053711 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57ad9993_5b81_4487_8f70_37e41aca1678.slice/crio-8bb1200bac34434d2da5a2465ff8408513d7170e026189d72e473f6b410a4c5d WatchSource:0}: Error finding container 
8bb1200bac34434d2da5a2465ff8408513d7170e026189d72e473f6b410a4c5d: Status 404 returned error can't find the container with id 8bb1200bac34434d2da5a2465ff8408513d7170e026189d72e473f6b410a4c5d Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.055280 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-spxhw" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.056206 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-spxhw" event={"ID":"b46569bb-4450-4b94-8615-e8c4a3afd495","Type":"ContainerDied","Data":"3c1485930d0e83e60eb1bc9d28794ee620e0f74c568d5b0145438aae016e9fca"} Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.056296 4779 scope.go:117] "RemoveContainer" containerID="33b8d0f5c77fa808e6324ad1886105c853ba85569313c6c6ffcc6bf1c9660eeb" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.060736 4779 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-20T15:26:46.991605627Z","Handler":null,"Name":""} Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.065632 4779 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.065717 4779 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.065725 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7fhq9" 
event={"ID":"f465ee1c-98d0-414f-b53c-4203e5bf29af","Type":"ContainerStarted","Data":"0d714d7dd6ee1cb290c19e42e97cffc6a22a980f41aab3f2e44ceccb77a280fe"} Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.077153 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566995-zptlb" event={"ID":"84ee4d33-4d04-496b-b95c-61db87d00cdc","Type":"ContainerDied","Data":"9b747139aab949268a4d56f7231b6bb82ee5cef1b732d463d532b31144fae5a5"} Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.077291 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b747139aab949268a4d56f7231b6bb82ee5cef1b732d463d532b31144fae5a5" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.077372 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566995-zptlb" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.084138 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h46dt" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.086927 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h46dt" event={"ID":"57d12985-7d5e-4c20-9b2c-9790d454fc4b","Type":"ContainerDied","Data":"58e74ba6a9adfca929c256366d235cb0cceebf741602481e9465f8f701b06c42"} Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.087411 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.099023 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.125447 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h9gg9" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.190221 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.196694 4779 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.196730 4779 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.232055 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-spxhw"] Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.237126 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tx4fd\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:47 
crc kubenswrapper[4779]: I0320 15:26:47.243565 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-spxhw"] Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.244656 4779 scope.go:117] "RemoveContainer" containerID="6989c6453c60f53174b0fc78917974b2d9bd4a20f72076deec02528dfb51a6ac" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.246139 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-h46dt"] Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.249205 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-h46dt"] Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.383177 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d577fc998-w645j"] Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.384016 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d577fc998-w645j" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.384968 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-54c8945948-686wn"] Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.385605 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-54c8945948-686wn" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.390801 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.391626 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.391897 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.393100 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.393326 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.393445 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.393505 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.393598 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.393616 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.393628 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 15:26:47 crc kubenswrapper[4779]: 
I0320 15:26:47.393801 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.399584 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d577fc998-w645j"] Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.407672 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.414700 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54c8945948-686wn"] Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.425691 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.440003 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h9gg9"] Mar 20 15:26:47 crc kubenswrapper[4779]: W0320 15:26:47.447077 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod626d4bed_62b8_47e0_9599_1c6f949a3d22.slice/crio-fd27923b763297b68441ad63f2aff2c9d7d858b01308f390df575ab51ac585c5 WatchSource:0}: Error finding container fd27923b763297b68441ad63f2aff2c9d7d858b01308f390df575ab51ac585c5: Status 404 returned error can't find the container with id fd27923b763297b68441ad63f2aff2c9d7d858b01308f390df575ab51ac585c5 Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.469310 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-khpjc"] Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.495909 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/603311c0-d50e-4198-a880-2f934c57c2a6-config\") pod \"route-controller-manager-7d577fc998-w645j\" (UID: \"603311c0-d50e-4198-a880-2f934c57c2a6\") " pod="openshift-route-controller-manager/route-controller-manager-7d577fc998-w645j" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.495979 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7sx5\" (UniqueName: \"kubernetes.io/projected/750be46e-8f16-4777-943c-b84bcde78811-kube-api-access-q7sx5\") pod \"controller-manager-54c8945948-686wn\" (UID: \"750be46e-8f16-4777-943c-b84bcde78811\") " pod="openshift-controller-manager/controller-manager-54c8945948-686wn" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.496000 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/603311c0-d50e-4198-a880-2f934c57c2a6-serving-cert\") pod \"route-controller-manager-7d577fc998-w645j\" (UID: \"603311c0-d50e-4198-a880-2f934c57c2a6\") " pod="openshift-route-controller-manager/route-controller-manager-7d577fc998-w645j" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.496031 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/750be46e-8f16-4777-943c-b84bcde78811-config\") pod \"controller-manager-54c8945948-686wn\" (UID: \"750be46e-8f16-4777-943c-b84bcde78811\") " pod="openshift-controller-manager/controller-manager-54c8945948-686wn" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.496067 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/750be46e-8f16-4777-943c-b84bcde78811-proxy-ca-bundles\") pod \"controller-manager-54c8945948-686wn\" (UID: \"750be46e-8f16-4777-943c-b84bcde78811\") " 
pod="openshift-controller-manager/controller-manager-54c8945948-686wn" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.496080 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/603311c0-d50e-4198-a880-2f934c57c2a6-client-ca\") pod \"route-controller-manager-7d577fc998-w645j\" (UID: \"603311c0-d50e-4198-a880-2f934c57c2a6\") " pod="openshift-route-controller-manager/route-controller-manager-7d577fc998-w645j" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.496097 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl7r6\" (UniqueName: \"kubernetes.io/projected/603311c0-d50e-4198-a880-2f934c57c2a6-kube-api-access-gl7r6\") pod \"route-controller-manager-7d577fc998-w645j\" (UID: \"603311c0-d50e-4198-a880-2f934c57c2a6\") " pod="openshift-route-controller-manager/route-controller-manager-7d577fc998-w645j" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.496139 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/750be46e-8f16-4777-943c-b84bcde78811-serving-cert\") pod \"controller-manager-54c8945948-686wn\" (UID: \"750be46e-8f16-4777-943c-b84bcde78811\") " pod="openshift-controller-manager/controller-manager-54c8945948-686wn" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.496161 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/750be46e-8f16-4777-943c-b84bcde78811-client-ca\") pod \"controller-manager-54c8945948-686wn\" (UID: \"750be46e-8f16-4777-943c-b84bcde78811\") " pod="openshift-controller-manager/controller-manager-54c8945948-686wn" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.505973 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:47 crc kubenswrapper[4779]: W0320 15:26:47.508241 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6039db56_150f_48d0_84a3_511c6a67daad.slice/crio-e4ffdf29691847c9a1697ed0a2920818c09d6ff0c415cdd2b4ee07c15ae21ee2 WatchSource:0}: Error finding container e4ffdf29691847c9a1697ed0a2920818c09d6ff0c415cdd2b4ee07c15ae21ee2: Status 404 returned error can't find the container with id e4ffdf29691847c9a1697ed0a2920818c09d6ff0c415cdd2b4ee07c15ae21ee2 Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.599562 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/750be46e-8f16-4777-943c-b84bcde78811-client-ca\") pod \"controller-manager-54c8945948-686wn\" (UID: \"750be46e-8f16-4777-943c-b84bcde78811\") " pod="openshift-controller-manager/controller-manager-54c8945948-686wn" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.599647 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/603311c0-d50e-4198-a880-2f934c57c2a6-config\") pod \"route-controller-manager-7d577fc998-w645j\" (UID: \"603311c0-d50e-4198-a880-2f934c57c2a6\") " pod="openshift-route-controller-manager/route-controller-manager-7d577fc998-w645j" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.599697 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7sx5\" (UniqueName: \"kubernetes.io/projected/750be46e-8f16-4777-943c-b84bcde78811-kube-api-access-q7sx5\") pod \"controller-manager-54c8945948-686wn\" (UID: \"750be46e-8f16-4777-943c-b84bcde78811\") " pod="openshift-controller-manager/controller-manager-54c8945948-686wn" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.599720 4779 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/603311c0-d50e-4198-a880-2f934c57c2a6-serving-cert\") pod \"route-controller-manager-7d577fc998-w645j\" (UID: \"603311c0-d50e-4198-a880-2f934c57c2a6\") " pod="openshift-route-controller-manager/route-controller-manager-7d577fc998-w645j" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.599768 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/750be46e-8f16-4777-943c-b84bcde78811-config\") pod \"controller-manager-54c8945948-686wn\" (UID: \"750be46e-8f16-4777-943c-b84bcde78811\") " pod="openshift-controller-manager/controller-manager-54c8945948-686wn" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.599823 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/750be46e-8f16-4777-943c-b84bcde78811-proxy-ca-bundles\") pod \"controller-manager-54c8945948-686wn\" (UID: \"750be46e-8f16-4777-943c-b84bcde78811\") " pod="openshift-controller-manager/controller-manager-54c8945948-686wn" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.599852 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/603311c0-d50e-4198-a880-2f934c57c2a6-client-ca\") pod \"route-controller-manager-7d577fc998-w645j\" (UID: \"603311c0-d50e-4198-a880-2f934c57c2a6\") " pod="openshift-route-controller-manager/route-controller-manager-7d577fc998-w645j" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.599878 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl7r6\" (UniqueName: \"kubernetes.io/projected/603311c0-d50e-4198-a880-2f934c57c2a6-kube-api-access-gl7r6\") pod \"route-controller-manager-7d577fc998-w645j\" (UID: \"603311c0-d50e-4198-a880-2f934c57c2a6\") " 
pod="openshift-route-controller-manager/route-controller-manager-7d577fc998-w645j" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.599917 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/750be46e-8f16-4777-943c-b84bcde78811-serving-cert\") pod \"controller-manager-54c8945948-686wn\" (UID: \"750be46e-8f16-4777-943c-b84bcde78811\") " pod="openshift-controller-manager/controller-manager-54c8945948-686wn" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.602663 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/750be46e-8f16-4777-943c-b84bcde78811-proxy-ca-bundles\") pod \"controller-manager-54c8945948-686wn\" (UID: \"750be46e-8f16-4777-943c-b84bcde78811\") " pod="openshift-controller-manager/controller-manager-54c8945948-686wn" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.602691 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/750be46e-8f16-4777-943c-b84bcde78811-config\") pod \"controller-manager-54c8945948-686wn\" (UID: \"750be46e-8f16-4777-943c-b84bcde78811\") " pod="openshift-controller-manager/controller-manager-54c8945948-686wn" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.602737 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/603311c0-d50e-4198-a880-2f934c57c2a6-client-ca\") pod \"route-controller-manager-7d577fc998-w645j\" (UID: \"603311c0-d50e-4198-a880-2f934c57c2a6\") " pod="openshift-route-controller-manager/route-controller-manager-7d577fc998-w645j" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.603066 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/750be46e-8f16-4777-943c-b84bcde78811-client-ca\") pod 
\"controller-manager-54c8945948-686wn\" (UID: \"750be46e-8f16-4777-943c-b84bcde78811\") " pod="openshift-controller-manager/controller-manager-54c8945948-686wn" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.603656 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/603311c0-d50e-4198-a880-2f934c57c2a6-config\") pod \"route-controller-manager-7d577fc998-w645j\" (UID: \"603311c0-d50e-4198-a880-2f934c57c2a6\") " pod="openshift-route-controller-manager/route-controller-manager-7d577fc998-w645j" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.610172 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/750be46e-8f16-4777-943c-b84bcde78811-serving-cert\") pod \"controller-manager-54c8945948-686wn\" (UID: \"750be46e-8f16-4777-943c-b84bcde78811\") " pod="openshift-controller-manager/controller-manager-54c8945948-686wn" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.613458 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/603311c0-d50e-4198-a880-2f934c57c2a6-serving-cert\") pod \"route-controller-manager-7d577fc998-w645j\" (UID: \"603311c0-d50e-4198-a880-2f934c57c2a6\") " pod="openshift-route-controller-manager/route-controller-manager-7d577fc998-w645j" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.619227 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7sx5\" (UniqueName: \"kubernetes.io/projected/750be46e-8f16-4777-943c-b84bcde78811-kube-api-access-q7sx5\") pod \"controller-manager-54c8945948-686wn\" (UID: \"750be46e-8f16-4777-943c-b84bcde78811\") " pod="openshift-controller-manager/controller-manager-54c8945948-686wn" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.620290 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl7r6\" 
(UniqueName: \"kubernetes.io/projected/603311c0-d50e-4198-a880-2f934c57c2a6-kube-api-access-gl7r6\") pod \"route-controller-manager-7d577fc998-w645j\" (UID: \"603311c0-d50e-4198-a880-2f934c57c2a6\") " pod="openshift-route-controller-manager/route-controller-manager-7d577fc998-w645j" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.705909 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tx4fd"] Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.712541 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d577fc998-w645j" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.732756 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54c8945948-686wn" Mar 20 15:26:47 crc kubenswrapper[4779]: W0320 15:26:47.742755 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod424710af_f3c1_4cd2_9072_ae3cd248895d.slice/crio-46f510fc45ec9fc2f4ca355d768a7029239ec5d4b898f93ac21562deb186ca33 WatchSource:0}: Error finding container 46f510fc45ec9fc2f4ca355d768a7029239ec5d4b898f93ac21562deb186ca33: Status 404 returned error can't find the container with id 46f510fc45ec9fc2f4ca355d768a7029239ec5d4b898f93ac21562deb186ca33 Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.822775 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57d12985-7d5e-4c20-9b2c-9790d454fc4b" path="/var/lib/kubelet/pods/57d12985-7d5e-4c20-9b2c-9790d454fc4b/volumes" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.823656 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 20 15:26:47 crc kubenswrapper[4779]: I0320 15:26:47.825302 
4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b46569bb-4450-4b94-8615-e8c4a3afd495" path="/var/lib/kubelet/pods/b46569bb-4450-4b94-8615-e8c4a3afd495/volumes" Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.020740 4779 patch_prober.go:28] interesting pod/router-default-5444994796-t8tpv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 15:26:48 crc kubenswrapper[4779]: [-]has-synced failed: reason withheld Mar 20 15:26:48 crc kubenswrapper[4779]: [+]process-running ok Mar 20 15:26:48 crc kubenswrapper[4779]: healthz check failed Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.020797 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t8tpv" podUID="b91d4316-5211-4517-8193-b8b6100f21fc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.091146 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" event={"ID":"424710af-f3c1-4cd2-9072-ae3cd248895d","Type":"ContainerStarted","Data":"dafe4701414127f20cd6d3454dbfafc466f734dff061dc8770414601ad5d2380"} Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.091204 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" event={"ID":"424710af-f3c1-4cd2-9072-ae3cd248895d","Type":"ContainerStarted","Data":"46f510fc45ec9fc2f4ca355d768a7029239ec5d4b898f93ac21562deb186ca33"} Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.091248 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.092566 4779 generic.go:334] "Generic (PLEG): container finished" 
podID="57ad9993-5b81-4487-8f70-37e41aca1678" containerID="c71c15c973609dfbbfa4a511d4d7ff71120b8c0e63ef06049f1bc629f4d0ab4b" exitCode=0 Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.093022 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhgrq" event={"ID":"57ad9993-5b81-4487-8f70-37e41aca1678","Type":"ContainerDied","Data":"c71c15c973609dfbbfa4a511d4d7ff71120b8c0e63ef06049f1bc629f4d0ab4b"} Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.093052 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhgrq" event={"ID":"57ad9993-5b81-4487-8f70-37e41aca1678","Type":"ContainerStarted","Data":"8bb1200bac34434d2da5a2465ff8408513d7170e026189d72e473f6b410a4c5d"} Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.096176 4779 generic.go:334] "Generic (PLEG): container finished" podID="f20db228-a34a-4734-ae22-53cd86de06ed" containerID="a27a9bf1aaf89b9bcffb19170b25c1f81b75a0dbf842d2da720c188a5afaf9ce" exitCode=0 Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.096263 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qtlz2" event={"ID":"f20db228-a34a-4734-ae22-53cd86de06ed","Type":"ContainerDied","Data":"a27a9bf1aaf89b9bcffb19170b25c1f81b75a0dbf842d2da720c188a5afaf9ce"} Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.100609 4779 generic.go:334] "Generic (PLEG): container finished" podID="626d4bed-62b8-47e0-9599-1c6f949a3d22" containerID="91b751ebdcb7f17806491b8f4cbae5674029a2a63baacac718f8e9e9cfcbf396" exitCode=0 Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.100994 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9gg9" event={"ID":"626d4bed-62b8-47e0-9599-1c6f949a3d22","Type":"ContainerDied","Data":"91b751ebdcb7f17806491b8f4cbae5674029a2a63baacac718f8e9e9cfcbf396"} Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.101054 
4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9gg9" event={"ID":"626d4bed-62b8-47e0-9599-1c6f949a3d22","Type":"ContainerStarted","Data":"fd27923b763297b68441ad63f2aff2c9d7d858b01308f390df575ab51ac585c5"} Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.113010 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" podStartSLOduration=179.112992787 podStartE2EDuration="2m59.112992787s" podCreationTimestamp="2026-03-20 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:48.10837266 +0000 UTC m=+225.070888470" watchObservedRunningTime="2026-03-20 15:26:48.112992787 +0000 UTC m=+225.075508587" Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.134528 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7fhq9" event={"ID":"f465ee1c-98d0-414f-b53c-4203e5bf29af","Type":"ContainerStarted","Data":"e9841d11dca6ca282371d369832333d3c2d785af1c3b1c66fe6a748e697f96c2"} Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.134847 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7fhq9" event={"ID":"f465ee1c-98d0-414f-b53c-4203e5bf29af","Type":"ContainerStarted","Data":"505d922c8a420c33d60b2af6e582ebea3dc277c0850e1f00dd13e703e8cbcdcb"} Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.137506 4779 generic.go:334] "Generic (PLEG): container finished" podID="6039db56-150f-48d0-84a3-511c6a67daad" containerID="2b5678ba31bc5c2bd89864ab1552b5413952a09c65a779efc6b798524c6fefce" exitCode=0 Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.137610 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khpjc" 
event={"ID":"6039db56-150f-48d0-84a3-511c6a67daad","Type":"ContainerDied","Data":"2b5678ba31bc5c2bd89864ab1552b5413952a09c65a779efc6b798524c6fefce"} Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.137634 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khpjc" event={"ID":"6039db56-150f-48d0-84a3-511c6a67daad","Type":"ContainerStarted","Data":"e4ffdf29691847c9a1697ed0a2920818c09d6ff0c415cdd2b4ee07c15ae21ee2"} Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.185826 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54c8945948-686wn"] Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.192723 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d577fc998-w645j"] Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.230590 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.231040 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-7fhq9" podStartSLOduration=11.231027874 podStartE2EDuration="11.231027874s" podCreationTimestamp="2026-03-20 15:26:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:48.230526452 +0000 UTC m=+225.193042252" watchObservedRunningTime="2026-03-20 15:26:48.231027874 +0000 UTC m=+225.193543674" Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.231789 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.248050 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.361225 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.361869 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.366482 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.369498 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.369843 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.404469 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4fphd"] Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.411616 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33dc9ae1-4170-4306-a3a9-0fb6a498bc8d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"33dc9ae1-4170-4306-a3a9-0fb6a498bc8d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.411688 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/33dc9ae1-4170-4306-a3a9-0fb6a498bc8d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"33dc9ae1-4170-4306-a3a9-0fb6a498bc8d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.412743 
4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4fphd" Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.423343 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.428987 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4fphd"] Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.512847 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s892x\" (UniqueName: \"kubernetes.io/projected/23f0a6ab-6ded-4310-82ff-98ea45fa3a43-kube-api-access-s892x\") pod \"redhat-marketplace-4fphd\" (UID: \"23f0a6ab-6ded-4310-82ff-98ea45fa3a43\") " pod="openshift-marketplace/redhat-marketplace-4fphd" Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.512955 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33dc9ae1-4170-4306-a3a9-0fb6a498bc8d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"33dc9ae1-4170-4306-a3a9-0fb6a498bc8d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.512993 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/33dc9ae1-4170-4306-a3a9-0fb6a498bc8d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"33dc9ae1-4170-4306-a3a9-0fb6a498bc8d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.513026 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23f0a6ab-6ded-4310-82ff-98ea45fa3a43-catalog-content\") pod 
\"redhat-marketplace-4fphd\" (UID: \"23f0a6ab-6ded-4310-82ff-98ea45fa3a43\") " pod="openshift-marketplace/redhat-marketplace-4fphd" Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.513078 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23f0a6ab-6ded-4310-82ff-98ea45fa3a43-utilities\") pod \"redhat-marketplace-4fphd\" (UID: \"23f0a6ab-6ded-4310-82ff-98ea45fa3a43\") " pod="openshift-marketplace/redhat-marketplace-4fphd" Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.513133 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33dc9ae1-4170-4306-a3a9-0fb6a498bc8d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"33dc9ae1-4170-4306-a3a9-0fb6a498bc8d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.542536 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/33dc9ae1-4170-4306-a3a9-0fb6a498bc8d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"33dc9ae1-4170-4306-a3a9-0fb6a498bc8d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.614059 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s892x\" (UniqueName: \"kubernetes.io/projected/23f0a6ab-6ded-4310-82ff-98ea45fa3a43-kube-api-access-s892x\") pod \"redhat-marketplace-4fphd\" (UID: \"23f0a6ab-6ded-4310-82ff-98ea45fa3a43\") " pod="openshift-marketplace/redhat-marketplace-4fphd" Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.614216 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23f0a6ab-6ded-4310-82ff-98ea45fa3a43-catalog-content\") pod \"redhat-marketplace-4fphd\" (UID: 
\"23f0a6ab-6ded-4310-82ff-98ea45fa3a43\") " pod="openshift-marketplace/redhat-marketplace-4fphd" Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.614269 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23f0a6ab-6ded-4310-82ff-98ea45fa3a43-utilities\") pod \"redhat-marketplace-4fphd\" (UID: \"23f0a6ab-6ded-4310-82ff-98ea45fa3a43\") " pod="openshift-marketplace/redhat-marketplace-4fphd" Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.614972 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23f0a6ab-6ded-4310-82ff-98ea45fa3a43-catalog-content\") pod \"redhat-marketplace-4fphd\" (UID: \"23f0a6ab-6ded-4310-82ff-98ea45fa3a43\") " pod="openshift-marketplace/redhat-marketplace-4fphd" Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.615216 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23f0a6ab-6ded-4310-82ff-98ea45fa3a43-utilities\") pod \"redhat-marketplace-4fphd\" (UID: \"23f0a6ab-6ded-4310-82ff-98ea45fa3a43\") " pod="openshift-marketplace/redhat-marketplace-4fphd" Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.632260 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s892x\" (UniqueName: \"kubernetes.io/projected/23f0a6ab-6ded-4310-82ff-98ea45fa3a43-kube-api-access-s892x\") pod \"redhat-marketplace-4fphd\" (UID: \"23f0a6ab-6ded-4310-82ff-98ea45fa3a43\") " pod="openshift-marketplace/redhat-marketplace-4fphd" Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.697947 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.742667 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4fphd" Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.789787 4779 ???:1] "http: TLS handshake error from 192.168.126.11:50336: no serving certificate available for the kubelet" Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.795988 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-25rn6"] Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.797264 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-25rn6" Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.807275 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-25rn6"] Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.918284 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpvx2\" (UniqueName: \"kubernetes.io/projected/442385a1-f11d-4412-8e24-3d6e49ec4930-kube-api-access-xpvx2\") pod \"redhat-marketplace-25rn6\" (UID: \"442385a1-f11d-4412-8e24-3d6e49ec4930\") " pod="openshift-marketplace/redhat-marketplace-25rn6" Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.918349 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/442385a1-f11d-4412-8e24-3d6e49ec4930-utilities\") pod \"redhat-marketplace-25rn6\" (UID: \"442385a1-f11d-4412-8e24-3d6e49ec4930\") " pod="openshift-marketplace/redhat-marketplace-25rn6" Mar 20 15:26:48 crc kubenswrapper[4779]: I0320 15:26:48.918483 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/442385a1-f11d-4412-8e24-3d6e49ec4930-catalog-content\") pod \"redhat-marketplace-25rn6\" (UID: \"442385a1-f11d-4412-8e24-3d6e49ec4930\") " 
pod="openshift-marketplace/redhat-marketplace-25rn6" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.019800 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/442385a1-f11d-4412-8e24-3d6e49ec4930-catalog-content\") pod \"redhat-marketplace-25rn6\" (UID: \"442385a1-f11d-4412-8e24-3d6e49ec4930\") " pod="openshift-marketplace/redhat-marketplace-25rn6" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.020079 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpvx2\" (UniqueName: \"kubernetes.io/projected/442385a1-f11d-4412-8e24-3d6e49ec4930-kube-api-access-xpvx2\") pod \"redhat-marketplace-25rn6\" (UID: \"442385a1-f11d-4412-8e24-3d6e49ec4930\") " pod="openshift-marketplace/redhat-marketplace-25rn6" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.020125 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/442385a1-f11d-4412-8e24-3d6e49ec4930-utilities\") pod \"redhat-marketplace-25rn6\" (UID: \"442385a1-f11d-4412-8e24-3d6e49ec4930\") " pod="openshift-marketplace/redhat-marketplace-25rn6" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.022521 4779 patch_prober.go:28] interesting pod/router-default-5444994796-t8tpv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 15:26:49 crc kubenswrapper[4779]: [-]has-synced failed: reason withheld Mar 20 15:26:49 crc kubenswrapper[4779]: [+]process-running ok Mar 20 15:26:49 crc kubenswrapper[4779]: healthz check failed Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.022583 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t8tpv" podUID="b91d4316-5211-4517-8193-b8b6100f21fc" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.027989 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/442385a1-f11d-4412-8e24-3d6e49ec4930-catalog-content\") pod \"redhat-marketplace-25rn6\" (UID: \"442385a1-f11d-4412-8e24-3d6e49ec4930\") " pod="openshift-marketplace/redhat-marketplace-25rn6" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.028617 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/442385a1-f11d-4412-8e24-3d6e49ec4930-utilities\") pod \"redhat-marketplace-25rn6\" (UID: \"442385a1-f11d-4412-8e24-3d6e49ec4930\") " pod="openshift-marketplace/redhat-marketplace-25rn6" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.040335 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.074251 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpvx2\" (UniqueName: \"kubernetes.io/projected/442385a1-f11d-4412-8e24-3d6e49ec4930-kube-api-access-xpvx2\") pod \"redhat-marketplace-25rn6\" (UID: \"442385a1-f11d-4412-8e24-3d6e49ec4930\") " pod="openshift-marketplace/redhat-marketplace-25rn6" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.132392 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-25rn6" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.148822 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.149486 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.153292 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.153389 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.164471 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.177015 4779 patch_prober.go:28] interesting pod/downloads-7954f5f757-qjjfz container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.177065 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-qjjfz" podUID="e52093ea-9241-43c1-ae08-d9e87beed327" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.177660 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"33dc9ae1-4170-4306-a3a9-0fb6a498bc8d","Type":"ContainerStarted","Data":"d787967ac37cb19bb19f4aec180371fdea570089f3aaeba72c5dfbe31c128259"} Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.177826 4779 patch_prober.go:28] interesting pod/downloads-7954f5f757-qjjfz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" 
start-of-body= Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.177866 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qjjfz" podUID="e52093ea-9241-43c1-ae08-d9e87beed327" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.186720 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4fphd"] Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.186769 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7d577fc998-w645j" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.186781 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d577fc998-w645j" event={"ID":"603311c0-d50e-4198-a880-2f934c57c2a6","Type":"ContainerStarted","Data":"51a438aa9944f568ac7ca951dfd63f82b0d5e260fb8da4cdddf1baa2bf74b9c0"} Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.186795 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d577fc998-w645j" event={"ID":"603311c0-d50e-4198-a880-2f934c57c2a6","Type":"ContainerStarted","Data":"3f91ee79c08139306b819f21f1628127384fb84cb1b8084c84b554fb537b65dc"} Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.201249 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7d577fc998-w645j" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.213845 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7d577fc998-w645j" podStartSLOduration=4.21382294 podStartE2EDuration="4.21382294s" 
podCreationTimestamp="2026-03-20 15:26:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:49.203270534 +0000 UTC m=+226.165786354" watchObservedRunningTime="2026-03-20 15:26:49.21382294 +0000 UTC m=+226.176338740" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.226567 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6ef690f9-d7f9-439e-b1da-46c6dd17da15-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6ef690f9-d7f9-439e-b1da-46c6dd17da15\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.226653 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6ef690f9-d7f9-439e-b1da-46c6dd17da15-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6ef690f9-d7f9-439e-b1da-46c6dd17da15\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.238550 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-btkk5" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.242618 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54c8945948-686wn" event={"ID":"750be46e-8f16-4777-943c-b84bcde78811","Type":"ContainerStarted","Data":"cea05ff4749760ae30a129102730665ce127595a86d2088e89f4129f2309804c"} Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.242665 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54c8945948-686wn" 
event={"ID":"750be46e-8f16-4777-943c-b84bcde78811","Type":"ContainerStarted","Data":"e24f8b437eb2dd1d85681eba74fd2a78e367de9485fce76078942497bc12ec82"} Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.244733 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-54c8945948-686wn" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.253325 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-54c8945948-686wn" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.258552 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-sjbmt" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.321688 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-54c8945948-686wn" podStartSLOduration=4.321664311 podStartE2EDuration="4.321664311s" podCreationTimestamp="2026-03-20 15:26:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:49.288555238 +0000 UTC m=+226.251071058" watchObservedRunningTime="2026-03-20 15:26:49.321664311 +0000 UTC m=+226.284180111" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.328607 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6ef690f9-d7f9-439e-b1da-46c6dd17da15-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6ef690f9-d7f9-439e-b1da-46c6dd17da15\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.328681 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6ef690f9-d7f9-439e-b1da-46c6dd17da15-kube-api-access\") 
pod \"revision-pruner-9-crc\" (UID: \"6ef690f9-d7f9-439e-b1da-46c6dd17da15\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.334933 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6ef690f9-d7f9-439e-b1da-46c6dd17da15-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6ef690f9-d7f9-439e-b1da-46c6dd17da15\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.360082 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6ef690f9-d7f9-439e-b1da-46c6dd17da15-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6ef690f9-d7f9-439e-b1da-46c6dd17da15\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.413672 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tqlmj"] Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.414799 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tqlmj" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.416614 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tqlmj"] Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.420858 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.483696 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.538420 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlqhf\" (UniqueName: \"kubernetes.io/projected/437c3121-8b7a-48b1-b805-540a41e89b6a-kube-api-access-jlqhf\") pod \"redhat-operators-tqlmj\" (UID: \"437c3121-8b7a-48b1-b805-540a41e89b6a\") " pod="openshift-marketplace/redhat-operators-tqlmj" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.538498 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/437c3121-8b7a-48b1-b805-540a41e89b6a-catalog-content\") pod \"redhat-operators-tqlmj\" (UID: \"437c3121-8b7a-48b1-b805-540a41e89b6a\") " pod="openshift-marketplace/redhat-operators-tqlmj" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.538571 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/437c3121-8b7a-48b1-b805-540a41e89b6a-utilities\") pod \"redhat-operators-tqlmj\" (UID: \"437c3121-8b7a-48b1-b805-540a41e89b6a\") " pod="openshift-marketplace/redhat-operators-tqlmj" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.643949 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlqhf\" (UniqueName: \"kubernetes.io/projected/437c3121-8b7a-48b1-b805-540a41e89b6a-kube-api-access-jlqhf\") pod \"redhat-operators-tqlmj\" (UID: \"437c3121-8b7a-48b1-b805-540a41e89b6a\") " pod="openshift-marketplace/redhat-operators-tqlmj" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.644030 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/437c3121-8b7a-48b1-b805-540a41e89b6a-catalog-content\") pod 
\"redhat-operators-tqlmj\" (UID: \"437c3121-8b7a-48b1-b805-540a41e89b6a\") " pod="openshift-marketplace/redhat-operators-tqlmj" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.644171 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/437c3121-8b7a-48b1-b805-540a41e89b6a-utilities\") pod \"redhat-operators-tqlmj\" (UID: \"437c3121-8b7a-48b1-b805-540a41e89b6a\") " pod="openshift-marketplace/redhat-operators-tqlmj" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.644920 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/437c3121-8b7a-48b1-b805-540a41e89b6a-catalog-content\") pod \"redhat-operators-tqlmj\" (UID: \"437c3121-8b7a-48b1-b805-540a41e89b6a\") " pod="openshift-marketplace/redhat-operators-tqlmj" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.645065 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/437c3121-8b7a-48b1-b805-540a41e89b6a-utilities\") pod \"redhat-operators-tqlmj\" (UID: \"437c3121-8b7a-48b1-b805-540a41e89b6a\") " pod="openshift-marketplace/redhat-operators-tqlmj" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.663344 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlqhf\" (UniqueName: \"kubernetes.io/projected/437c3121-8b7a-48b1-b805-540a41e89b6a-kube-api-access-jlqhf\") pod \"redhat-operators-tqlmj\" (UID: \"437c3121-8b7a-48b1-b805-540a41e89b6a\") " pod="openshift-marketplace/redhat-operators-tqlmj" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.756784 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tqlmj" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.803799 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vmgpk"] Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.805942 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vmgpk" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.846597 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc7xm\" (UniqueName: \"kubernetes.io/projected/cef14e53-c107-4bb0-85bc-40b8189fd346-kube-api-access-vc7xm\") pod \"redhat-operators-vmgpk\" (UID: \"cef14e53-c107-4bb0-85bc-40b8189fd346\") " pod="openshift-marketplace/redhat-operators-vmgpk" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.846650 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cef14e53-c107-4bb0-85bc-40b8189fd346-catalog-content\") pod \"redhat-operators-vmgpk\" (UID: \"cef14e53-c107-4bb0-85bc-40b8189fd346\") " pod="openshift-marketplace/redhat-operators-vmgpk" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.846736 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cef14e53-c107-4bb0-85bc-40b8189fd346-utilities\") pod \"redhat-operators-vmgpk\" (UID: \"cef14e53-c107-4bb0-85bc-40b8189fd346\") " pod="openshift-marketplace/redhat-operators-vmgpk" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.849321 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-25rn6"] Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.849347 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-operators-vmgpk"] Mar 20 15:26:49 crc kubenswrapper[4779]: W0320 15:26:49.859358 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod442385a1_f11d_4412_8e24_3d6e49ec4930.slice/crio-60a4ba918c31500ddc9916c058243c44c65e1386ef74121fc9f702fb8453c9d6 WatchSource:0}: Error finding container 60a4ba918c31500ddc9916c058243c44c65e1386ef74121fc9f702fb8453c9d6: Status 404 returned error can't find the container with id 60a4ba918c31500ddc9916c058243c44c65e1386ef74121fc9f702fb8453c9d6 Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.945719 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-sl2tf" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.945814 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-sl2tf" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.947383 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc7xm\" (UniqueName: \"kubernetes.io/projected/cef14e53-c107-4bb0-85bc-40b8189fd346-kube-api-access-vc7xm\") pod \"redhat-operators-vmgpk\" (UID: \"cef14e53-c107-4bb0-85bc-40b8189fd346\") " pod="openshift-marketplace/redhat-operators-vmgpk" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.947433 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cef14e53-c107-4bb0-85bc-40b8189fd346-catalog-content\") pod \"redhat-operators-vmgpk\" (UID: \"cef14e53-c107-4bb0-85bc-40b8189fd346\") " pod="openshift-marketplace/redhat-operators-vmgpk" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.947521 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/cef14e53-c107-4bb0-85bc-40b8189fd346-utilities\") pod \"redhat-operators-vmgpk\" (UID: \"cef14e53-c107-4bb0-85bc-40b8189fd346\") " pod="openshift-marketplace/redhat-operators-vmgpk" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.947915 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cef14e53-c107-4bb0-85bc-40b8189fd346-utilities\") pod \"redhat-operators-vmgpk\" (UID: \"cef14e53-c107-4bb0-85bc-40b8189fd346\") " pod="openshift-marketplace/redhat-operators-vmgpk" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.948614 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cef14e53-c107-4bb0-85bc-40b8189fd346-catalog-content\") pod \"redhat-operators-vmgpk\" (UID: \"cef14e53-c107-4bb0-85bc-40b8189fd346\") " pod="openshift-marketplace/redhat-operators-vmgpk" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.948845 4779 patch_prober.go:28] interesting pod/console-f9d7485db-sl2tf container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.948900 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-sl2tf" podUID="163fdaa3-a29a-44bb-9da0-97b18da1c2ba" containerName="console" probeResult="failure" output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused" Mar 20 15:26:49 crc kubenswrapper[4779]: I0320 15:26:49.983089 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc7xm\" (UniqueName: \"kubernetes.io/projected/cef14e53-c107-4bb0-85bc-40b8189fd346-kube-api-access-vc7xm\") pod \"redhat-operators-vmgpk\" (UID: \"cef14e53-c107-4bb0-85bc-40b8189fd346\") " 
pod="openshift-marketplace/redhat-operators-vmgpk" Mar 20 15:26:50 crc kubenswrapper[4779]: I0320 15:26:50.018162 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-t8tpv" Mar 20 15:26:50 crc kubenswrapper[4779]: I0320 15:26:50.023067 4779 patch_prober.go:28] interesting pod/router-default-5444994796-t8tpv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 15:26:50 crc kubenswrapper[4779]: [-]has-synced failed: reason withheld Mar 20 15:26:50 crc kubenswrapper[4779]: [+]process-running ok Mar 20 15:26:50 crc kubenswrapper[4779]: healthz check failed Mar 20 15:26:50 crc kubenswrapper[4779]: I0320 15:26:50.023150 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t8tpv" podUID="b91d4316-5211-4517-8193-b8b6100f21fc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 15:26:50 crc kubenswrapper[4779]: I0320 15:26:50.118031 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 15:26:50 crc kubenswrapper[4779]: I0320 15:26:50.148100 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vmgpk" Mar 20 15:26:50 crc kubenswrapper[4779]: I0320 15:26:50.286889 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"33dc9ae1-4170-4306-a3a9-0fb6a498bc8d","Type":"ContainerStarted","Data":"5c37b9addf9e188e1933b6b340014a0c48091cbeba94e30d57a519bbc580d2d8"} Mar 20 15:26:50 crc kubenswrapper[4779]: I0320 15:26:50.294346 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"6ef690f9-d7f9-439e-b1da-46c6dd17da15","Type":"ContainerStarted","Data":"ea53f2c7c524a04012bc060f8be7b035458f56168c188f844da650ea4434f16d"} Mar 20 15:26:50 crc kubenswrapper[4779]: I0320 15:26:50.297187 4779 generic.go:334] "Generic (PLEG): container finished" podID="23f0a6ab-6ded-4310-82ff-98ea45fa3a43" containerID="89020eebdeec3186af339fd2e72ea11348dc3b7f02ed93e64be9c22b245bcf06" exitCode=0 Mar 20 15:26:50 crc kubenswrapper[4779]: I0320 15:26:50.297254 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4fphd" event={"ID":"23f0a6ab-6ded-4310-82ff-98ea45fa3a43","Type":"ContainerDied","Data":"89020eebdeec3186af339fd2e72ea11348dc3b7f02ed93e64be9c22b245bcf06"} Mar 20 15:26:50 crc kubenswrapper[4779]: I0320 15:26:50.297279 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4fphd" event={"ID":"23f0a6ab-6ded-4310-82ff-98ea45fa3a43","Type":"ContainerStarted","Data":"89b0cef7d5f67595008027bdd285a4ef0304697e00e955ca1b4ea04e08a67f28"} Mar 20 15:26:50 crc kubenswrapper[4779]: I0320 15:26:50.320475 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-25rn6" event={"ID":"442385a1-f11d-4412-8e24-3d6e49ec4930","Type":"ContainerStarted","Data":"c36123db3f911a3069b049118d2a6e1cb86d6c6cad459462a5993490ad630459"} Mar 20 15:26:50 crc kubenswrapper[4779]: I0320 
15:26:50.320510 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-25rn6" event={"ID":"442385a1-f11d-4412-8e24-3d6e49ec4930","Type":"ContainerStarted","Data":"60a4ba918c31500ddc9916c058243c44c65e1386ef74121fc9f702fb8453c9d6"} Mar 20 15:26:50 crc kubenswrapper[4779]: I0320 15:26:50.415757 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.415741894 podStartE2EDuration="2.415741894s" podCreationTimestamp="2026-03-20 15:26:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:26:50.308525009 +0000 UTC m=+227.271040829" watchObservedRunningTime="2026-03-20 15:26:50.415741894 +0000 UTC m=+227.378257684" Mar 20 15:26:50 crc kubenswrapper[4779]: I0320 15:26:50.454630 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tqlmj"] Mar 20 15:26:50 crc kubenswrapper[4779]: I0320 15:26:50.764370 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vmgpk"] Mar 20 15:26:51 crc kubenswrapper[4779]: I0320 15:26:51.002732 4779 ???:1] "http: TLS handshake error from 192.168.126.11:50338: no serving certificate available for the kubelet" Mar 20 15:26:51 crc kubenswrapper[4779]: I0320 15:26:51.026593 4779 patch_prober.go:28] interesting pod/router-default-5444994796-t8tpv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 15:26:51 crc kubenswrapper[4779]: [-]has-synced failed: reason withheld Mar 20 15:26:51 crc kubenswrapper[4779]: [+]process-running ok Mar 20 15:26:51 crc kubenswrapper[4779]: healthz check failed Mar 20 15:26:51 crc kubenswrapper[4779]: I0320 15:26:51.026665 4779 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-5444994796-t8tpv" podUID="b91d4316-5211-4517-8193-b8b6100f21fc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 15:26:51 crc kubenswrapper[4779]: I0320 15:26:51.331229 4779 generic.go:334] "Generic (PLEG): container finished" podID="442385a1-f11d-4412-8e24-3d6e49ec4930" containerID="c36123db3f911a3069b049118d2a6e1cb86d6c6cad459462a5993490ad630459" exitCode=0 Mar 20 15:26:51 crc kubenswrapper[4779]: I0320 15:26:51.331293 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-25rn6" event={"ID":"442385a1-f11d-4412-8e24-3d6e49ec4930","Type":"ContainerDied","Data":"c36123db3f911a3069b049118d2a6e1cb86d6c6cad459462a5993490ad630459"} Mar 20 15:26:51 crc kubenswrapper[4779]: I0320 15:26:51.335773 4779 generic.go:334] "Generic (PLEG): container finished" podID="33dc9ae1-4170-4306-a3a9-0fb6a498bc8d" containerID="5c37b9addf9e188e1933b6b340014a0c48091cbeba94e30d57a519bbc580d2d8" exitCode=0 Mar 20 15:26:51 crc kubenswrapper[4779]: I0320 15:26:51.335849 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"33dc9ae1-4170-4306-a3a9-0fb6a498bc8d","Type":"ContainerDied","Data":"5c37b9addf9e188e1933b6b340014a0c48091cbeba94e30d57a519bbc580d2d8"} Mar 20 15:26:51 crc kubenswrapper[4779]: I0320 15:26:51.338419 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"6ef690f9-d7f9-439e-b1da-46c6dd17da15","Type":"ContainerStarted","Data":"885b53b4018ce5ed81f19421b72a61e21012fdc83021c5c23d6a250ee2f09a36"} Mar 20 15:26:52 crc kubenswrapper[4779]: I0320 15:26:52.021190 4779 patch_prober.go:28] interesting pod/router-default-5444994796-t8tpv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: 
reason withheld Mar 20 15:26:52 crc kubenswrapper[4779]: [+]has-synced ok Mar 20 15:26:52 crc kubenswrapper[4779]: [+]process-running ok Mar 20 15:26:52 crc kubenswrapper[4779]: healthz check failed Mar 20 15:26:52 crc kubenswrapper[4779]: I0320 15:26:52.021284 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t8tpv" podUID="b91d4316-5211-4517-8193-b8b6100f21fc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 15:26:52 crc kubenswrapper[4779]: I0320 15:26:52.394950 4779 generic.go:334] "Generic (PLEG): container finished" podID="6ef690f9-d7f9-439e-b1da-46c6dd17da15" containerID="885b53b4018ce5ed81f19421b72a61e21012fdc83021c5c23d6a250ee2f09a36" exitCode=0 Mar 20 15:26:52 crc kubenswrapper[4779]: I0320 15:26:52.395094 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"6ef690f9-d7f9-439e-b1da-46c6dd17da15","Type":"ContainerDied","Data":"885b53b4018ce5ed81f19421b72a61e21012fdc83021c5c23d6a250ee2f09a36"} Mar 20 15:26:53 crc kubenswrapper[4779]: I0320 15:26:53.030040 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-t8tpv" Mar 20 15:26:53 crc kubenswrapper[4779]: I0320 15:26:53.035010 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-t8tpv" Mar 20 15:26:53 crc kubenswrapper[4779]: I0320 15:26:53.940703 4779 ???:1] "http: TLS handshake error from 192.168.126.11:36960: no serving certificate available for the kubelet" Mar 20 15:26:55 crc kubenswrapper[4779]: I0320 15:26:55.154429 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Mar 20 15:26:55 crc kubenswrapper[4779]: I0320 15:26:55.154477 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:26:55 crc kubenswrapper[4779]: I0320 15:26:55.197454 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:26:55 crc kubenswrapper[4779]: I0320 15:26:55.205382 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-n67zh" Mar 20 15:26:58 crc kubenswrapper[4779]: W0320 15:26:58.561381 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcef14e53_c107_4bb0_85bc_40b8189fd346.slice/crio-6d01c127b86ab0150c7d3704c209c5d86a1f19658ab2a86666da82a803d4e164 WatchSource:0}: Error finding container 6d01c127b86ab0150c7d3704c209c5d86a1f19658ab2a86666da82a803d4e164: Status 404 returned error can't find the container with id 6d01c127b86ab0150c7d3704c209c5d86a1f19658ab2a86666da82a803d4e164 Mar 20 15:26:58 crc kubenswrapper[4779]: W0320 15:26:58.578297 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod437c3121_8b7a_48b1_b805_540a41e89b6a.slice/crio-5f5d53b386641d115a5646c5216a4aa9076519ffef47dec0b3690788b1312e77 WatchSource:0}: Error finding container 5f5d53b386641d115a5646c5216a4aa9076519ffef47dec0b3690788b1312e77: Status 404 returned error can't find the container with id 5f5d53b386641d115a5646c5216a4aa9076519ffef47dec0b3690788b1312e77 Mar 20 15:26:58 crc kubenswrapper[4779]: I0320 15:26:58.622374 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 15:26:58 crc kubenswrapper[4779]: I0320 15:26:58.626022 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 15:26:58 crc kubenswrapper[4779]: I0320 15:26:58.725969 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33dc9ae1-4170-4306-a3a9-0fb6a498bc8d-kubelet-dir\") pod \"33dc9ae1-4170-4306-a3a9-0fb6a498bc8d\" (UID: \"33dc9ae1-4170-4306-a3a9-0fb6a498bc8d\") " Mar 20 15:26:58 crc kubenswrapper[4779]: I0320 15:26:58.726036 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6ef690f9-d7f9-439e-b1da-46c6dd17da15-kube-api-access\") pod \"6ef690f9-d7f9-439e-b1da-46c6dd17da15\" (UID: \"6ef690f9-d7f9-439e-b1da-46c6dd17da15\") " Mar 20 15:26:58 crc kubenswrapper[4779]: I0320 15:26:58.726078 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/33dc9ae1-4170-4306-a3a9-0fb6a498bc8d-kube-api-access\") pod \"33dc9ae1-4170-4306-a3a9-0fb6a498bc8d\" (UID: \"33dc9ae1-4170-4306-a3a9-0fb6a498bc8d\") " Mar 20 15:26:58 crc kubenswrapper[4779]: I0320 15:26:58.726157 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6ef690f9-d7f9-439e-b1da-46c6dd17da15-kubelet-dir\") pod \"6ef690f9-d7f9-439e-b1da-46c6dd17da15\" (UID: \"6ef690f9-d7f9-439e-b1da-46c6dd17da15\") " Mar 20 15:26:58 crc kubenswrapper[4779]: I0320 15:26:58.726783 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33dc9ae1-4170-4306-a3a9-0fb6a498bc8d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "33dc9ae1-4170-4306-a3a9-0fb6a498bc8d" (UID: 
"33dc9ae1-4170-4306-a3a9-0fb6a498bc8d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:26:58 crc kubenswrapper[4779]: I0320 15:26:58.726885 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ef690f9-d7f9-439e-b1da-46c6dd17da15-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6ef690f9-d7f9-439e-b1da-46c6dd17da15" (UID: "6ef690f9-d7f9-439e-b1da-46c6dd17da15"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:26:58 crc kubenswrapper[4779]: I0320 15:26:58.727486 4779 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33dc9ae1-4170-4306-a3a9-0fb6a498bc8d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 15:26:58 crc kubenswrapper[4779]: I0320 15:26:58.727527 4779 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6ef690f9-d7f9-439e-b1da-46c6dd17da15-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 15:26:58 crc kubenswrapper[4779]: I0320 15:26:58.731482 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33dc9ae1-4170-4306-a3a9-0fb6a498bc8d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "33dc9ae1-4170-4306-a3a9-0fb6a498bc8d" (UID: "33dc9ae1-4170-4306-a3a9-0fb6a498bc8d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:26:58 crc kubenswrapper[4779]: I0320 15:26:58.731863 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ef690f9-d7f9-439e-b1da-46c6dd17da15-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6ef690f9-d7f9-439e-b1da-46c6dd17da15" (UID: "6ef690f9-d7f9-439e-b1da-46c6dd17da15"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:26:58 crc kubenswrapper[4779]: I0320 15:26:58.828501 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6ef690f9-d7f9-439e-b1da-46c6dd17da15-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 15:26:58 crc kubenswrapper[4779]: I0320 15:26:58.828816 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/33dc9ae1-4170-4306-a3a9-0fb6a498bc8d-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 15:26:59 crc kubenswrapper[4779]: I0320 15:26:59.194489 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-qjjfz" Mar 20 15:26:59 crc kubenswrapper[4779]: I0320 15:26:59.453344 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tqlmj" event={"ID":"437c3121-8b7a-48b1-b805-540a41e89b6a","Type":"ContainerStarted","Data":"5f5d53b386641d115a5646c5216a4aa9076519ffef47dec0b3690788b1312e77"} Mar 20 15:26:59 crc kubenswrapper[4779]: I0320 15:26:59.455766 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"33dc9ae1-4170-4306-a3a9-0fb6a498bc8d","Type":"ContainerDied","Data":"d787967ac37cb19bb19f4aec180371fdea570089f3aaeba72c5dfbe31c128259"} Mar 20 15:26:59 crc kubenswrapper[4779]: I0320 15:26:59.455791 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d787967ac37cb19bb19f4aec180371fdea570089f3aaeba72c5dfbe31c128259" Mar 20 15:26:59 crc kubenswrapper[4779]: I0320 15:26:59.455861 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 15:26:59 crc kubenswrapper[4779]: I0320 15:26:59.462780 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"6ef690f9-d7f9-439e-b1da-46c6dd17da15","Type":"ContainerDied","Data":"ea53f2c7c524a04012bc060f8be7b035458f56168c188f844da650ea4434f16d"} Mar 20 15:26:59 crc kubenswrapper[4779]: I0320 15:26:59.462811 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea53f2c7c524a04012bc060f8be7b035458f56168c188f844da650ea4434f16d" Mar 20 15:26:59 crc kubenswrapper[4779]: I0320 15:26:59.462870 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 15:26:59 crc kubenswrapper[4779]: I0320 15:26:59.464062 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vmgpk" event={"ID":"cef14e53-c107-4bb0-85bc-40b8189fd346","Type":"ContainerStarted","Data":"6d01c127b86ab0150c7d3704c209c5d86a1f19658ab2a86666da82a803d4e164"} Mar 20 15:26:59 crc kubenswrapper[4779]: I0320 15:26:59.954266 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-sl2tf" Mar 20 15:26:59 crc kubenswrapper[4779]: I0320 15:26:59.961090 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-sl2tf" Mar 20 15:27:04 crc kubenswrapper[4779]: E0320 15:27:04.374684 4779 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 20 15:27:04 crc kubenswrapper[4779]: E0320 15:27:04.375359 4779 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 15:27:04 crc kubenswrapper[4779]: container 
&Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 20 15:27:04 crc kubenswrapper[4779]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4j5kn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29567006-jlkgl_openshift-infra(c3cf60a6-49cd-43e9-982b-4673db42fde6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 20 15:27:04 crc kubenswrapper[4779]: > logger="UnhandledError" Mar 20 15:27:04 crc kubenswrapper[4779]: E0320 15:27:04.376560 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29567006-jlkgl" podUID="c3cf60a6-49cd-43e9-982b-4673db42fde6" Mar 20 15:27:04 crc kubenswrapper[4779]: E0320 15:27:04.496234 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" 
pod="openshift-infra/auto-csr-approver-29567006-jlkgl" podUID="c3cf60a6-49cd-43e9-982b-4673db42fde6" Mar 20 15:27:04 crc kubenswrapper[4779]: I0320 15:27:04.962400 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-54c8945948-686wn"] Mar 20 15:27:04 crc kubenswrapper[4779]: I0320 15:27:04.962604 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-54c8945948-686wn" podUID="750be46e-8f16-4777-943c-b84bcde78811" containerName="controller-manager" containerID="cri-o://cea05ff4749760ae30a129102730665ce127595a86d2088e89f4129f2309804c" gracePeriod=30 Mar 20 15:27:04 crc kubenswrapper[4779]: I0320 15:27:04.975564 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d577fc998-w645j"] Mar 20 15:27:04 crc kubenswrapper[4779]: I0320 15:27:04.976201 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7d577fc998-w645j" podUID="603311c0-d50e-4198-a880-2f934c57c2a6" containerName="route-controller-manager" containerID="cri-o://51a438aa9944f568ac7ca951dfd63f82b0d5e260fb8da4cdddf1baa2bf74b9c0" gracePeriod=30 Mar 20 15:27:06 crc kubenswrapper[4779]: I0320 15:27:06.504624 4779 generic.go:334] "Generic (PLEG): container finished" podID="603311c0-d50e-4198-a880-2f934c57c2a6" containerID="51a438aa9944f568ac7ca951dfd63f82b0d5e260fb8da4cdddf1baa2bf74b9c0" exitCode=0 Mar 20 15:27:06 crc kubenswrapper[4779]: I0320 15:27:06.504722 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d577fc998-w645j" event={"ID":"603311c0-d50e-4198-a880-2f934c57c2a6","Type":"ContainerDied","Data":"51a438aa9944f568ac7ca951dfd63f82b0d5e260fb8da4cdddf1baa2bf74b9c0"} Mar 20 15:27:06 crc kubenswrapper[4779]: I0320 15:27:06.506520 4779 generic.go:334] "Generic 
(PLEG): container finished" podID="750be46e-8f16-4777-943c-b84bcde78811" containerID="cea05ff4749760ae30a129102730665ce127595a86d2088e89f4129f2309804c" exitCode=0 Mar 20 15:27:06 crc kubenswrapper[4779]: I0320 15:27:06.506562 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54c8945948-686wn" event={"ID":"750be46e-8f16-4777-943c-b84bcde78811","Type":"ContainerDied","Data":"cea05ff4749760ae30a129102730665ce127595a86d2088e89f4129f2309804c"} Mar 20 15:27:07 crc kubenswrapper[4779]: I0320 15:27:07.510986 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:27:07 crc kubenswrapper[4779]: I0320 15:27:07.713664 4779 patch_prober.go:28] interesting pod/route-controller-manager-7d577fc998-w645j container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.49:8443/healthz\": dial tcp 10.217.0.49:8443: connect: connection refused" start-of-body= Mar 20 15:27:07 crc kubenswrapper[4779]: I0320 15:27:07.713713 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7d577fc998-w645j" podUID="603311c0-d50e-4198-a880-2f934c57c2a6" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.49:8443/healthz\": dial tcp 10.217.0.49:8443: connect: connection refused" Mar 20 15:27:07 crc kubenswrapper[4779]: I0320 15:27:07.733849 4779 patch_prober.go:28] interesting pod/controller-manager-54c8945948-686wn container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" start-of-body= Mar 20 15:27:07 crc kubenswrapper[4779]: I0320 15:27:07.733918 4779 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-controller-manager/controller-manager-54c8945948-686wn" podUID="750be46e-8f16-4777-943c-b84bcde78811" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" Mar 20 15:27:14 crc kubenswrapper[4779]: I0320 15:27:14.445166 4779 ???:1] "http: TLS handshake error from 192.168.126.11:51460: no serving certificate available for the kubelet" Mar 20 15:27:15 crc kubenswrapper[4779]: E0320 15:27:15.840685 4779 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 20 15:27:15 crc kubenswrapper[4779]: E0320 15:27:15.840853 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jldv4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-h9gg9_openshift-marketplace(626d4bed-62b8-47e0-9599-1c6f949a3d22): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 15:27:15 crc kubenswrapper[4779]: E0320 15:27:15.842005 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-h9gg9" podUID="626d4bed-62b8-47e0-9599-1c6f949a3d22" Mar 20 15:27:15 crc 
kubenswrapper[4779]: E0320 15:27:15.886963 4779 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 20 15:27:15 crc kubenswrapper[4779]: E0320 15:27:15.887313 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zhsch,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-mhgrq_openshift-marketplace(57ad9993-5b81-4487-8f70-37e41aca1678): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 15:27:15 crc kubenswrapper[4779]: E0320 15:27:15.888691 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-mhgrq" podUID="57ad9993-5b81-4487-8f70-37e41aca1678" Mar 20 15:27:17 crc kubenswrapper[4779]: E0320 15:27:17.096363 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-mhgrq" podUID="57ad9993-5b81-4487-8f70-37e41aca1678" Mar 20 15:27:17 crc kubenswrapper[4779]: E0320 15:27:17.096473 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-h9gg9" podUID="626d4bed-62b8-47e0-9599-1c6f949a3d22" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.142819 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-54c8945948-686wn" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.167458 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-57876b4876-frtn8"] Mar 20 15:27:17 crc kubenswrapper[4779]: E0320 15:27:17.167670 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef690f9-d7f9-439e-b1da-46c6dd17da15" containerName="pruner" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.167682 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef690f9-d7f9-439e-b1da-46c6dd17da15" containerName="pruner" Mar 20 15:27:17 crc kubenswrapper[4779]: E0320 15:27:17.167701 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="750be46e-8f16-4777-943c-b84bcde78811" containerName="controller-manager" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.167708 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="750be46e-8f16-4777-943c-b84bcde78811" containerName="controller-manager" Mar 20 15:27:17 crc kubenswrapper[4779]: E0320 15:27:17.167716 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33dc9ae1-4170-4306-a3a9-0fb6a498bc8d" containerName="pruner" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.167722 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="33dc9ae1-4170-4306-a3a9-0fb6a498bc8d" containerName="pruner" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.167807 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="33dc9ae1-4170-4306-a3a9-0fb6a498bc8d" containerName="pruner" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.167821 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="750be46e-8f16-4777-943c-b84bcde78811" containerName="controller-manager" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.167827 4779 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6ef690f9-d7f9-439e-b1da-46c6dd17da15" containerName="pruner" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.168848 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-57876b4876-frtn8" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.175588 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-57876b4876-frtn8"] Mar 20 15:27:17 crc kubenswrapper[4779]: E0320 15:27:17.183519 4779 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 15:27:17 crc kubenswrapper[4779]: E0320 15:27:17.183642 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l28wf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-khpjc_openshift-marketplace(6039db56-150f-48d0-84a3-511c6a67daad): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 15:27:17 crc kubenswrapper[4779]: E0320 15:27:17.184698 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-khpjc" podUID="6039db56-150f-48d0-84a3-511c6a67daad" Mar 20 15:27:17 crc 
kubenswrapper[4779]: E0320 15:27:17.223019 4779 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 15:27:17 crc kubenswrapper[4779]: E0320 15:27:17.223196 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7qfpp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-qtlz2_openshift-marketplace(f20db228-a34a-4734-ae22-53cd86de06ed): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 15:27:17 crc kubenswrapper[4779]: E0320 15:27:17.224394 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-qtlz2" podUID="f20db228-a34a-4734-ae22-53cd86de06ed" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.292607 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7sx5\" (UniqueName: \"kubernetes.io/projected/750be46e-8f16-4777-943c-b84bcde78811-kube-api-access-q7sx5\") pod \"750be46e-8f16-4777-943c-b84bcde78811\" (UID: \"750be46e-8f16-4777-943c-b84bcde78811\") " Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.293003 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/750be46e-8f16-4777-943c-b84bcde78811-serving-cert\") pod \"750be46e-8f16-4777-943c-b84bcde78811\" (UID: \"750be46e-8f16-4777-943c-b84bcde78811\") " Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.293057 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/750be46e-8f16-4777-943c-b84bcde78811-client-ca\") pod \"750be46e-8f16-4777-943c-b84bcde78811\" (UID: \"750be46e-8f16-4777-943c-b84bcde78811\") " Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.293083 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/750be46e-8f16-4777-943c-b84bcde78811-proxy-ca-bundles\") pod 
\"750be46e-8f16-4777-943c-b84bcde78811\" (UID: \"750be46e-8f16-4777-943c-b84bcde78811\") " Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.293318 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/750be46e-8f16-4777-943c-b84bcde78811-config\") pod \"750be46e-8f16-4777-943c-b84bcde78811\" (UID: \"750be46e-8f16-4777-943c-b84bcde78811\") " Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.293562 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4dca838-94ad-4765-8320-92fbd318a6e9-client-ca\") pod \"controller-manager-57876b4876-frtn8\" (UID: \"d4dca838-94ad-4765-8320-92fbd318a6e9\") " pod="openshift-controller-manager/controller-manager-57876b4876-frtn8" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.293598 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4dca838-94ad-4765-8320-92fbd318a6e9-config\") pod \"controller-manager-57876b4876-frtn8\" (UID: \"d4dca838-94ad-4765-8320-92fbd318a6e9\") " pod="openshift-controller-manager/controller-manager-57876b4876-frtn8" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.293656 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4dca838-94ad-4765-8320-92fbd318a6e9-serving-cert\") pod \"controller-manager-57876b4876-frtn8\" (UID: \"d4dca838-94ad-4765-8320-92fbd318a6e9\") " pod="openshift-controller-manager/controller-manager-57876b4876-frtn8" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.293661 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/750be46e-8f16-4777-943c-b84bcde78811-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod 
"750be46e-8f16-4777-943c-b84bcde78811" (UID: "750be46e-8f16-4777-943c-b84bcde78811"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.293676 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/750be46e-8f16-4777-943c-b84bcde78811-client-ca" (OuterVolumeSpecName: "client-ca") pod "750be46e-8f16-4777-943c-b84bcde78811" (UID: "750be46e-8f16-4777-943c-b84bcde78811"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.293863 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d4dca838-94ad-4765-8320-92fbd318a6e9-proxy-ca-bundles\") pod \"controller-manager-57876b4876-frtn8\" (UID: \"d4dca838-94ad-4765-8320-92fbd318a6e9\") " pod="openshift-controller-manager/controller-manager-57876b4876-frtn8" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.293966 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t4hk\" (UniqueName: \"kubernetes.io/projected/d4dca838-94ad-4765-8320-92fbd318a6e9-kube-api-access-9t4hk\") pod \"controller-manager-57876b4876-frtn8\" (UID: \"d4dca838-94ad-4765-8320-92fbd318a6e9\") " pod="openshift-controller-manager/controller-manager-57876b4876-frtn8" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.294029 4779 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/750be46e-8f16-4777-943c-b84bcde78811-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.294042 4779 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/750be46e-8f16-4777-943c-b84bcde78811-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.294148 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/750be46e-8f16-4777-943c-b84bcde78811-config" (OuterVolumeSpecName: "config") pod "750be46e-8f16-4777-943c-b84bcde78811" (UID: "750be46e-8f16-4777-943c-b84bcde78811"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.297665 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/750be46e-8f16-4777-943c-b84bcde78811-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "750be46e-8f16-4777-943c-b84bcde78811" (UID: "750be46e-8f16-4777-943c-b84bcde78811"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.299237 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/750be46e-8f16-4777-943c-b84bcde78811-kube-api-access-q7sx5" (OuterVolumeSpecName: "kube-api-access-q7sx5") pod "750be46e-8f16-4777-943c-b84bcde78811" (UID: "750be46e-8f16-4777-943c-b84bcde78811"). InnerVolumeSpecName "kube-api-access-q7sx5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.395171 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4dca838-94ad-4765-8320-92fbd318a6e9-client-ca\") pod \"controller-manager-57876b4876-frtn8\" (UID: \"d4dca838-94ad-4765-8320-92fbd318a6e9\") " pod="openshift-controller-manager/controller-manager-57876b4876-frtn8" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.395229 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4dca838-94ad-4765-8320-92fbd318a6e9-config\") pod \"controller-manager-57876b4876-frtn8\" (UID: \"d4dca838-94ad-4765-8320-92fbd318a6e9\") " pod="openshift-controller-manager/controller-manager-57876b4876-frtn8" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.395256 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4dca838-94ad-4765-8320-92fbd318a6e9-serving-cert\") pod \"controller-manager-57876b4876-frtn8\" (UID: \"d4dca838-94ad-4765-8320-92fbd318a6e9\") " pod="openshift-controller-manager/controller-manager-57876b4876-frtn8" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.395292 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d4dca838-94ad-4765-8320-92fbd318a6e9-proxy-ca-bundles\") pod \"controller-manager-57876b4876-frtn8\" (UID: \"d4dca838-94ad-4765-8320-92fbd318a6e9\") " pod="openshift-controller-manager/controller-manager-57876b4876-frtn8" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.395344 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t4hk\" (UniqueName: \"kubernetes.io/projected/d4dca838-94ad-4765-8320-92fbd318a6e9-kube-api-access-9t4hk\") pod 
\"controller-manager-57876b4876-frtn8\" (UID: \"d4dca838-94ad-4765-8320-92fbd318a6e9\") " pod="openshift-controller-manager/controller-manager-57876b4876-frtn8" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.395380 4779 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/750be46e-8f16-4777-943c-b84bcde78811-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.395392 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/750be46e-8f16-4777-943c-b84bcde78811-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.395401 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7sx5\" (UniqueName: \"kubernetes.io/projected/750be46e-8f16-4777-943c-b84bcde78811-kube-api-access-q7sx5\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.396077 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4dca838-94ad-4765-8320-92fbd318a6e9-client-ca\") pod \"controller-manager-57876b4876-frtn8\" (UID: \"d4dca838-94ad-4765-8320-92fbd318a6e9\") " pod="openshift-controller-manager/controller-manager-57876b4876-frtn8" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.396990 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4dca838-94ad-4765-8320-92fbd318a6e9-config\") pod \"controller-manager-57876b4876-frtn8\" (UID: \"d4dca838-94ad-4765-8320-92fbd318a6e9\") " pod="openshift-controller-manager/controller-manager-57876b4876-frtn8" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.397145 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/d4dca838-94ad-4765-8320-92fbd318a6e9-proxy-ca-bundles\") pod \"controller-manager-57876b4876-frtn8\" (UID: \"d4dca838-94ad-4765-8320-92fbd318a6e9\") " pod="openshift-controller-manager/controller-manager-57876b4876-frtn8" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.399787 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4dca838-94ad-4765-8320-92fbd318a6e9-serving-cert\") pod \"controller-manager-57876b4876-frtn8\" (UID: \"d4dca838-94ad-4765-8320-92fbd318a6e9\") " pod="openshift-controller-manager/controller-manager-57876b4876-frtn8" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.418758 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t4hk\" (UniqueName: \"kubernetes.io/projected/d4dca838-94ad-4765-8320-92fbd318a6e9-kube-api-access-9t4hk\") pod \"controller-manager-57876b4876-frtn8\" (UID: \"d4dca838-94ad-4765-8320-92fbd318a6e9\") " pod="openshift-controller-manager/controller-manager-57876b4876-frtn8" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.490047 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-57876b4876-frtn8" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.560616 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54c8945948-686wn" event={"ID":"750be46e-8f16-4777-943c-b84bcde78811","Type":"ContainerDied","Data":"e24f8b437eb2dd1d85681eba74fd2a78e367de9485fce76078942497bc12ec82"} Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.560672 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-54c8945948-686wn" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.560679 4779 scope.go:117] "RemoveContainer" containerID="cea05ff4749760ae30a129102730665ce127595a86d2088e89f4129f2309804c" Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.613606 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-54c8945948-686wn"] Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.616572 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-54c8945948-686wn"] Mar 20 15:27:17 crc kubenswrapper[4779]: I0320 15:27:17.819082 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="750be46e-8f16-4777-943c-b84bcde78811" path="/var/lib/kubelet/pods/750be46e-8f16-4777-943c-b84bcde78811/volumes" Mar 20 15:27:18 crc kubenswrapper[4779]: E0320 15:27:18.571304 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-qtlz2" podUID="f20db228-a34a-4734-ae22-53cd86de06ed" Mar 20 15:27:18 crc kubenswrapper[4779]: E0320 15:27:18.571307 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-khpjc" podUID="6039db56-150f-48d0-84a3-511c6a67daad" Mar 20 15:27:18 crc kubenswrapper[4779]: I0320 15:27:18.632208 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d577fc998-w645j" Mar 20 15:27:18 crc kubenswrapper[4779]: I0320 15:27:18.713773 4779 patch_prober.go:28] interesting pod/route-controller-manager-7d577fc998-w645j container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.49:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 15:27:18 crc kubenswrapper[4779]: I0320 15:27:18.713848 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7d577fc998-w645j" podUID="603311c0-d50e-4198-a880-2f934c57c2a6" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.49:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 15:27:18 crc kubenswrapper[4779]: I0320 15:27:18.813944 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/603311c0-d50e-4198-a880-2f934c57c2a6-client-ca\") pod \"603311c0-d50e-4198-a880-2f934c57c2a6\" (UID: \"603311c0-d50e-4198-a880-2f934c57c2a6\") " Mar 20 15:27:18 crc kubenswrapper[4779]: I0320 15:27:18.814325 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/603311c0-d50e-4198-a880-2f934c57c2a6-config\") pod \"603311c0-d50e-4198-a880-2f934c57c2a6\" (UID: \"603311c0-d50e-4198-a880-2f934c57c2a6\") " Mar 20 15:27:18 crc kubenswrapper[4779]: I0320 15:27:18.814385 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gl7r6\" (UniqueName: \"kubernetes.io/projected/603311c0-d50e-4198-a880-2f934c57c2a6-kube-api-access-gl7r6\") pod 
\"603311c0-d50e-4198-a880-2f934c57c2a6\" (UID: \"603311c0-d50e-4198-a880-2f934c57c2a6\") " Mar 20 15:27:18 crc kubenswrapper[4779]: I0320 15:27:18.814422 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/603311c0-d50e-4198-a880-2f934c57c2a6-serving-cert\") pod \"603311c0-d50e-4198-a880-2f934c57c2a6\" (UID: \"603311c0-d50e-4198-a880-2f934c57c2a6\") " Mar 20 15:27:18 crc kubenswrapper[4779]: I0320 15:27:18.816219 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/603311c0-d50e-4198-a880-2f934c57c2a6-client-ca" (OuterVolumeSpecName: "client-ca") pod "603311c0-d50e-4198-a880-2f934c57c2a6" (UID: "603311c0-d50e-4198-a880-2f934c57c2a6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:27:18 crc kubenswrapper[4779]: I0320 15:27:18.816372 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/603311c0-d50e-4198-a880-2f934c57c2a6-config" (OuterVolumeSpecName: "config") pod "603311c0-d50e-4198-a880-2f934c57c2a6" (UID: "603311c0-d50e-4198-a880-2f934c57c2a6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:27:18 crc kubenswrapper[4779]: I0320 15:27:18.821002 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/603311c0-d50e-4198-a880-2f934c57c2a6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "603311c0-d50e-4198-a880-2f934c57c2a6" (UID: "603311c0-d50e-4198-a880-2f934c57c2a6"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:27:18 crc kubenswrapper[4779]: I0320 15:27:18.822246 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/603311c0-d50e-4198-a880-2f934c57c2a6-kube-api-access-gl7r6" (OuterVolumeSpecName: "kube-api-access-gl7r6") pod "603311c0-d50e-4198-a880-2f934c57c2a6" (UID: "603311c0-d50e-4198-a880-2f934c57c2a6"). InnerVolumeSpecName "kube-api-access-gl7r6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:27:18 crc kubenswrapper[4779]: I0320 15:27:18.916073 4779 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/603311c0-d50e-4198-a880-2f934c57c2a6-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:18 crc kubenswrapper[4779]: I0320 15:27:18.916133 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/603311c0-d50e-4198-a880-2f934c57c2a6-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:18 crc kubenswrapper[4779]: I0320 15:27:18.916146 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gl7r6\" (UniqueName: \"kubernetes.io/projected/603311c0-d50e-4198-a880-2f934c57c2a6-kube-api-access-gl7r6\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:18 crc kubenswrapper[4779]: I0320 15:27:18.916159 4779 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/603311c0-d50e-4198-a880-2f934c57c2a6-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.098598 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-57876b4876-frtn8"] Mar 20 15:27:19 crc kubenswrapper[4779]: W0320 15:27:19.166675 4779 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4dca838_94ad_4765_8320_92fbd318a6e9.slice/crio-3bb6a860c11ae3acbbe9d845489be89c0973702f4222f8ffa394133126c3fa5e WatchSource:0}: Error finding container 3bb6a860c11ae3acbbe9d845489be89c0973702f4222f8ffa394133126c3fa5e: Status 404 returned error can't find the container with id 3bb6a860c11ae3acbbe9d845489be89c0973702f4222f8ffa394133126c3fa5e Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.370330 4779 csr.go:261] certificate signing request csr-8rgrg is approved, waiting to be issued Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.377063 4779 csr.go:257] certificate signing request csr-8rgrg is issued Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.402649 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76c5fd99f6-2l7j6"] Mar 20 15:27:19 crc kubenswrapper[4779]: E0320 15:27:19.402915 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="603311c0-d50e-4198-a880-2f934c57c2a6" containerName="route-controller-manager" Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.402941 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="603311c0-d50e-4198-a880-2f934c57c2a6" containerName="route-controller-manager" Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.403066 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="603311c0-d50e-4198-a880-2f934c57c2a6" containerName="route-controller-manager" Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.403527 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76c5fd99f6-2l7j6" Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.410945 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76c5fd99f6-2l7j6"] Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.522764 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdtm8\" (UniqueName: \"kubernetes.io/projected/c24412eb-d7b7-44a5-8ffb-2933842ffcfb-kube-api-access-hdtm8\") pod \"route-controller-manager-76c5fd99f6-2l7j6\" (UID: \"c24412eb-d7b7-44a5-8ffb-2933842ffcfb\") " pod="openshift-route-controller-manager/route-controller-manager-76c5fd99f6-2l7j6" Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.523202 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c24412eb-d7b7-44a5-8ffb-2933842ffcfb-client-ca\") pod \"route-controller-manager-76c5fd99f6-2l7j6\" (UID: \"c24412eb-d7b7-44a5-8ffb-2933842ffcfb\") " pod="openshift-route-controller-manager/route-controller-manager-76c5fd99f6-2l7j6" Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.523282 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c24412eb-d7b7-44a5-8ffb-2933842ffcfb-serving-cert\") pod \"route-controller-manager-76c5fd99f6-2l7j6\" (UID: \"c24412eb-d7b7-44a5-8ffb-2933842ffcfb\") " pod="openshift-route-controller-manager/route-controller-manager-76c5fd99f6-2l7j6" Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.523334 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c24412eb-d7b7-44a5-8ffb-2933842ffcfb-config\") pod \"route-controller-manager-76c5fd99f6-2l7j6\" (UID: 
\"c24412eb-d7b7-44a5-8ffb-2933842ffcfb\") " pod="openshift-route-controller-manager/route-controller-manager-76c5fd99f6-2l7j6" Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.571433 4779 generic.go:334] "Generic (PLEG): container finished" podID="cef14e53-c107-4bb0-85bc-40b8189fd346" containerID="f35cf6101b336d0e7cb5d44a7aa0ccfafe5100e7a2b403053e942a94ac990153" exitCode=0 Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.571494 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vmgpk" event={"ID":"cef14e53-c107-4bb0-85bc-40b8189fd346","Type":"ContainerDied","Data":"f35cf6101b336d0e7cb5d44a7aa0ccfafe5100e7a2b403053e942a94ac990153"} Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.574132 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57876b4876-frtn8" event={"ID":"d4dca838-94ad-4765-8320-92fbd318a6e9","Type":"ContainerStarted","Data":"a166d6261ca4c155997a2c85022b1d126266e50d6d03abdb931e9bf881fa21ad"} Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.574172 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57876b4876-frtn8" event={"ID":"d4dca838-94ad-4765-8320-92fbd318a6e9","Type":"ContainerStarted","Data":"3bb6a860c11ae3acbbe9d845489be89c0973702f4222f8ffa394133126c3fa5e"} Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.574894 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-57876b4876-frtn8" Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.577026 4779 generic.go:334] "Generic (PLEG): container finished" podID="c3cf60a6-49cd-43e9-982b-4673db42fde6" containerID="c227e950d444ebb9d9effe23ef9b32bb9dd082e5eeff994497f6ec54ff852ffd" exitCode=0 Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.577073 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29567006-jlkgl" event={"ID":"c3cf60a6-49cd-43e9-982b-4673db42fde6","Type":"ContainerDied","Data":"c227e950d444ebb9d9effe23ef9b32bb9dd082e5eeff994497f6ec54ff852ffd"} Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.580677 4779 generic.go:334] "Generic (PLEG): container finished" podID="442385a1-f11d-4412-8e24-3d6e49ec4930" containerID="7634ea0c9b6d7237cfcdf99e131848ac1c582b1f1cbb5486f762a5738b16d653" exitCode=0 Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.580742 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-25rn6" event={"ID":"442385a1-f11d-4412-8e24-3d6e49ec4930","Type":"ContainerDied","Data":"7634ea0c9b6d7237cfcdf99e131848ac1c582b1f1cbb5486f762a5738b16d653"} Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.582582 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d577fc998-w645j" Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.582579 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d577fc998-w645j" event={"ID":"603311c0-d50e-4198-a880-2f934c57c2a6","Type":"ContainerDied","Data":"3f91ee79c08139306b819f21f1628127384fb84cb1b8084c84b554fb537b65dc"} Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.582703 4779 scope.go:117] "RemoveContainer" containerID="51a438aa9944f568ac7ca951dfd63f82b0d5e260fb8da4cdddf1baa2bf74b9c0" Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.592806 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-57876b4876-frtn8" Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.593502 4779 generic.go:334] "Generic (PLEG): container finished" podID="437c3121-8b7a-48b1-b805-540a41e89b6a" containerID="857e7a7c0ea0f9c505163802912d203e03b38152573ab21887b690f58b81dff6" 
exitCode=0 Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.593565 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tqlmj" event={"ID":"437c3121-8b7a-48b1-b805-540a41e89b6a","Type":"ContainerDied","Data":"857e7a7c0ea0f9c505163802912d203e03b38152573ab21887b690f58b81dff6"} Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.599649 4779 generic.go:334] "Generic (PLEG): container finished" podID="23f0a6ab-6ded-4310-82ff-98ea45fa3a43" containerID="369cbc5469819bab07d373b2755027f35e52e9c0aaaf4710cad9bb52dcde306d" exitCode=0 Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.599702 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4fphd" event={"ID":"23f0a6ab-6ded-4310-82ff-98ea45fa3a43","Type":"ContainerDied","Data":"369cbc5469819bab07d373b2755027f35e52e9c0aaaf4710cad9bb52dcde306d"} Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.625174 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdtm8\" (UniqueName: \"kubernetes.io/projected/c24412eb-d7b7-44a5-8ffb-2933842ffcfb-kube-api-access-hdtm8\") pod \"route-controller-manager-76c5fd99f6-2l7j6\" (UID: \"c24412eb-d7b7-44a5-8ffb-2933842ffcfb\") " pod="openshift-route-controller-manager/route-controller-manager-76c5fd99f6-2l7j6" Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.625234 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c24412eb-d7b7-44a5-8ffb-2933842ffcfb-client-ca\") pod \"route-controller-manager-76c5fd99f6-2l7j6\" (UID: \"c24412eb-d7b7-44a5-8ffb-2933842ffcfb\") " pod="openshift-route-controller-manager/route-controller-manager-76c5fd99f6-2l7j6" Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.625294 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c24412eb-d7b7-44a5-8ffb-2933842ffcfb-serving-cert\") pod \"route-controller-manager-76c5fd99f6-2l7j6\" (UID: \"c24412eb-d7b7-44a5-8ffb-2933842ffcfb\") " pod="openshift-route-controller-manager/route-controller-manager-76c5fd99f6-2l7j6" Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.625334 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c24412eb-d7b7-44a5-8ffb-2933842ffcfb-config\") pod \"route-controller-manager-76c5fd99f6-2l7j6\" (UID: \"c24412eb-d7b7-44a5-8ffb-2933842ffcfb\") " pod="openshift-route-controller-manager/route-controller-manager-76c5fd99f6-2l7j6" Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.626504 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c24412eb-d7b7-44a5-8ffb-2933842ffcfb-client-ca\") pod \"route-controller-manager-76c5fd99f6-2l7j6\" (UID: \"c24412eb-d7b7-44a5-8ffb-2933842ffcfb\") " pod="openshift-route-controller-manager/route-controller-manager-76c5fd99f6-2l7j6" Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.626647 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c24412eb-d7b7-44a5-8ffb-2933842ffcfb-config\") pod \"route-controller-manager-76c5fd99f6-2l7j6\" (UID: \"c24412eb-d7b7-44a5-8ffb-2933842ffcfb\") " pod="openshift-route-controller-manager/route-controller-manager-76c5fd99f6-2l7j6" Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.645876 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c24412eb-d7b7-44a5-8ffb-2933842ffcfb-serving-cert\") pod \"route-controller-manager-76c5fd99f6-2l7j6\" (UID: \"c24412eb-d7b7-44a5-8ffb-2933842ffcfb\") " pod="openshift-route-controller-manager/route-controller-manager-76c5fd99f6-2l7j6" Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 
15:27:19.648596 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdtm8\" (UniqueName: \"kubernetes.io/projected/c24412eb-d7b7-44a5-8ffb-2933842ffcfb-kube-api-access-hdtm8\") pod \"route-controller-manager-76c5fd99f6-2l7j6\" (UID: \"c24412eb-d7b7-44a5-8ffb-2933842ffcfb\") " pod="openshift-route-controller-manager/route-controller-manager-76c5fd99f6-2l7j6" Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.651951 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-57876b4876-frtn8" podStartSLOduration=15.651930811 podStartE2EDuration="15.651930811s" podCreationTimestamp="2026-03-20 15:27:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:27:19.650068185 +0000 UTC m=+256.612583985" watchObservedRunningTime="2026-03-20 15:27:19.651930811 +0000 UTC m=+256.614446611" Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.665540 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d577fc998-w645j"] Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.669731 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d577fc998-w645j"] Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.765674 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76c5fd99f6-2l7j6" Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.825622 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="603311c0-d50e-4198-a880-2f934c57c2a6" path="/var/lib/kubelet/pods/603311c0-d50e-4198-a880-2f934c57c2a6/volumes" Mar 20 15:27:19 crc kubenswrapper[4779]: I0320 15:27:19.975619 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76c5fd99f6-2l7j6"] Mar 20 15:27:19 crc kubenswrapper[4779]: W0320 15:27:19.984009 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc24412eb_d7b7_44a5_8ffb_2933842ffcfb.slice/crio-0430f7f79286e12390d5e48e7a7d7fc1cc8ee6688568177517f761a6a1827b55 WatchSource:0}: Error finding container 0430f7f79286e12390d5e48e7a7d7fc1cc8ee6688568177517f761a6a1827b55: Status 404 returned error can't find the container with id 0430f7f79286e12390d5e48e7a7d7fc1cc8ee6688568177517f761a6a1827b55 Mar 20 15:27:20 crc kubenswrapper[4779]: I0320 15:27:20.201126 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rq8f8" Mar 20 15:27:20 crc kubenswrapper[4779]: I0320 15:27:20.379022 4779 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-26 07:51:27.757532254 +0000 UTC Mar 20 15:27:20 crc kubenswrapper[4779]: I0320 15:27:20.379065 4779 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6016h24m7.378471914s for next certificate rotation Mar 20 15:27:20 crc kubenswrapper[4779]: I0320 15:27:20.606940 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76c5fd99f6-2l7j6" 
event={"ID":"c24412eb-d7b7-44a5-8ffb-2933842ffcfb","Type":"ContainerStarted","Data":"0a8447477f86b9ea3d0c5c84cdb16ce263155b4654189e68e6dddcc40e3342b7"} Mar 20 15:27:20 crc kubenswrapper[4779]: I0320 15:27:20.607311 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76c5fd99f6-2l7j6" event={"ID":"c24412eb-d7b7-44a5-8ffb-2933842ffcfb","Type":"ContainerStarted","Data":"0430f7f79286e12390d5e48e7a7d7fc1cc8ee6688568177517f761a6a1827b55"} Mar 20 15:27:20 crc kubenswrapper[4779]: I0320 15:27:20.607332 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-76c5fd99f6-2l7j6" Mar 20 15:27:20 crc kubenswrapper[4779]: I0320 15:27:20.610337 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4fphd" event={"ID":"23f0a6ab-6ded-4310-82ff-98ea45fa3a43","Type":"ContainerStarted","Data":"29c7eb606996f1f12664c7b3fa6b1a4244ae2de9775927b7ab019a27c02e9c5d"} Mar 20 15:27:20 crc kubenswrapper[4779]: I0320 15:27:20.612645 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-76c5fd99f6-2l7j6" Mar 20 15:27:20 crc kubenswrapper[4779]: I0320 15:27:20.614857 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-25rn6" event={"ID":"442385a1-f11d-4412-8e24-3d6e49ec4930","Type":"ContainerStarted","Data":"0840e64ea2c665c7f4514b3845a031d7f85bea2303dee328ad5fd364d14029da"} Mar 20 15:27:20 crc kubenswrapper[4779]: I0320 15:27:20.625713 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-76c5fd99f6-2l7j6" podStartSLOduration=16.625696171 podStartE2EDuration="16.625696171s" podCreationTimestamp="2026-03-20 15:27:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:27:20.62565456 +0000 UTC m=+257.588170360" watchObservedRunningTime="2026-03-20 15:27:20.625696171 +0000 UTC m=+257.588211971" Mar 20 15:27:20 crc kubenswrapper[4779]: I0320 15:27:20.653174 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4fphd" podStartSLOduration=11.145031404 podStartE2EDuration="32.65315056s" podCreationTimestamp="2026-03-20 15:26:48 +0000 UTC" firstStartedPulling="2026-03-20 15:26:58.55731027 +0000 UTC m=+235.519826080" lastFinishedPulling="2026-03-20 15:27:20.065429436 +0000 UTC m=+257.027945236" observedRunningTime="2026-03-20 15:27:20.651436727 +0000 UTC m=+257.613952537" watchObservedRunningTime="2026-03-20 15:27:20.65315056 +0000 UTC m=+257.615666370" Mar 20 15:27:20 crc kubenswrapper[4779]: I0320 15:27:20.684826 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-25rn6" podStartSLOduration=11.202051657 podStartE2EDuration="32.684810136s" podCreationTimestamp="2026-03-20 15:26:48 +0000 UTC" firstStartedPulling="2026-03-20 15:26:58.557946046 +0000 UTC m=+235.520461846" lastFinishedPulling="2026-03-20 15:27:20.040704525 +0000 UTC m=+257.003220325" observedRunningTime="2026-03-20 15:27:20.681268427 +0000 UTC m=+257.643784227" watchObservedRunningTime="2026-03-20 15:27:20.684810136 +0000 UTC m=+257.647325936" Mar 20 15:27:20 crc kubenswrapper[4779]: I0320 15:27:20.881062 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567006-jlkgl" Mar 20 15:27:20 crc kubenswrapper[4779]: I0320 15:27:20.960044 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 15:27:20 crc kubenswrapper[4779]: E0320 15:27:20.960347 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3cf60a6-49cd-43e9-982b-4673db42fde6" containerName="oc" Mar 20 15:27:20 crc kubenswrapper[4779]: I0320 15:27:20.960366 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3cf60a6-49cd-43e9-982b-4673db42fde6" containerName="oc" Mar 20 15:27:20 crc kubenswrapper[4779]: I0320 15:27:20.960489 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3cf60a6-49cd-43e9-982b-4673db42fde6" containerName="oc" Mar 20 15:27:20 crc kubenswrapper[4779]: I0320 15:27:20.960929 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 15:27:20 crc kubenswrapper[4779]: I0320 15:27:20.965177 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 15:27:20 crc kubenswrapper[4779]: I0320 15:27:20.967647 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 15:27:20 crc kubenswrapper[4779]: I0320 15:27:20.971013 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 15:27:21 crc kubenswrapper[4779]: I0320 15:27:21.051799 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4j5kn\" (UniqueName: \"kubernetes.io/projected/c3cf60a6-49cd-43e9-982b-4673db42fde6-kube-api-access-4j5kn\") pod \"c3cf60a6-49cd-43e9-982b-4673db42fde6\" (UID: \"c3cf60a6-49cd-43e9-982b-4673db42fde6\") " Mar 20 15:27:21 crc kubenswrapper[4779]: I0320 15:27:21.066346 4779 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3cf60a6-49cd-43e9-982b-4673db42fde6-kube-api-access-4j5kn" (OuterVolumeSpecName: "kube-api-access-4j5kn") pod "c3cf60a6-49cd-43e9-982b-4673db42fde6" (UID: "c3cf60a6-49cd-43e9-982b-4673db42fde6"). InnerVolumeSpecName "kube-api-access-4j5kn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:27:21 crc kubenswrapper[4779]: I0320 15:27:21.153478 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a117f89d-88f3-476f-b211-5a03754ddfc2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a117f89d-88f3-476f-b211-5a03754ddfc2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 15:27:21 crc kubenswrapper[4779]: I0320 15:27:21.153594 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a117f89d-88f3-476f-b211-5a03754ddfc2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a117f89d-88f3-476f-b211-5a03754ddfc2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 15:27:21 crc kubenswrapper[4779]: I0320 15:27:21.153686 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4j5kn\" (UniqueName: \"kubernetes.io/projected/c3cf60a6-49cd-43e9-982b-4673db42fde6-kube-api-access-4j5kn\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:21 crc kubenswrapper[4779]: I0320 15:27:21.254553 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a117f89d-88f3-476f-b211-5a03754ddfc2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a117f89d-88f3-476f-b211-5a03754ddfc2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 15:27:21 crc kubenswrapper[4779]: I0320 15:27:21.254691 4779 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a117f89d-88f3-476f-b211-5a03754ddfc2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a117f89d-88f3-476f-b211-5a03754ddfc2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 15:27:21 crc kubenswrapper[4779]: I0320 15:27:21.254837 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a117f89d-88f3-476f-b211-5a03754ddfc2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a117f89d-88f3-476f-b211-5a03754ddfc2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 15:27:21 crc kubenswrapper[4779]: I0320 15:27:21.278414 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a117f89d-88f3-476f-b211-5a03754ddfc2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a117f89d-88f3-476f-b211-5a03754ddfc2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 15:27:21 crc kubenswrapper[4779]: I0320 15:27:21.285725 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 15:27:21 crc kubenswrapper[4779]: I0320 15:27:21.379874 4779 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-17 12:36:06.453013851 +0000 UTC Mar 20 15:27:21 crc kubenswrapper[4779]: I0320 15:27:21.379916 4779 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 5805h8m45.073100221s for next certificate rotation Mar 20 15:27:21 crc kubenswrapper[4779]: I0320 15:27:21.627995 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567006-jlkgl" Mar 20 15:27:21 crc kubenswrapper[4779]: I0320 15:27:21.628169 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567006-jlkgl" event={"ID":"c3cf60a6-49cd-43e9-982b-4673db42fde6","Type":"ContainerDied","Data":"b48aee491b3ed31df43eadf5a3fd0bcb550d8b60c3785c3cbf3a99fa6234ef26"} Mar 20 15:27:21 crc kubenswrapper[4779]: I0320 15:27:21.628352 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b48aee491b3ed31df43eadf5a3fd0bcb550d8b60c3785c3cbf3a99fa6234ef26" Mar 20 15:27:21 crc kubenswrapper[4779]: I0320 15:27:21.735965 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 15:27:21 crc kubenswrapper[4779]: W0320 15:27:21.755674 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda117f89d_88f3_476f_b211_5a03754ddfc2.slice/crio-3ac20c7891be072912d1b2aab87ea52880251e91282b0c3955b15a1f1faaecda WatchSource:0}: Error finding container 3ac20c7891be072912d1b2aab87ea52880251e91282b0c3955b15a1f1faaecda: Status 404 returned error can't find the container with id 3ac20c7891be072912d1b2aab87ea52880251e91282b0c3955b15a1f1faaecda Mar 20 15:27:22 crc kubenswrapper[4779]: I0320 15:27:22.635192 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a117f89d-88f3-476f-b211-5a03754ddfc2","Type":"ContainerStarted","Data":"a43bc141cb8a3a53915cc4dfb1126e64c6a9b5edf0374712cdba41fbd0135f83"} Mar 20 15:27:22 crc kubenswrapper[4779]: I0320 15:27:22.635506 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a117f89d-88f3-476f-b211-5a03754ddfc2","Type":"ContainerStarted","Data":"3ac20c7891be072912d1b2aab87ea52880251e91282b0c3955b15a1f1faaecda"} Mar 20 15:27:22 crc kubenswrapper[4779]: I0320 
15:27:22.648334 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.648314755 podStartE2EDuration="2.648314755s" podCreationTimestamp="2026-03-20 15:27:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:27:22.646956751 +0000 UTC m=+259.609472551" watchObservedRunningTime="2026-03-20 15:27:22.648314755 +0000 UTC m=+259.610830555" Mar 20 15:27:23 crc kubenswrapper[4779]: I0320 15:27:23.640290 4779 generic.go:334] "Generic (PLEG): container finished" podID="a117f89d-88f3-476f-b211-5a03754ddfc2" containerID="a43bc141cb8a3a53915cc4dfb1126e64c6a9b5edf0374712cdba41fbd0135f83" exitCode=0 Mar 20 15:27:23 crc kubenswrapper[4779]: I0320 15:27:23.640357 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a117f89d-88f3-476f-b211-5a03754ddfc2","Type":"ContainerDied","Data":"a43bc141cb8a3a53915cc4dfb1126e64c6a9b5edf0374712cdba41fbd0135f83"} Mar 20 15:27:24 crc kubenswrapper[4779]: I0320 15:27:24.984334 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.017640 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-57876b4876-frtn8"] Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.017967 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-57876b4876-frtn8" podUID="d4dca838-94ad-4765-8320-92fbd318a6e9" containerName="controller-manager" containerID="cri-o://a166d6261ca4c155997a2c85022b1d126266e50d6d03abdb931e9bf881fa21ad" gracePeriod=30 Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.029091 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a117f89d-88f3-476f-b211-5a03754ddfc2-kube-api-access\") pod \"a117f89d-88f3-476f-b211-5a03754ddfc2\" (UID: \"a117f89d-88f3-476f-b211-5a03754ddfc2\") " Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.029206 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a117f89d-88f3-476f-b211-5a03754ddfc2-kubelet-dir\") pod \"a117f89d-88f3-476f-b211-5a03754ddfc2\" (UID: \"a117f89d-88f3-476f-b211-5a03754ddfc2\") " Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.029280 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a117f89d-88f3-476f-b211-5a03754ddfc2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a117f89d-88f3-476f-b211-5a03754ddfc2" (UID: "a117f89d-88f3-476f-b211-5a03754ddfc2"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.029464 4779 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a117f89d-88f3-476f-b211-5a03754ddfc2-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.046336 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a117f89d-88f3-476f-b211-5a03754ddfc2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a117f89d-88f3-476f-b211-5a03754ddfc2" (UID: "a117f89d-88f3-476f-b211-5a03754ddfc2"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.104329 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76c5fd99f6-2l7j6"] Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.104553 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-76c5fd99f6-2l7j6" podUID="c24412eb-d7b7-44a5-8ffb-2933842ffcfb" containerName="route-controller-manager" containerID="cri-o://0a8447477f86b9ea3d0c5c84cdb16ce263155b4654189e68e6dddcc40e3342b7" gracePeriod=30 Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.130236 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a117f89d-88f3-476f-b211-5a03754ddfc2-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.158517 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= 
Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.158579 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:27:25 crc kubenswrapper[4779]: E0320 15:27:25.217306 4779 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc24412eb_d7b7_44a5_8ffb_2933842ffcfb.slice/crio-conmon-0a8447477f86b9ea3d0c5c84cdb16ce263155b4654189e68e6dddcc40e3342b7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc24412eb_d7b7_44a5_8ffb_2933842ffcfb.slice/crio-0a8447477f86b9ea3d0c5c84cdb16ce263155b4654189e68e6dddcc40e3342b7.scope\": RecentStats: unable to find data in memory cache]" Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.526774 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76c5fd99f6-2l7j6" Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.538124 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c24412eb-d7b7-44a5-8ffb-2933842ffcfb-client-ca\") pod \"c24412eb-d7b7-44a5-8ffb-2933842ffcfb\" (UID: \"c24412eb-d7b7-44a5-8ffb-2933842ffcfb\") " Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.538213 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdtm8\" (UniqueName: \"kubernetes.io/projected/c24412eb-d7b7-44a5-8ffb-2933842ffcfb-kube-api-access-hdtm8\") pod \"c24412eb-d7b7-44a5-8ffb-2933842ffcfb\" (UID: \"c24412eb-d7b7-44a5-8ffb-2933842ffcfb\") " Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.538251 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c24412eb-d7b7-44a5-8ffb-2933842ffcfb-serving-cert\") pod \"c24412eb-d7b7-44a5-8ffb-2933842ffcfb\" (UID: \"c24412eb-d7b7-44a5-8ffb-2933842ffcfb\") " Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.538299 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c24412eb-d7b7-44a5-8ffb-2933842ffcfb-config\") pod \"c24412eb-d7b7-44a5-8ffb-2933842ffcfb\" (UID: \"c24412eb-d7b7-44a5-8ffb-2933842ffcfb\") " Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.538968 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c24412eb-d7b7-44a5-8ffb-2933842ffcfb-client-ca" (OuterVolumeSpecName: "client-ca") pod "c24412eb-d7b7-44a5-8ffb-2933842ffcfb" (UID: "c24412eb-d7b7-44a5-8ffb-2933842ffcfb"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.539134 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c24412eb-d7b7-44a5-8ffb-2933842ffcfb-config" (OuterVolumeSpecName: "config") pod "c24412eb-d7b7-44a5-8ffb-2933842ffcfb" (UID: "c24412eb-d7b7-44a5-8ffb-2933842ffcfb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.543583 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c24412eb-d7b7-44a5-8ffb-2933842ffcfb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c24412eb-d7b7-44a5-8ffb-2933842ffcfb" (UID: "c24412eb-d7b7-44a5-8ffb-2933842ffcfb"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.546978 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c24412eb-d7b7-44a5-8ffb-2933842ffcfb-kube-api-access-hdtm8" (OuterVolumeSpecName: "kube-api-access-hdtm8") pod "c24412eb-d7b7-44a5-8ffb-2933842ffcfb" (UID: "c24412eb-d7b7-44a5-8ffb-2933842ffcfb"). InnerVolumeSpecName "kube-api-access-hdtm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.572959 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-57876b4876-frtn8" Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.639711 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4dca838-94ad-4765-8320-92fbd318a6e9-config\") pod \"d4dca838-94ad-4765-8320-92fbd318a6e9\" (UID: \"d4dca838-94ad-4765-8320-92fbd318a6e9\") " Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.639756 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d4dca838-94ad-4765-8320-92fbd318a6e9-proxy-ca-bundles\") pod \"d4dca838-94ad-4765-8320-92fbd318a6e9\" (UID: \"d4dca838-94ad-4765-8320-92fbd318a6e9\") " Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.639813 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4dca838-94ad-4765-8320-92fbd318a6e9-client-ca\") pod \"d4dca838-94ad-4765-8320-92fbd318a6e9\" (UID: \"d4dca838-94ad-4765-8320-92fbd318a6e9\") " Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.639866 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4dca838-94ad-4765-8320-92fbd318a6e9-serving-cert\") pod \"d4dca838-94ad-4765-8320-92fbd318a6e9\" (UID: \"d4dca838-94ad-4765-8320-92fbd318a6e9\") " Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.639908 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9t4hk\" (UniqueName: \"kubernetes.io/projected/d4dca838-94ad-4765-8320-92fbd318a6e9-kube-api-access-9t4hk\") pod \"d4dca838-94ad-4765-8320-92fbd318a6e9\" (UID: \"d4dca838-94ad-4765-8320-92fbd318a6e9\") " Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.640201 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdtm8\" 
(UniqueName: \"kubernetes.io/projected/c24412eb-d7b7-44a5-8ffb-2933842ffcfb-kube-api-access-hdtm8\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.640222 4779 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c24412eb-d7b7-44a5-8ffb-2933842ffcfb-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.640234 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c24412eb-d7b7-44a5-8ffb-2933842ffcfb-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.640253 4779 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c24412eb-d7b7-44a5-8ffb-2933842ffcfb-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.641372 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4dca838-94ad-4765-8320-92fbd318a6e9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d4dca838-94ad-4765-8320-92fbd318a6e9" (UID: "d4dca838-94ad-4765-8320-92fbd318a6e9"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.642000 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4dca838-94ad-4765-8320-92fbd318a6e9-config" (OuterVolumeSpecName: "config") pod "d4dca838-94ad-4765-8320-92fbd318a6e9" (UID: "d4dca838-94ad-4765-8320-92fbd318a6e9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.643179 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4dca838-94ad-4765-8320-92fbd318a6e9-kube-api-access-9t4hk" (OuterVolumeSpecName: "kube-api-access-9t4hk") pod "d4dca838-94ad-4765-8320-92fbd318a6e9" (UID: "d4dca838-94ad-4765-8320-92fbd318a6e9"). InnerVolumeSpecName "kube-api-access-9t4hk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.644226 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4dca838-94ad-4765-8320-92fbd318a6e9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d4dca838-94ad-4765-8320-92fbd318a6e9" (UID: "d4dca838-94ad-4765-8320-92fbd318a6e9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.644490 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4dca838-94ad-4765-8320-92fbd318a6e9-client-ca" (OuterVolumeSpecName: "client-ca") pod "d4dca838-94ad-4765-8320-92fbd318a6e9" (UID: "d4dca838-94ad-4765-8320-92fbd318a6e9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.654953 4779 generic.go:334] "Generic (PLEG): container finished" podID="d4dca838-94ad-4765-8320-92fbd318a6e9" containerID="a166d6261ca4c155997a2c85022b1d126266e50d6d03abdb931e9bf881fa21ad" exitCode=0 Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.655034 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57876b4876-frtn8" event={"ID":"d4dca838-94ad-4765-8320-92fbd318a6e9","Type":"ContainerDied","Data":"a166d6261ca4c155997a2c85022b1d126266e50d6d03abdb931e9bf881fa21ad"} Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.655065 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57876b4876-frtn8" event={"ID":"d4dca838-94ad-4765-8320-92fbd318a6e9","Type":"ContainerDied","Data":"3bb6a860c11ae3acbbe9d845489be89c0973702f4222f8ffa394133126c3fa5e"} Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.655085 4779 scope.go:117] "RemoveContainer" containerID="a166d6261ca4c155997a2c85022b1d126266e50d6d03abdb931e9bf881fa21ad" Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.655196 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-57876b4876-frtn8" Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.659802 4779 generic.go:334] "Generic (PLEG): container finished" podID="c24412eb-d7b7-44a5-8ffb-2933842ffcfb" containerID="0a8447477f86b9ea3d0c5c84cdb16ce263155b4654189e68e6dddcc40e3342b7" exitCode=0 Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.659894 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76c5fd99f6-2l7j6" event={"ID":"c24412eb-d7b7-44a5-8ffb-2933842ffcfb","Type":"ContainerDied","Data":"0a8447477f86b9ea3d0c5c84cdb16ce263155b4654189e68e6dddcc40e3342b7"} Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.659926 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76c5fd99f6-2l7j6" event={"ID":"c24412eb-d7b7-44a5-8ffb-2933842ffcfb","Type":"ContainerDied","Data":"0430f7f79286e12390d5e48e7a7d7fc1cc8ee6688568177517f761a6a1827b55"} Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.659950 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76c5fd99f6-2l7j6" Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.663697 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a117f89d-88f3-476f-b211-5a03754ddfc2","Type":"ContainerDied","Data":"3ac20c7891be072912d1b2aab87ea52880251e91282b0c3955b15a1f1faaecda"} Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.663734 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ac20c7891be072912d1b2aab87ea52880251e91282b0c3955b15a1f1faaecda" Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.663807 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.687546 4779 scope.go:117] "RemoveContainer" containerID="a166d6261ca4c155997a2c85022b1d126266e50d6d03abdb931e9bf881fa21ad" Mar 20 15:27:25 crc kubenswrapper[4779]: E0320 15:27:25.688354 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a166d6261ca4c155997a2c85022b1d126266e50d6d03abdb931e9bf881fa21ad\": container with ID starting with a166d6261ca4c155997a2c85022b1d126266e50d6d03abdb931e9bf881fa21ad not found: ID does not exist" containerID="a166d6261ca4c155997a2c85022b1d126266e50d6d03abdb931e9bf881fa21ad" Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.688393 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a166d6261ca4c155997a2c85022b1d126266e50d6d03abdb931e9bf881fa21ad"} err="failed to get container status \"a166d6261ca4c155997a2c85022b1d126266e50d6d03abdb931e9bf881fa21ad\": rpc error: code = NotFound desc = could not find container \"a166d6261ca4c155997a2c85022b1d126266e50d6d03abdb931e9bf881fa21ad\": container with ID starting with a166d6261ca4c155997a2c85022b1d126266e50d6d03abdb931e9bf881fa21ad not found: ID does not exist" Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.688451 4779 scope.go:117] "RemoveContainer" containerID="0a8447477f86b9ea3d0c5c84cdb16ce263155b4654189e68e6dddcc40e3342b7" Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.692772 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-57876b4876-frtn8"] Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.701919 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-57876b4876-frtn8"] Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.706831 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-76c5fd99f6-2l7j6"] Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.710918 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76c5fd99f6-2l7j6"] Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.711246 4779 scope.go:117] "RemoveContainer" containerID="0a8447477f86b9ea3d0c5c84cdb16ce263155b4654189e68e6dddcc40e3342b7" Mar 20 15:27:25 crc kubenswrapper[4779]: E0320 15:27:25.711704 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a8447477f86b9ea3d0c5c84cdb16ce263155b4654189e68e6dddcc40e3342b7\": container with ID starting with 0a8447477f86b9ea3d0c5c84cdb16ce263155b4654189e68e6dddcc40e3342b7 not found: ID does not exist" containerID="0a8447477f86b9ea3d0c5c84cdb16ce263155b4654189e68e6dddcc40e3342b7" Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.711733 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a8447477f86b9ea3d0c5c84cdb16ce263155b4654189e68e6dddcc40e3342b7"} err="failed to get container status \"0a8447477f86b9ea3d0c5c84cdb16ce263155b4654189e68e6dddcc40e3342b7\": rpc error: code = NotFound desc = could not find container \"0a8447477f86b9ea3d0c5c84cdb16ce263155b4654189e68e6dddcc40e3342b7\": container with ID starting with 0a8447477f86b9ea3d0c5c84cdb16ce263155b4654189e68e6dddcc40e3342b7 not found: ID does not exist" Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.742018 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9t4hk\" (UniqueName: \"kubernetes.io/projected/d4dca838-94ad-4765-8320-92fbd318a6e9-kube-api-access-9t4hk\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.742069 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d4dca838-94ad-4765-8320-92fbd318a6e9-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.742082 4779 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d4dca838-94ad-4765-8320-92fbd318a6e9-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.742093 4779 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4dca838-94ad-4765-8320-92fbd318a6e9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.742180 4779 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4dca838-94ad-4765-8320-92fbd318a6e9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.817400 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c24412eb-d7b7-44a5-8ffb-2933842ffcfb" path="/var/lib/kubelet/pods/c24412eb-d7b7-44a5-8ffb-2933842ffcfb/volumes" Mar 20 15:27:25 crc kubenswrapper[4779]: I0320 15:27:25.818130 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4dca838-94ad-4765-8320-92fbd318a6e9" path="/var/lib/kubelet/pods/d4dca838-94ad-4765-8320-92fbd318a6e9/volumes" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.417882 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5bcd9c8bf6-rtbnd"] Mar 20 15:27:26 crc kubenswrapper[4779]: E0320 15:27:26.418178 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24412eb-d7b7-44a5-8ffb-2933842ffcfb" containerName="route-controller-manager" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.418192 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24412eb-d7b7-44a5-8ffb-2933842ffcfb" containerName="route-controller-manager" 
Mar 20 15:27:26 crc kubenswrapper[4779]: E0320 15:27:26.418210 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a117f89d-88f3-476f-b211-5a03754ddfc2" containerName="pruner" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.418216 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="a117f89d-88f3-476f-b211-5a03754ddfc2" containerName="pruner" Mar 20 15:27:26 crc kubenswrapper[4779]: E0320 15:27:26.418229 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4dca838-94ad-4765-8320-92fbd318a6e9" containerName="controller-manager" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.418236 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4dca838-94ad-4765-8320-92fbd318a6e9" containerName="controller-manager" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.418323 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4dca838-94ad-4765-8320-92fbd318a6e9" containerName="controller-manager" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.418331 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="a117f89d-88f3-476f-b211-5a03754ddfc2" containerName="pruner" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.418341 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="c24412eb-d7b7-44a5-8ffb-2933842ffcfb" containerName="route-controller-manager" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.420071 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5bcd9c8bf6-rtbnd" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.422885 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.422930 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.423100 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.423252 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.423331 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.423375 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.423889 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59fd4869dd-fnj5s"] Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.424779 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59fd4869dd-fnj5s" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.426837 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5bcd9c8bf6-rtbnd"] Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.427473 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.429241 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.430858 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.431033 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.431189 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.431347 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.432456 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.433252 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59fd4869dd-fnj5s"] Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.450016 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/981a3e67-301c-4415-ac4d-195543bf3d16-serving-cert\") pod \"controller-manager-5bcd9c8bf6-rtbnd\" (UID: \"981a3e67-301c-4415-ac4d-195543bf3d16\") " pod="openshift-controller-manager/controller-manager-5bcd9c8bf6-rtbnd" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.450058 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/981a3e67-301c-4415-ac4d-195543bf3d16-client-ca\") pod \"controller-manager-5bcd9c8bf6-rtbnd\" (UID: \"981a3e67-301c-4415-ac4d-195543bf3d16\") " pod="openshift-controller-manager/controller-manager-5bcd9c8bf6-rtbnd" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.450094 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfdbs\" (UniqueName: \"kubernetes.io/projected/981a3e67-301c-4415-ac4d-195543bf3d16-kube-api-access-gfdbs\") pod \"controller-manager-5bcd9c8bf6-rtbnd\" (UID: \"981a3e67-301c-4415-ac4d-195543bf3d16\") " pod="openshift-controller-manager/controller-manager-5bcd9c8bf6-rtbnd" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.450130 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/edb061cd-aa53-4116-8d58-8dcf73d58d05-client-ca\") pod \"route-controller-manager-59fd4869dd-fnj5s\" (UID: \"edb061cd-aa53-4116-8d58-8dcf73d58d05\") " pod="openshift-route-controller-manager/route-controller-manager-59fd4869dd-fnj5s" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.450149 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edb061cd-aa53-4116-8d58-8dcf73d58d05-config\") pod \"route-controller-manager-59fd4869dd-fnj5s\" (UID: \"edb061cd-aa53-4116-8d58-8dcf73d58d05\") " 
pod="openshift-route-controller-manager/route-controller-manager-59fd4869dd-fnj5s" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.450189 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/981a3e67-301c-4415-ac4d-195543bf3d16-proxy-ca-bundles\") pod \"controller-manager-5bcd9c8bf6-rtbnd\" (UID: \"981a3e67-301c-4415-ac4d-195543bf3d16\") " pod="openshift-controller-manager/controller-manager-5bcd9c8bf6-rtbnd" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.450226 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs9xw\" (UniqueName: \"kubernetes.io/projected/edb061cd-aa53-4116-8d58-8dcf73d58d05-kube-api-access-qs9xw\") pod \"route-controller-manager-59fd4869dd-fnj5s\" (UID: \"edb061cd-aa53-4116-8d58-8dcf73d58d05\") " pod="openshift-route-controller-manager/route-controller-manager-59fd4869dd-fnj5s" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.450246 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edb061cd-aa53-4116-8d58-8dcf73d58d05-serving-cert\") pod \"route-controller-manager-59fd4869dd-fnj5s\" (UID: \"edb061cd-aa53-4116-8d58-8dcf73d58d05\") " pod="openshift-route-controller-manager/route-controller-manager-59fd4869dd-fnj5s" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.450283 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/981a3e67-301c-4415-ac4d-195543bf3d16-config\") pod \"controller-manager-5bcd9c8bf6-rtbnd\" (UID: \"981a3e67-301c-4415-ac4d-195543bf3d16\") " pod="openshift-controller-manager/controller-manager-5bcd9c8bf6-rtbnd" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.551557 4779 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qs9xw\" (UniqueName: \"kubernetes.io/projected/edb061cd-aa53-4116-8d58-8dcf73d58d05-kube-api-access-qs9xw\") pod \"route-controller-manager-59fd4869dd-fnj5s\" (UID: \"edb061cd-aa53-4116-8d58-8dcf73d58d05\") " pod="openshift-route-controller-manager/route-controller-manager-59fd4869dd-fnj5s" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.551610 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edb061cd-aa53-4116-8d58-8dcf73d58d05-serving-cert\") pod \"route-controller-manager-59fd4869dd-fnj5s\" (UID: \"edb061cd-aa53-4116-8d58-8dcf73d58d05\") " pod="openshift-route-controller-manager/route-controller-manager-59fd4869dd-fnj5s" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.551667 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/981a3e67-301c-4415-ac4d-195543bf3d16-config\") pod \"controller-manager-5bcd9c8bf6-rtbnd\" (UID: \"981a3e67-301c-4415-ac4d-195543bf3d16\") " pod="openshift-controller-manager/controller-manager-5bcd9c8bf6-rtbnd" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.551691 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/981a3e67-301c-4415-ac4d-195543bf3d16-serving-cert\") pod \"controller-manager-5bcd9c8bf6-rtbnd\" (UID: \"981a3e67-301c-4415-ac4d-195543bf3d16\") " pod="openshift-controller-manager/controller-manager-5bcd9c8bf6-rtbnd" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.551717 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/981a3e67-301c-4415-ac4d-195543bf3d16-client-ca\") pod \"controller-manager-5bcd9c8bf6-rtbnd\" (UID: \"981a3e67-301c-4415-ac4d-195543bf3d16\") " 
pod="openshift-controller-manager/controller-manager-5bcd9c8bf6-rtbnd" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.551751 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfdbs\" (UniqueName: \"kubernetes.io/projected/981a3e67-301c-4415-ac4d-195543bf3d16-kube-api-access-gfdbs\") pod \"controller-manager-5bcd9c8bf6-rtbnd\" (UID: \"981a3e67-301c-4415-ac4d-195543bf3d16\") " pod="openshift-controller-manager/controller-manager-5bcd9c8bf6-rtbnd" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.551798 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/edb061cd-aa53-4116-8d58-8dcf73d58d05-client-ca\") pod \"route-controller-manager-59fd4869dd-fnj5s\" (UID: \"edb061cd-aa53-4116-8d58-8dcf73d58d05\") " pod="openshift-route-controller-manager/route-controller-manager-59fd4869dd-fnj5s" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.551817 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edb061cd-aa53-4116-8d58-8dcf73d58d05-config\") pod \"route-controller-manager-59fd4869dd-fnj5s\" (UID: \"edb061cd-aa53-4116-8d58-8dcf73d58d05\") " pod="openshift-route-controller-manager/route-controller-manager-59fd4869dd-fnj5s" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.551856 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/981a3e67-301c-4415-ac4d-195543bf3d16-proxy-ca-bundles\") pod \"controller-manager-5bcd9c8bf6-rtbnd\" (UID: \"981a3e67-301c-4415-ac4d-195543bf3d16\") " pod="openshift-controller-manager/controller-manager-5bcd9c8bf6-rtbnd" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.552792 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/981a3e67-301c-4415-ac4d-195543bf3d16-client-ca\") pod \"controller-manager-5bcd9c8bf6-rtbnd\" (UID: \"981a3e67-301c-4415-ac4d-195543bf3d16\") " pod="openshift-controller-manager/controller-manager-5bcd9c8bf6-rtbnd" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.552806 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/981a3e67-301c-4415-ac4d-195543bf3d16-proxy-ca-bundles\") pod \"controller-manager-5bcd9c8bf6-rtbnd\" (UID: \"981a3e67-301c-4415-ac4d-195543bf3d16\") " pod="openshift-controller-manager/controller-manager-5bcd9c8bf6-rtbnd" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.552956 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/981a3e67-301c-4415-ac4d-195543bf3d16-config\") pod \"controller-manager-5bcd9c8bf6-rtbnd\" (UID: \"981a3e67-301c-4415-ac4d-195543bf3d16\") " pod="openshift-controller-manager/controller-manager-5bcd9c8bf6-rtbnd" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.557613 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edb061cd-aa53-4116-8d58-8dcf73d58d05-config\") pod \"route-controller-manager-59fd4869dd-fnj5s\" (UID: \"edb061cd-aa53-4116-8d58-8dcf73d58d05\") " pod="openshift-route-controller-manager/route-controller-manager-59fd4869dd-fnj5s" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.557920 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/edb061cd-aa53-4116-8d58-8dcf73d58d05-client-ca\") pod \"route-controller-manager-59fd4869dd-fnj5s\" (UID: \"edb061cd-aa53-4116-8d58-8dcf73d58d05\") " pod="openshift-route-controller-manager/route-controller-manager-59fd4869dd-fnj5s" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.563973 4779 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edb061cd-aa53-4116-8d58-8dcf73d58d05-serving-cert\") pod \"route-controller-manager-59fd4869dd-fnj5s\" (UID: \"edb061cd-aa53-4116-8d58-8dcf73d58d05\") " pod="openshift-route-controller-manager/route-controller-manager-59fd4869dd-fnj5s" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.569911 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/981a3e67-301c-4415-ac4d-195543bf3d16-serving-cert\") pod \"controller-manager-5bcd9c8bf6-rtbnd\" (UID: \"981a3e67-301c-4415-ac4d-195543bf3d16\") " pod="openshift-controller-manager/controller-manager-5bcd9c8bf6-rtbnd" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.569962 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfdbs\" (UniqueName: \"kubernetes.io/projected/981a3e67-301c-4415-ac4d-195543bf3d16-kube-api-access-gfdbs\") pod \"controller-manager-5bcd9c8bf6-rtbnd\" (UID: \"981a3e67-301c-4415-ac4d-195543bf3d16\") " pod="openshift-controller-manager/controller-manager-5bcd9c8bf6-rtbnd" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.570003 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs9xw\" (UniqueName: \"kubernetes.io/projected/edb061cd-aa53-4116-8d58-8dcf73d58d05-kube-api-access-qs9xw\") pod \"route-controller-manager-59fd4869dd-fnj5s\" (UID: \"edb061cd-aa53-4116-8d58-8dcf73d58d05\") " pod="openshift-route-controller-manager/route-controller-manager-59fd4869dd-fnj5s" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.743139 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5bcd9c8bf6-rtbnd" Mar 20 15:27:26 crc kubenswrapper[4779]: I0320 15:27:26.749931 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59fd4869dd-fnj5s" Mar 20 15:27:27 crc kubenswrapper[4779]: I0320 15:27:27.774734 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-h48bp"] Mar 20 15:27:28 crc kubenswrapper[4779]: I0320 15:27:28.351675 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 15:27:28 crc kubenswrapper[4779]: I0320 15:27:28.353086 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 15:27:28 crc kubenswrapper[4779]: I0320 15:27:28.354871 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 15:27:28 crc kubenswrapper[4779]: I0320 15:27:28.356030 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 15:27:28 crc kubenswrapper[4779]: I0320 15:27:28.364311 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 15:27:28 crc kubenswrapper[4779]: I0320 15:27:28.381644 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2dc621f5-0ad3-4351-bd37-082c3846b156-kubelet-dir\") pod \"installer-9-crc\" (UID: \"2dc621f5-0ad3-4351-bd37-082c3846b156\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 15:27:28 crc kubenswrapper[4779]: I0320 15:27:28.381681 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2dc621f5-0ad3-4351-bd37-082c3846b156-kube-api-access\") pod \"installer-9-crc\" (UID: \"2dc621f5-0ad3-4351-bd37-082c3846b156\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 15:27:28 crc 
kubenswrapper[4779]: I0320 15:27:28.381757 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2dc621f5-0ad3-4351-bd37-082c3846b156-var-lock\") pod \"installer-9-crc\" (UID: \"2dc621f5-0ad3-4351-bd37-082c3846b156\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 15:27:28 crc kubenswrapper[4779]: I0320 15:27:28.482713 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2dc621f5-0ad3-4351-bd37-082c3846b156-var-lock\") pod \"installer-9-crc\" (UID: \"2dc621f5-0ad3-4351-bd37-082c3846b156\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 15:27:28 crc kubenswrapper[4779]: I0320 15:27:28.482810 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2dc621f5-0ad3-4351-bd37-082c3846b156-kubelet-dir\") pod \"installer-9-crc\" (UID: \"2dc621f5-0ad3-4351-bd37-082c3846b156\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 15:27:28 crc kubenswrapper[4779]: I0320 15:27:28.482839 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2dc621f5-0ad3-4351-bd37-082c3846b156-kube-api-access\") pod \"installer-9-crc\" (UID: \"2dc621f5-0ad3-4351-bd37-082c3846b156\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 15:27:28 crc kubenswrapper[4779]: I0320 15:27:28.482850 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2dc621f5-0ad3-4351-bd37-082c3846b156-var-lock\") pod \"installer-9-crc\" (UID: \"2dc621f5-0ad3-4351-bd37-082c3846b156\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 15:27:28 crc kubenswrapper[4779]: I0320 15:27:28.483004 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/2dc621f5-0ad3-4351-bd37-082c3846b156-kubelet-dir\") pod \"installer-9-crc\" (UID: \"2dc621f5-0ad3-4351-bd37-082c3846b156\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 15:27:28 crc kubenswrapper[4779]: I0320 15:27:28.501687 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2dc621f5-0ad3-4351-bd37-082c3846b156-kube-api-access\") pod \"installer-9-crc\" (UID: \"2dc621f5-0ad3-4351-bd37-082c3846b156\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 15:27:28 crc kubenswrapper[4779]: I0320 15:27:28.685215 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 15:27:28 crc kubenswrapper[4779]: I0320 15:27:28.743234 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4fphd" Mar 20 15:27:28 crc kubenswrapper[4779]: I0320 15:27:28.743295 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4fphd" Mar 20 15:27:28 crc kubenswrapper[4779]: I0320 15:27:28.892942 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4fphd" Mar 20 15:27:29 crc kubenswrapper[4779]: I0320 15:27:29.133700 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-25rn6" Mar 20 15:27:29 crc kubenswrapper[4779]: I0320 15:27:29.133752 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-25rn6" Mar 20 15:27:29 crc kubenswrapper[4779]: I0320 15:27:29.171705 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-25rn6" Mar 20 15:27:29 crc kubenswrapper[4779]: I0320 15:27:29.765060 4779 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4fphd" Mar 20 15:27:29 crc kubenswrapper[4779]: I0320 15:27:29.765390 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-25rn6" Mar 20 15:27:29 crc kubenswrapper[4779]: I0320 15:27:29.874592 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 15:27:29 crc kubenswrapper[4779]: I0320 15:27:29.924245 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5bcd9c8bf6-rtbnd"] Mar 20 15:27:29 crc kubenswrapper[4779]: W0320 15:27:29.931899 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod981a3e67_301c_4415_ac4d_195543bf3d16.slice/crio-7c1c8c3635f7cbed3eccf6d45dac8fa73f7d834b136ee5f731bfcc7996606f00 WatchSource:0}: Error finding container 7c1c8c3635f7cbed3eccf6d45dac8fa73f7d834b136ee5f731bfcc7996606f00: Status 404 returned error can't find the container with id 7c1c8c3635f7cbed3eccf6d45dac8fa73f7d834b136ee5f731bfcc7996606f00 Mar 20 15:27:29 crc kubenswrapper[4779]: I0320 15:27:29.982280 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59fd4869dd-fnj5s"] Mar 20 15:27:30 crc kubenswrapper[4779]: I0320 15:27:30.322481 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-25rn6"] Mar 20 15:27:30 crc kubenswrapper[4779]: I0320 15:27:30.705674 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59fd4869dd-fnj5s" event={"ID":"edb061cd-aa53-4116-8d58-8dcf73d58d05","Type":"ContainerStarted","Data":"cabfb6c512c51a70bbc88a477ebc46defd5a93d9642e3cb564f68d07763db302"} Mar 20 15:27:30 crc kubenswrapper[4779]: I0320 15:27:30.707549 4779 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vmgpk" event={"ID":"cef14e53-c107-4bb0-85bc-40b8189fd346","Type":"ContainerStarted","Data":"f8f46bb86240065bb1cb0d10530da1e591e113375bda52618382ceaa4b54eca2"} Mar 20 15:27:30 crc kubenswrapper[4779]: I0320 15:27:30.708748 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"2dc621f5-0ad3-4351-bd37-082c3846b156","Type":"ContainerStarted","Data":"7f6078875623f53ee11f377737a4bcf5843ca13df1ff1298e735300619999602"} Mar 20 15:27:30 crc kubenswrapper[4779]: I0320 15:27:30.710910 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tqlmj" event={"ID":"437c3121-8b7a-48b1-b805-540a41e89b6a","Type":"ContainerStarted","Data":"ce8a8cdbee3ec8bdd3afa257d2e95960b9efa892a5a3747351a1f054c8a041ca"} Mar 20 15:27:30 crc kubenswrapper[4779]: I0320 15:27:30.712671 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bcd9c8bf6-rtbnd" event={"ID":"981a3e67-301c-4415-ac4d-195543bf3d16","Type":"ContainerStarted","Data":"7c1c8c3635f7cbed3eccf6d45dac8fa73f7d834b136ee5f731bfcc7996606f00"} Mar 20 15:27:31 crc kubenswrapper[4779]: I0320 15:27:31.729936 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhgrq" event={"ID":"57ad9993-5b81-4487-8f70-37e41aca1678","Type":"ContainerStarted","Data":"e2983b5c23a5652cb569f910fdab97a10f99d11758854148926f0a2e2762781c"} Mar 20 15:27:31 crc kubenswrapper[4779]: I0320 15:27:31.732409 4779 generic.go:334] "Generic (PLEG): container finished" podID="437c3121-8b7a-48b1-b805-540a41e89b6a" containerID="ce8a8cdbee3ec8bdd3afa257d2e95960b9efa892a5a3747351a1f054c8a041ca" exitCode=0 Mar 20 15:27:31 crc kubenswrapper[4779]: I0320 15:27:31.732468 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tqlmj" 
event={"ID":"437c3121-8b7a-48b1-b805-540a41e89b6a","Type":"ContainerDied","Data":"ce8a8cdbee3ec8bdd3afa257d2e95960b9efa892a5a3747351a1f054c8a041ca"} Mar 20 15:27:31 crc kubenswrapper[4779]: I0320 15:27:31.734176 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bcd9c8bf6-rtbnd" event={"ID":"981a3e67-301c-4415-ac4d-195543bf3d16","Type":"ContainerStarted","Data":"90939c6421e2054d9fbc866ab60b309286b46fa26869a27b8dabffcead5f85ba"} Mar 20 15:27:31 crc kubenswrapper[4779]: I0320 15:27:31.734961 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5bcd9c8bf6-rtbnd" Mar 20 15:27:31 crc kubenswrapper[4779]: I0320 15:27:31.736010 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59fd4869dd-fnj5s" event={"ID":"edb061cd-aa53-4116-8d58-8dcf73d58d05","Type":"ContainerStarted","Data":"4d3b5e8f7afdd1945a2826bdc8abe6c4d7505a008bef9baa19b781936fa1353a"} Mar 20 15:27:31 crc kubenswrapper[4779]: I0320 15:27:31.737271 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-59fd4869dd-fnj5s" Mar 20 15:27:31 crc kubenswrapper[4779]: I0320 15:27:31.739566 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5bcd9c8bf6-rtbnd" Mar 20 15:27:31 crc kubenswrapper[4779]: I0320 15:27:31.740358 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khpjc" event={"ID":"6039db56-150f-48d0-84a3-511c6a67daad","Type":"ContainerStarted","Data":"d8bab074df61e899a1d8e57c27121db87db8e2c56a0c26f1118fc6fee4825f2e"} Mar 20 15:27:31 crc kubenswrapper[4779]: I0320 15:27:31.743167 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-59fd4869dd-fnj5s" Mar 20 15:27:31 crc kubenswrapper[4779]: I0320 15:27:31.744286 4779 generic.go:334] "Generic (PLEG): container finished" podID="cef14e53-c107-4bb0-85bc-40b8189fd346" containerID="f8f46bb86240065bb1cb0d10530da1e591e113375bda52618382ceaa4b54eca2" exitCode=0 Mar 20 15:27:31 crc kubenswrapper[4779]: I0320 15:27:31.744472 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vmgpk" event={"ID":"cef14e53-c107-4bb0-85bc-40b8189fd346","Type":"ContainerDied","Data":"f8f46bb86240065bb1cb0d10530da1e591e113375bda52618382ceaa4b54eca2"} Mar 20 15:27:31 crc kubenswrapper[4779]: I0320 15:27:31.747421 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-25rn6" podUID="442385a1-f11d-4412-8e24-3d6e49ec4930" containerName="registry-server" containerID="cri-o://0840e64ea2c665c7f4514b3845a031d7f85bea2303dee328ad5fd364d14029da" gracePeriod=2 Mar 20 15:27:31 crc kubenswrapper[4779]: I0320 15:27:31.747953 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"2dc621f5-0ad3-4351-bd37-082c3846b156","Type":"ContainerStarted","Data":"4a9ae52f372c15e8e6f139c1707e2f2b6452a2cbb7ae24615481947dcf88952f"} Mar 20 15:27:31 crc kubenswrapper[4779]: I0320 15:27:31.774358 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.774339528 podStartE2EDuration="3.774339528s" podCreationTimestamp="2026-03-20 15:27:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:27:31.767675379 +0000 UTC m=+268.730191179" watchObservedRunningTime="2026-03-20 15:27:31.774339528 +0000 UTC m=+268.736855328" Mar 20 15:27:31 crc kubenswrapper[4779]: I0320 15:27:31.828658 4779 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-59fd4869dd-fnj5s" podStartSLOduration=6.828635079 podStartE2EDuration="6.828635079s" podCreationTimestamp="2026-03-20 15:27:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:27:31.823874639 +0000 UTC m=+268.786390439" watchObservedRunningTime="2026-03-20 15:27:31.828635079 +0000 UTC m=+268.791150879" Mar 20 15:27:31 crc kubenswrapper[4779]: I0320 15:27:31.880684 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5bcd9c8bf6-rtbnd" podStartSLOduration=6.880650242 podStartE2EDuration="6.880650242s" podCreationTimestamp="2026-03-20 15:27:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:27:31.874461526 +0000 UTC m=+268.836977326" watchObservedRunningTime="2026-03-20 15:27:31.880650242 +0000 UTC m=+268.843166032" Mar 20 15:27:32 crc kubenswrapper[4779]: I0320 15:27:32.394814 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-25rn6" Mar 20 15:27:32 crc kubenswrapper[4779]: I0320 15:27:32.438625 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpvx2\" (UniqueName: \"kubernetes.io/projected/442385a1-f11d-4412-8e24-3d6e49ec4930-kube-api-access-xpvx2\") pod \"442385a1-f11d-4412-8e24-3d6e49ec4930\" (UID: \"442385a1-f11d-4412-8e24-3d6e49ec4930\") " Mar 20 15:27:32 crc kubenswrapper[4779]: I0320 15:27:32.438705 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/442385a1-f11d-4412-8e24-3d6e49ec4930-catalog-content\") pod \"442385a1-f11d-4412-8e24-3d6e49ec4930\" (UID: \"442385a1-f11d-4412-8e24-3d6e49ec4930\") " Mar 20 15:27:32 crc kubenswrapper[4779]: I0320 15:27:32.438728 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/442385a1-f11d-4412-8e24-3d6e49ec4930-utilities\") pod \"442385a1-f11d-4412-8e24-3d6e49ec4930\" (UID: \"442385a1-f11d-4412-8e24-3d6e49ec4930\") " Mar 20 15:27:32 crc kubenswrapper[4779]: I0320 15:27:32.439408 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/442385a1-f11d-4412-8e24-3d6e49ec4930-utilities" (OuterVolumeSpecName: "utilities") pod "442385a1-f11d-4412-8e24-3d6e49ec4930" (UID: "442385a1-f11d-4412-8e24-3d6e49ec4930"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:27:32 crc kubenswrapper[4779]: I0320 15:27:32.444677 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/442385a1-f11d-4412-8e24-3d6e49ec4930-kube-api-access-xpvx2" (OuterVolumeSpecName: "kube-api-access-xpvx2") pod "442385a1-f11d-4412-8e24-3d6e49ec4930" (UID: "442385a1-f11d-4412-8e24-3d6e49ec4930"). InnerVolumeSpecName "kube-api-access-xpvx2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:27:32 crc kubenswrapper[4779]: I0320 15:27:32.464430 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/442385a1-f11d-4412-8e24-3d6e49ec4930-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "442385a1-f11d-4412-8e24-3d6e49ec4930" (UID: "442385a1-f11d-4412-8e24-3d6e49ec4930"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:27:32 crc kubenswrapper[4779]: I0320 15:27:32.540556 4779 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/442385a1-f11d-4412-8e24-3d6e49ec4930-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:32 crc kubenswrapper[4779]: I0320 15:27:32.540589 4779 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/442385a1-f11d-4412-8e24-3d6e49ec4930-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:32 crc kubenswrapper[4779]: I0320 15:27:32.540599 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpvx2\" (UniqueName: \"kubernetes.io/projected/442385a1-f11d-4412-8e24-3d6e49ec4930-kube-api-access-xpvx2\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:32 crc kubenswrapper[4779]: I0320 15:27:32.753745 4779 generic.go:334] "Generic (PLEG): container finished" podID="6039db56-150f-48d0-84a3-511c6a67daad" containerID="d8bab074df61e899a1d8e57c27121db87db8e2c56a0c26f1118fc6fee4825f2e" exitCode=0 Mar 20 15:27:32 crc kubenswrapper[4779]: I0320 15:27:32.753825 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khpjc" event={"ID":"6039db56-150f-48d0-84a3-511c6a67daad","Type":"ContainerDied","Data":"d8bab074df61e899a1d8e57c27121db87db8e2c56a0c26f1118fc6fee4825f2e"} Mar 20 15:27:32 crc kubenswrapper[4779]: I0320 15:27:32.756260 4779 generic.go:334] "Generic (PLEG): container 
finished" podID="442385a1-f11d-4412-8e24-3d6e49ec4930" containerID="0840e64ea2c665c7f4514b3845a031d7f85bea2303dee328ad5fd364d14029da" exitCode=0 Mar 20 15:27:32 crc kubenswrapper[4779]: I0320 15:27:32.756298 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-25rn6" Mar 20 15:27:32 crc kubenswrapper[4779]: I0320 15:27:32.756335 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-25rn6" event={"ID":"442385a1-f11d-4412-8e24-3d6e49ec4930","Type":"ContainerDied","Data":"0840e64ea2c665c7f4514b3845a031d7f85bea2303dee328ad5fd364d14029da"} Mar 20 15:27:32 crc kubenswrapper[4779]: I0320 15:27:32.756366 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-25rn6" event={"ID":"442385a1-f11d-4412-8e24-3d6e49ec4930","Type":"ContainerDied","Data":"60a4ba918c31500ddc9916c058243c44c65e1386ef74121fc9f702fb8453c9d6"} Mar 20 15:27:32 crc kubenswrapper[4779]: I0320 15:27:32.756383 4779 scope.go:117] "RemoveContainer" containerID="0840e64ea2c665c7f4514b3845a031d7f85bea2303dee328ad5fd364d14029da" Mar 20 15:27:32 crc kubenswrapper[4779]: I0320 15:27:32.758381 4779 generic.go:334] "Generic (PLEG): container finished" podID="57ad9993-5b81-4487-8f70-37e41aca1678" containerID="e2983b5c23a5652cb569f910fdab97a10f99d11758854148926f0a2e2762781c" exitCode=0 Mar 20 15:27:32 crc kubenswrapper[4779]: I0320 15:27:32.758440 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhgrq" event={"ID":"57ad9993-5b81-4487-8f70-37e41aca1678","Type":"ContainerDied","Data":"e2983b5c23a5652cb569f910fdab97a10f99d11758854148926f0a2e2762781c"} Mar 20 15:27:32 crc kubenswrapper[4779]: I0320 15:27:32.761738 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tqlmj" 
event={"ID":"437c3121-8b7a-48b1-b805-540a41e89b6a","Type":"ContainerStarted","Data":"9997352d2e68b034ad49e40cb699e2b25de024715fcda1345da55e7f1264c6c4"} Mar 20 15:27:32 crc kubenswrapper[4779]: I0320 15:27:32.793338 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-25rn6"] Mar 20 15:27:32 crc kubenswrapper[4779]: I0320 15:27:32.799941 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-25rn6"] Mar 20 15:27:32 crc kubenswrapper[4779]: I0320 15:27:32.839735 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tqlmj" podStartSLOduration=31.140794611 podStartE2EDuration="43.839716018s" podCreationTimestamp="2026-03-20 15:26:49 +0000 UTC" firstStartedPulling="2026-03-20 15:27:19.597331529 +0000 UTC m=+256.559847329" lastFinishedPulling="2026-03-20 15:27:32.296252936 +0000 UTC m=+269.258768736" observedRunningTime="2026-03-20 15:27:32.837872042 +0000 UTC m=+269.800387852" watchObservedRunningTime="2026-03-20 15:27:32.839716018 +0000 UTC m=+269.802231818" Mar 20 15:27:33 crc kubenswrapper[4779]: I0320 15:27:33.063730 4779 scope.go:117] "RemoveContainer" containerID="7634ea0c9b6d7237cfcdf99e131848ac1c582b1f1cbb5486f762a5738b16d653" Mar 20 15:27:33 crc kubenswrapper[4779]: I0320 15:27:33.079510 4779 scope.go:117] "RemoveContainer" containerID="c36123db3f911a3069b049118d2a6e1cb86d6c6cad459462a5993490ad630459" Mar 20 15:27:33 crc kubenswrapper[4779]: I0320 15:27:33.110800 4779 scope.go:117] "RemoveContainer" containerID="0840e64ea2c665c7f4514b3845a031d7f85bea2303dee328ad5fd364d14029da" Mar 20 15:27:33 crc kubenswrapper[4779]: E0320 15:27:33.111218 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0840e64ea2c665c7f4514b3845a031d7f85bea2303dee328ad5fd364d14029da\": container with ID starting with 
0840e64ea2c665c7f4514b3845a031d7f85bea2303dee328ad5fd364d14029da not found: ID does not exist" containerID="0840e64ea2c665c7f4514b3845a031d7f85bea2303dee328ad5fd364d14029da" Mar 20 15:27:33 crc kubenswrapper[4779]: I0320 15:27:33.111267 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0840e64ea2c665c7f4514b3845a031d7f85bea2303dee328ad5fd364d14029da"} err="failed to get container status \"0840e64ea2c665c7f4514b3845a031d7f85bea2303dee328ad5fd364d14029da\": rpc error: code = NotFound desc = could not find container \"0840e64ea2c665c7f4514b3845a031d7f85bea2303dee328ad5fd364d14029da\": container with ID starting with 0840e64ea2c665c7f4514b3845a031d7f85bea2303dee328ad5fd364d14029da not found: ID does not exist" Mar 20 15:27:33 crc kubenswrapper[4779]: I0320 15:27:33.111296 4779 scope.go:117] "RemoveContainer" containerID="7634ea0c9b6d7237cfcdf99e131848ac1c582b1f1cbb5486f762a5738b16d653" Mar 20 15:27:33 crc kubenswrapper[4779]: E0320 15:27:33.111531 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7634ea0c9b6d7237cfcdf99e131848ac1c582b1f1cbb5486f762a5738b16d653\": container with ID starting with 7634ea0c9b6d7237cfcdf99e131848ac1c582b1f1cbb5486f762a5738b16d653 not found: ID does not exist" containerID="7634ea0c9b6d7237cfcdf99e131848ac1c582b1f1cbb5486f762a5738b16d653" Mar 20 15:27:33 crc kubenswrapper[4779]: I0320 15:27:33.111560 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7634ea0c9b6d7237cfcdf99e131848ac1c582b1f1cbb5486f762a5738b16d653"} err="failed to get container status \"7634ea0c9b6d7237cfcdf99e131848ac1c582b1f1cbb5486f762a5738b16d653\": rpc error: code = NotFound desc = could not find container \"7634ea0c9b6d7237cfcdf99e131848ac1c582b1f1cbb5486f762a5738b16d653\": container with ID starting with 7634ea0c9b6d7237cfcdf99e131848ac1c582b1f1cbb5486f762a5738b16d653 not found: ID does not 
exist" Mar 20 15:27:33 crc kubenswrapper[4779]: I0320 15:27:33.111702 4779 scope.go:117] "RemoveContainer" containerID="c36123db3f911a3069b049118d2a6e1cb86d6c6cad459462a5993490ad630459" Mar 20 15:27:33 crc kubenswrapper[4779]: E0320 15:27:33.111935 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c36123db3f911a3069b049118d2a6e1cb86d6c6cad459462a5993490ad630459\": container with ID starting with c36123db3f911a3069b049118d2a6e1cb86d6c6cad459462a5993490ad630459 not found: ID does not exist" containerID="c36123db3f911a3069b049118d2a6e1cb86d6c6cad459462a5993490ad630459" Mar 20 15:27:33 crc kubenswrapper[4779]: I0320 15:27:33.111962 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c36123db3f911a3069b049118d2a6e1cb86d6c6cad459462a5993490ad630459"} err="failed to get container status \"c36123db3f911a3069b049118d2a6e1cb86d6c6cad459462a5993490ad630459\": rpc error: code = NotFound desc = could not find container \"c36123db3f911a3069b049118d2a6e1cb86d6c6cad459462a5993490ad630459\": container with ID starting with c36123db3f911a3069b049118d2a6e1cb86d6c6cad459462a5993490ad630459 not found: ID does not exist" Mar 20 15:27:33 crc kubenswrapper[4779]: I0320 15:27:33.770033 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vmgpk" event={"ID":"cef14e53-c107-4bb0-85bc-40b8189fd346","Type":"ContainerStarted","Data":"aa857f70e1a12e37c3ca7e0b61f6c1caa689db6bc117550a8f447958039d6727"} Mar 20 15:27:33 crc kubenswrapper[4779]: I0320 15:27:33.818848 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="442385a1-f11d-4412-8e24-3d6e49ec4930" path="/var/lib/kubelet/pods/442385a1-f11d-4412-8e24-3d6e49ec4930/volumes" Mar 20 15:27:34 crc kubenswrapper[4779]: I0320 15:27:34.794662 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-vmgpk" podStartSLOduration=32.304129568 podStartE2EDuration="45.794630919s" podCreationTimestamp="2026-03-20 15:26:49 +0000 UTC" firstStartedPulling="2026-03-20 15:27:19.573297605 +0000 UTC m=+256.535813405" lastFinishedPulling="2026-03-20 15:27:33.063798956 +0000 UTC m=+270.026314756" observedRunningTime="2026-03-20 15:27:34.791553951 +0000 UTC m=+271.754069751" watchObservedRunningTime="2026-03-20 15:27:34.794630919 +0000 UTC m=+271.757146729" Mar 20 15:27:36 crc kubenswrapper[4779]: I0320 15:27:36.786545 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhgrq" event={"ID":"57ad9993-5b81-4487-8f70-37e41aca1678","Type":"ContainerStarted","Data":"bb3314fca7615109ddbc4c31d1a8def8d062194d0d7b6e9041b410613d457de8"} Mar 20 15:27:36 crc kubenswrapper[4779]: I0320 15:27:36.790974 4779 generic.go:334] "Generic (PLEG): container finished" podID="626d4bed-62b8-47e0-9599-1c6f949a3d22" containerID="16efb269896cc764d66818140019bafa54f6d683bb06c70d66dfadb752acc996" exitCode=0 Mar 20 15:27:36 crc kubenswrapper[4779]: I0320 15:27:36.791193 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9gg9" event={"ID":"626d4bed-62b8-47e0-9599-1c6f949a3d22","Type":"ContainerDied","Data":"16efb269896cc764d66818140019bafa54f6d683bb06c70d66dfadb752acc996"} Mar 20 15:27:36 crc kubenswrapper[4779]: I0320 15:27:36.805356 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mhgrq" podStartSLOduration=3.667132919 podStartE2EDuration="50.805336188s" podCreationTimestamp="2026-03-20 15:26:46 +0000 UTC" firstStartedPulling="2026-03-20 15:26:48.095440105 +0000 UTC m=+225.057955905" lastFinishedPulling="2026-03-20 15:27:35.233643374 +0000 UTC m=+272.196159174" observedRunningTime="2026-03-20 15:27:36.803204495 +0000 UTC m=+273.765720305" watchObservedRunningTime="2026-03-20 15:27:36.805336188 
+0000 UTC m=+273.767851988" Mar 20 15:27:37 crc kubenswrapper[4779]: I0320 15:27:37.797616 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khpjc" event={"ID":"6039db56-150f-48d0-84a3-511c6a67daad","Type":"ContainerStarted","Data":"4f992b1a34798d656a474db83cc8db7a9a8f47941390345ffd266d6c6483827d"} Mar 20 15:27:37 crc kubenswrapper[4779]: I0320 15:27:37.823177 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-khpjc" podStartSLOduration=2.419573465 podStartE2EDuration="51.823162368s" podCreationTimestamp="2026-03-20 15:26:46 +0000 UTC" firstStartedPulling="2026-03-20 15:26:48.140565 +0000 UTC m=+225.103080800" lastFinishedPulling="2026-03-20 15:27:37.544153903 +0000 UTC m=+274.506669703" observedRunningTime="2026-03-20 15:27:37.822751528 +0000 UTC m=+274.785267328" watchObservedRunningTime="2026-03-20 15:27:37.823162368 +0000 UTC m=+274.785678168" Mar 20 15:27:38 crc kubenswrapper[4779]: I0320 15:27:38.803663 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9gg9" event={"ID":"626d4bed-62b8-47e0-9599-1c6f949a3d22","Type":"ContainerStarted","Data":"bfeef86b49d95ca40acf81349931229a1b6e85ff8033c825370f01d408d47bad"} Mar 20 15:27:38 crc kubenswrapper[4779]: I0320 15:27:38.806276 4779 generic.go:334] "Generic (PLEG): container finished" podID="f20db228-a34a-4734-ae22-53cd86de06ed" containerID="ec5628f1fc79845ad6d4e028b0424bb230020610c56d9f1fdc132ea3f506e521" exitCode=0 Mar 20 15:27:38 crc kubenswrapper[4779]: I0320 15:27:38.806299 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qtlz2" event={"ID":"f20db228-a34a-4734-ae22-53cd86de06ed","Type":"ContainerDied","Data":"ec5628f1fc79845ad6d4e028b0424bb230020610c56d9f1fdc132ea3f506e521"} Mar 20 15:27:38 crc kubenswrapper[4779]: I0320 15:27:38.845507 4779 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/certified-operators-h9gg9" podStartSLOduration=2.994873155 podStartE2EDuration="52.845491261s" podCreationTimestamp="2026-03-20 15:26:46 +0000 UTC" firstStartedPulling="2026-03-20 15:26:48.104861603 +0000 UTC m=+225.067377403" lastFinishedPulling="2026-03-20 15:27:37.955479709 +0000 UTC m=+274.917995509" observedRunningTime="2026-03-20 15:27:38.827424606 +0000 UTC m=+275.789940406" watchObservedRunningTime="2026-03-20 15:27:38.845491261 +0000 UTC m=+275.808007061" Mar 20 15:27:39 crc kubenswrapper[4779]: I0320 15:27:39.758461 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tqlmj" Mar 20 15:27:39 crc kubenswrapper[4779]: I0320 15:27:39.758821 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tqlmj" Mar 20 15:27:39 crc kubenswrapper[4779]: I0320 15:27:39.815620 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tqlmj" Mar 20 15:27:39 crc kubenswrapper[4779]: I0320 15:27:39.815648 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qtlz2" event={"ID":"f20db228-a34a-4734-ae22-53cd86de06ed","Type":"ContainerStarted","Data":"4d54ca13eb8028b8751952af0a994524eecf91147f1b70001e1c7bc8f805a25b"} Mar 20 15:27:39 crc kubenswrapper[4779]: I0320 15:27:39.836322 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qtlz2" podStartSLOduration=2.7211119630000002 podStartE2EDuration="53.836288389s" podCreationTimestamp="2026-03-20 15:26:46 +0000 UTC" firstStartedPulling="2026-03-20 15:26:48.09761113 +0000 UTC m=+225.060126930" lastFinishedPulling="2026-03-20 15:27:39.212787556 +0000 UTC m=+276.175303356" observedRunningTime="2026-03-20 15:27:39.833797995 +0000 UTC m=+276.796313795" watchObservedRunningTime="2026-03-20 
15:27:39.836288389 +0000 UTC m=+276.798804189" Mar 20 15:27:39 crc kubenswrapper[4779]: I0320 15:27:39.874568 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tqlmj" Mar 20 15:27:40 crc kubenswrapper[4779]: I0320 15:27:40.149176 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vmgpk" Mar 20 15:27:40 crc kubenswrapper[4779]: I0320 15:27:40.149268 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vmgpk" Mar 20 15:27:41 crc kubenswrapper[4779]: I0320 15:27:41.186205 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vmgpk" podUID="cef14e53-c107-4bb0-85bc-40b8189fd346" containerName="registry-server" probeResult="failure" output=< Mar 20 15:27:41 crc kubenswrapper[4779]: timeout: failed to connect service ":50051" within 1s Mar 20 15:27:41 crc kubenswrapper[4779]: > Mar 20 15:27:44 crc kubenswrapper[4779]: I0320 15:27:44.956919 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5bcd9c8bf6-rtbnd"] Mar 20 15:27:45 crc kubenswrapper[4779]: I0320 15:27:44.957492 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5bcd9c8bf6-rtbnd" podUID="981a3e67-301c-4415-ac4d-195543bf3d16" containerName="controller-manager" containerID="cri-o://90939c6421e2054d9fbc866ab60b309286b46fa26869a27b8dabffcead5f85ba" gracePeriod=30 Mar 20 15:27:45 crc kubenswrapper[4779]: I0320 15:27:44.969987 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59fd4869dd-fnj5s"] Mar 20 15:27:45 crc kubenswrapper[4779]: I0320 15:27:44.970295 4779 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-59fd4869dd-fnj5s" podUID="edb061cd-aa53-4116-8d58-8dcf73d58d05" containerName="route-controller-manager" containerID="cri-o://4d3b5e8f7afdd1945a2826bdc8abe6c4d7505a008bef9baa19b781936fa1353a" gracePeriod=30 Mar 20 15:27:45 crc kubenswrapper[4779]: I0320 15:27:45.856152 4779 generic.go:334] "Generic (PLEG): container finished" podID="981a3e67-301c-4415-ac4d-195543bf3d16" containerID="90939c6421e2054d9fbc866ab60b309286b46fa26869a27b8dabffcead5f85ba" exitCode=0 Mar 20 15:27:45 crc kubenswrapper[4779]: I0320 15:27:45.856449 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bcd9c8bf6-rtbnd" event={"ID":"981a3e67-301c-4415-ac4d-195543bf3d16","Type":"ContainerDied","Data":"90939c6421e2054d9fbc866ab60b309286b46fa26869a27b8dabffcead5f85ba"} Mar 20 15:27:45 crc kubenswrapper[4779]: I0320 15:27:45.857726 4779 generic.go:334] "Generic (PLEG): container finished" podID="edb061cd-aa53-4116-8d58-8dcf73d58d05" containerID="4d3b5e8f7afdd1945a2826bdc8abe6c4d7505a008bef9baa19b781936fa1353a" exitCode=0 Mar 20 15:27:45 crc kubenswrapper[4779]: I0320 15:27:45.857761 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59fd4869dd-fnj5s" event={"ID":"edb061cd-aa53-4116-8d58-8dcf73d58d05","Type":"ContainerDied","Data":"4d3b5e8f7afdd1945a2826bdc8abe6c4d7505a008bef9baa19b781936fa1353a"} Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.586305 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qtlz2" Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.586620 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qtlz2" Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.650513 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-qtlz2" Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.724809 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mhgrq" Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.725252 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mhgrq" Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.769381 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mhgrq" Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.800043 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59fd4869dd-fnj5s" Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.832951 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edb061cd-aa53-4116-8d58-8dcf73d58d05-config\") pod \"edb061cd-aa53-4116-8d58-8dcf73d58d05\" (UID: \"edb061cd-aa53-4116-8d58-8dcf73d58d05\") " Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.833022 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs9xw\" (UniqueName: \"kubernetes.io/projected/edb061cd-aa53-4116-8d58-8dcf73d58d05-kube-api-access-qs9xw\") pod \"edb061cd-aa53-4116-8d58-8dcf73d58d05\" (UID: \"edb061cd-aa53-4116-8d58-8dcf73d58d05\") " Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.833749 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edb061cd-aa53-4116-8d58-8dcf73d58d05-serving-cert\") pod \"edb061cd-aa53-4116-8d58-8dcf73d58d05\" (UID: \"edb061cd-aa53-4116-8d58-8dcf73d58d05\") " Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.833835 4779 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/edb061cd-aa53-4116-8d58-8dcf73d58d05-client-ca\") pod \"edb061cd-aa53-4116-8d58-8dcf73d58d05\" (UID: \"edb061cd-aa53-4116-8d58-8dcf73d58d05\") " Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.836653 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edb061cd-aa53-4116-8d58-8dcf73d58d05-config" (OuterVolumeSpecName: "config") pod "edb061cd-aa53-4116-8d58-8dcf73d58d05" (UID: "edb061cd-aa53-4116-8d58-8dcf73d58d05"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.837700 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edb061cd-aa53-4116-8d58-8dcf73d58d05-client-ca" (OuterVolumeSpecName: "client-ca") pod "edb061cd-aa53-4116-8d58-8dcf73d58d05" (UID: "edb061cd-aa53-4116-8d58-8dcf73d58d05"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.850024 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77cff4bd67-8jcsj"] Mar 20 15:27:46 crc kubenswrapper[4779]: E0320 15:27:46.850317 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edb061cd-aa53-4116-8d58-8dcf73d58d05" containerName="route-controller-manager" Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.850331 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="edb061cd-aa53-4116-8d58-8dcf73d58d05" containerName="route-controller-manager" Mar 20 15:27:46 crc kubenswrapper[4779]: E0320 15:27:46.850351 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="442385a1-f11d-4412-8e24-3d6e49ec4930" containerName="extract-utilities" Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.850358 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="442385a1-f11d-4412-8e24-3d6e49ec4930" containerName="extract-utilities" Mar 20 15:27:46 crc kubenswrapper[4779]: E0320 15:27:46.850364 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="442385a1-f11d-4412-8e24-3d6e49ec4930" containerName="registry-server" Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.850371 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="442385a1-f11d-4412-8e24-3d6e49ec4930" containerName="registry-server" Mar 20 15:27:46 crc kubenswrapper[4779]: E0320 15:27:46.850383 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="442385a1-f11d-4412-8e24-3d6e49ec4930" containerName="extract-content" Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.850394 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="442385a1-f11d-4412-8e24-3d6e49ec4930" containerName="extract-content" Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.850518 4779 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="edb061cd-aa53-4116-8d58-8dcf73d58d05" containerName="route-controller-manager" Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.850530 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="442385a1-f11d-4412-8e24-3d6e49ec4930" containerName="registry-server" Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.851661 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77cff4bd67-8jcsj" Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.853190 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77cff4bd67-8jcsj"] Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.862023 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edb061cd-aa53-4116-8d58-8dcf73d58d05-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "edb061cd-aa53-4116-8d58-8dcf73d58d05" (UID: "edb061cd-aa53-4116-8d58-8dcf73d58d05"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.862207 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edb061cd-aa53-4116-8d58-8dcf73d58d05-kube-api-access-qs9xw" (OuterVolumeSpecName: "kube-api-access-qs9xw") pod "edb061cd-aa53-4116-8d58-8dcf73d58d05" (UID: "edb061cd-aa53-4116-8d58-8dcf73d58d05"). InnerVolumeSpecName "kube-api-access-qs9xw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.870712 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59fd4869dd-fnj5s" event={"ID":"edb061cd-aa53-4116-8d58-8dcf73d58d05","Type":"ContainerDied","Data":"cabfb6c512c51a70bbc88a477ebc46defd5a93d9642e3cb564f68d07763db302"} Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.870785 4779 scope.go:117] "RemoveContainer" containerID="4d3b5e8f7afdd1945a2826bdc8abe6c4d7505a008bef9baa19b781936fa1353a" Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.870897 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59fd4869dd-fnj5s" Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.909822 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59fd4869dd-fnj5s"] Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.914503 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59fd4869dd-fnj5s"] Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.919674 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qtlz2" Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.923149 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mhgrq" Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.933324 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-khpjc" Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.933372 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-khpjc" Mar 20 15:27:46 crc 
kubenswrapper[4779]: I0320 15:27:46.935893 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e74ef8b-bcb6-4a39-a2b0-892343298556-client-ca\") pod \"route-controller-manager-77cff4bd67-8jcsj\" (UID: \"8e74ef8b-bcb6-4a39-a2b0-892343298556\") " pod="openshift-route-controller-manager/route-controller-manager-77cff4bd67-8jcsj" Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.936140 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e74ef8b-bcb6-4a39-a2b0-892343298556-serving-cert\") pod \"route-controller-manager-77cff4bd67-8jcsj\" (UID: \"8e74ef8b-bcb6-4a39-a2b0-892343298556\") " pod="openshift-route-controller-manager/route-controller-manager-77cff4bd67-8jcsj" Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.936201 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99xn6\" (UniqueName: \"kubernetes.io/projected/8e74ef8b-bcb6-4a39-a2b0-892343298556-kube-api-access-99xn6\") pod \"route-controller-manager-77cff4bd67-8jcsj\" (UID: \"8e74ef8b-bcb6-4a39-a2b0-892343298556\") " pod="openshift-route-controller-manager/route-controller-manager-77cff4bd67-8jcsj" Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.936224 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e74ef8b-bcb6-4a39-a2b0-892343298556-config\") pod \"route-controller-manager-77cff4bd67-8jcsj\" (UID: \"8e74ef8b-bcb6-4a39-a2b0-892343298556\") " pod="openshift-route-controller-manager/route-controller-manager-77cff4bd67-8jcsj" Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.936294 4779 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/edb061cd-aa53-4116-8d58-8dcf73d58d05-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.936305 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edb061cd-aa53-4116-8d58-8dcf73d58d05-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.936315 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs9xw\" (UniqueName: \"kubernetes.io/projected/edb061cd-aa53-4116-8d58-8dcf73d58d05-kube-api-access-qs9xw\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.936323 4779 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edb061cd-aa53-4116-8d58-8dcf73d58d05-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:46 crc kubenswrapper[4779]: I0320 15:27:46.975574 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5bcd9c8bf6-rtbnd" Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.003674 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-khpjc" Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.037044 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/981a3e67-301c-4415-ac4d-195543bf3d16-proxy-ca-bundles\") pod \"981a3e67-301c-4415-ac4d-195543bf3d16\" (UID: \"981a3e67-301c-4415-ac4d-195543bf3d16\") " Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.037138 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/981a3e67-301c-4415-ac4d-195543bf3d16-serving-cert\") pod \"981a3e67-301c-4415-ac4d-195543bf3d16\" (UID: \"981a3e67-301c-4415-ac4d-195543bf3d16\") " Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.037217 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfdbs\" (UniqueName: \"kubernetes.io/projected/981a3e67-301c-4415-ac4d-195543bf3d16-kube-api-access-gfdbs\") pod \"981a3e67-301c-4415-ac4d-195543bf3d16\" (UID: \"981a3e67-301c-4415-ac4d-195543bf3d16\") " Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.037249 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/981a3e67-301c-4415-ac4d-195543bf3d16-config\") pod \"981a3e67-301c-4415-ac4d-195543bf3d16\" (UID: \"981a3e67-301c-4415-ac4d-195543bf3d16\") " Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.037277 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/981a3e67-301c-4415-ac4d-195543bf3d16-client-ca\") pod \"981a3e67-301c-4415-ac4d-195543bf3d16\" (UID: 
\"981a3e67-301c-4415-ac4d-195543bf3d16\") " Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.037506 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e74ef8b-bcb6-4a39-a2b0-892343298556-client-ca\") pod \"route-controller-manager-77cff4bd67-8jcsj\" (UID: \"8e74ef8b-bcb6-4a39-a2b0-892343298556\") " pod="openshift-route-controller-manager/route-controller-manager-77cff4bd67-8jcsj" Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.037630 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e74ef8b-bcb6-4a39-a2b0-892343298556-serving-cert\") pod \"route-controller-manager-77cff4bd67-8jcsj\" (UID: \"8e74ef8b-bcb6-4a39-a2b0-892343298556\") " pod="openshift-route-controller-manager/route-controller-manager-77cff4bd67-8jcsj" Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.037669 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99xn6\" (UniqueName: \"kubernetes.io/projected/8e74ef8b-bcb6-4a39-a2b0-892343298556-kube-api-access-99xn6\") pod \"route-controller-manager-77cff4bd67-8jcsj\" (UID: \"8e74ef8b-bcb6-4a39-a2b0-892343298556\") " pod="openshift-route-controller-manager/route-controller-manager-77cff4bd67-8jcsj" Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.037692 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e74ef8b-bcb6-4a39-a2b0-892343298556-config\") pod \"route-controller-manager-77cff4bd67-8jcsj\" (UID: \"8e74ef8b-bcb6-4a39-a2b0-892343298556\") " pod="openshift-route-controller-manager/route-controller-manager-77cff4bd67-8jcsj" Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.037890 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/981a3e67-301c-4415-ac4d-195543bf3d16-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "981a3e67-301c-4415-ac4d-195543bf3d16" (UID: "981a3e67-301c-4415-ac4d-195543bf3d16"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.038209 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/981a3e67-301c-4415-ac4d-195543bf3d16-client-ca" (OuterVolumeSpecName: "client-ca") pod "981a3e67-301c-4415-ac4d-195543bf3d16" (UID: "981a3e67-301c-4415-ac4d-195543bf3d16"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.038528 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/981a3e67-301c-4415-ac4d-195543bf3d16-config" (OuterVolumeSpecName: "config") pod "981a3e67-301c-4415-ac4d-195543bf3d16" (UID: "981a3e67-301c-4415-ac4d-195543bf3d16"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.039423 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e74ef8b-bcb6-4a39-a2b0-892343298556-config\") pod \"route-controller-manager-77cff4bd67-8jcsj\" (UID: \"8e74ef8b-bcb6-4a39-a2b0-892343298556\") " pod="openshift-route-controller-manager/route-controller-manager-77cff4bd67-8jcsj" Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.039596 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e74ef8b-bcb6-4a39-a2b0-892343298556-client-ca\") pod \"route-controller-manager-77cff4bd67-8jcsj\" (UID: \"8e74ef8b-bcb6-4a39-a2b0-892343298556\") " pod="openshift-route-controller-manager/route-controller-manager-77cff4bd67-8jcsj" Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.040949 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/981a3e67-301c-4415-ac4d-195543bf3d16-kube-api-access-gfdbs" (OuterVolumeSpecName: "kube-api-access-gfdbs") pod "981a3e67-301c-4415-ac4d-195543bf3d16" (UID: "981a3e67-301c-4415-ac4d-195543bf3d16"). InnerVolumeSpecName "kube-api-access-gfdbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.041318 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/981a3e67-301c-4415-ac4d-195543bf3d16-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "981a3e67-301c-4415-ac4d-195543bf3d16" (UID: "981a3e67-301c-4415-ac4d-195543bf3d16"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.042339 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e74ef8b-bcb6-4a39-a2b0-892343298556-serving-cert\") pod \"route-controller-manager-77cff4bd67-8jcsj\" (UID: \"8e74ef8b-bcb6-4a39-a2b0-892343298556\") " pod="openshift-route-controller-manager/route-controller-manager-77cff4bd67-8jcsj" Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.052049 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99xn6\" (UniqueName: \"kubernetes.io/projected/8e74ef8b-bcb6-4a39-a2b0-892343298556-kube-api-access-99xn6\") pod \"route-controller-manager-77cff4bd67-8jcsj\" (UID: \"8e74ef8b-bcb6-4a39-a2b0-892343298556\") " pod="openshift-route-controller-manager/route-controller-manager-77cff4bd67-8jcsj" Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.126289 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-h9gg9" Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.126563 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-h9gg9" Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.138720 4779 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/981a3e67-301c-4415-ac4d-195543bf3d16-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.138862 4779 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/981a3e67-301c-4415-ac4d-195543bf3d16-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.138924 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfdbs\" (UniqueName: 
\"kubernetes.io/projected/981a3e67-301c-4415-ac4d-195543bf3d16-kube-api-access-gfdbs\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.138987 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/981a3e67-301c-4415-ac4d-195543bf3d16-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.139055 4779 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/981a3e67-301c-4415-ac4d-195543bf3d16-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.168758 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h9gg9" Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.186103 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77cff4bd67-8jcsj" Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.556790 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77cff4bd67-8jcsj"] Mar 20 15:27:47 crc kubenswrapper[4779]: W0320 15:27:47.559330 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e74ef8b_bcb6_4a39_a2b0_892343298556.slice/crio-5de73962b14cd5e1ab06509d4bf686d033dd3b86cdfa73d7436a73f267cd9a17 WatchSource:0}: Error finding container 5de73962b14cd5e1ab06509d4bf686d033dd3b86cdfa73d7436a73f267cd9a17: Status 404 returned error can't find the container with id 5de73962b14cd5e1ab06509d4bf686d033dd3b86cdfa73d7436a73f267cd9a17 Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.744597 4779 patch_prober.go:28] interesting pod/controller-manager-5bcd9c8bf6-rtbnd container/controller-manager namespace/openshift-controller-manager: 
Readiness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.744979 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5bcd9c8bf6-rtbnd" podUID="981a3e67-301c-4415-ac4d-195543bf3d16" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.751558 4779 patch_prober.go:28] interesting pod/route-controller-manager-59fd4869dd-fnj5s container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.62:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.751705 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-59fd4869dd-fnj5s" podUID="edb061cd-aa53-4116-8d58-8dcf73d58d05" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.62:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.816066 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edb061cd-aa53-4116-8d58-8dcf73d58d05" path="/var/lib/kubelet/pods/edb061cd-aa53-4116-8d58-8dcf73d58d05/volumes" Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.877327 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77cff4bd67-8jcsj" 
event={"ID":"8e74ef8b-bcb6-4a39-a2b0-892343298556","Type":"ContainerStarted","Data":"5de73962b14cd5e1ab06509d4bf686d033dd3b86cdfa73d7436a73f267cd9a17"} Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.878701 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bcd9c8bf6-rtbnd" event={"ID":"981a3e67-301c-4415-ac4d-195543bf3d16","Type":"ContainerDied","Data":"7c1c8c3635f7cbed3eccf6d45dac8fa73f7d834b136ee5f731bfcc7996606f00"} Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.879068 4779 scope.go:117] "RemoveContainer" containerID="90939c6421e2054d9fbc866ab60b309286b46fa26869a27b8dabffcead5f85ba" Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.879219 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5bcd9c8bf6-rtbnd" Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.899498 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5bcd9c8bf6-rtbnd"] Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.902864 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5bcd9c8bf6-rtbnd"] Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.931009 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-h9gg9" Mar 20 15:27:47 crc kubenswrapper[4779]: I0320 15:27:47.936449 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-khpjc" Mar 20 15:27:48 crc kubenswrapper[4779]: I0320 15:27:48.891136 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77cff4bd67-8jcsj" event={"ID":"8e74ef8b-bcb6-4a39-a2b0-892343298556","Type":"ContainerStarted","Data":"eea69623e2655271ba7a26e62ab6fb1925f4fdfccdacb662f3b806b07044c64f"} 
Mar 20 15:27:48 crc kubenswrapper[4779]: I0320 15:27:48.909617 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-77cff4bd67-8jcsj" podStartSLOduration=4.909594564 podStartE2EDuration="4.909594564s" podCreationTimestamp="2026-03-20 15:27:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:27:48.908217079 +0000 UTC m=+285.870732889" watchObservedRunningTime="2026-03-20 15:27:48.909594564 +0000 UTC m=+285.872110364" Mar 20 15:27:49 crc kubenswrapper[4779]: I0320 15:27:49.434882 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-59f47cbcfc-wr8cq"] Mar 20 15:27:49 crc kubenswrapper[4779]: E0320 15:27:49.435196 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="981a3e67-301c-4415-ac4d-195543bf3d16" containerName="controller-manager" Mar 20 15:27:49 crc kubenswrapper[4779]: I0320 15:27:49.435213 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="981a3e67-301c-4415-ac4d-195543bf3d16" containerName="controller-manager" Mar 20 15:27:49 crc kubenswrapper[4779]: I0320 15:27:49.435518 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="981a3e67-301c-4415-ac4d-195543bf3d16" containerName="controller-manager" Mar 20 15:27:49 crc kubenswrapper[4779]: I0320 15:27:49.435984 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-59f47cbcfc-wr8cq" Mar 20 15:27:49 crc kubenswrapper[4779]: I0320 15:27:49.439839 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 15:27:49 crc kubenswrapper[4779]: I0320 15:27:49.440223 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 15:27:49 crc kubenswrapper[4779]: I0320 15:27:49.440936 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 15:27:49 crc kubenswrapper[4779]: I0320 15:27:49.444194 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 15:27:49 crc kubenswrapper[4779]: I0320 15:27:49.444298 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 15:27:49 crc kubenswrapper[4779]: I0320 15:27:49.444328 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 15:27:49 crc kubenswrapper[4779]: I0320 15:27:49.446556 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-59f47cbcfc-wr8cq"] Mar 20 15:27:49 crc kubenswrapper[4779]: I0320 15:27:49.449796 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 15:27:49 crc kubenswrapper[4779]: I0320 15:27:49.465267 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1490252c-df79-4687-a24f-9905c70efa9f-client-ca\") pod \"controller-manager-59f47cbcfc-wr8cq\" (UID: \"1490252c-df79-4687-a24f-9905c70efa9f\") " 
pod="openshift-controller-manager/controller-manager-59f47cbcfc-wr8cq" Mar 20 15:27:49 crc kubenswrapper[4779]: I0320 15:27:49.465548 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1490252c-df79-4687-a24f-9905c70efa9f-serving-cert\") pod \"controller-manager-59f47cbcfc-wr8cq\" (UID: \"1490252c-df79-4687-a24f-9905c70efa9f\") " pod="openshift-controller-manager/controller-manager-59f47cbcfc-wr8cq" Mar 20 15:27:49 crc kubenswrapper[4779]: I0320 15:27:49.465678 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1490252c-df79-4687-a24f-9905c70efa9f-proxy-ca-bundles\") pod \"controller-manager-59f47cbcfc-wr8cq\" (UID: \"1490252c-df79-4687-a24f-9905c70efa9f\") " pod="openshift-controller-manager/controller-manager-59f47cbcfc-wr8cq" Mar 20 15:27:49 crc kubenswrapper[4779]: I0320 15:27:49.465793 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1490252c-df79-4687-a24f-9905c70efa9f-config\") pod \"controller-manager-59f47cbcfc-wr8cq\" (UID: \"1490252c-df79-4687-a24f-9905c70efa9f\") " pod="openshift-controller-manager/controller-manager-59f47cbcfc-wr8cq" Mar 20 15:27:49 crc kubenswrapper[4779]: I0320 15:27:49.465924 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhrnl\" (UniqueName: \"kubernetes.io/projected/1490252c-df79-4687-a24f-9905c70efa9f-kube-api-access-mhrnl\") pod \"controller-manager-59f47cbcfc-wr8cq\" (UID: \"1490252c-df79-4687-a24f-9905c70efa9f\") " pod="openshift-controller-manager/controller-manager-59f47cbcfc-wr8cq" Mar 20 15:27:49 crc kubenswrapper[4779]: I0320 15:27:49.566859 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/1490252c-df79-4687-a24f-9905c70efa9f-proxy-ca-bundles\") pod \"controller-manager-59f47cbcfc-wr8cq\" (UID: \"1490252c-df79-4687-a24f-9905c70efa9f\") " pod="openshift-controller-manager/controller-manager-59f47cbcfc-wr8cq" Mar 20 15:27:49 crc kubenswrapper[4779]: I0320 15:27:49.566953 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1490252c-df79-4687-a24f-9905c70efa9f-config\") pod \"controller-manager-59f47cbcfc-wr8cq\" (UID: \"1490252c-df79-4687-a24f-9905c70efa9f\") " pod="openshift-controller-manager/controller-manager-59f47cbcfc-wr8cq" Mar 20 15:27:49 crc kubenswrapper[4779]: I0320 15:27:49.567062 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhrnl\" (UniqueName: \"kubernetes.io/projected/1490252c-df79-4687-a24f-9905c70efa9f-kube-api-access-mhrnl\") pod \"controller-manager-59f47cbcfc-wr8cq\" (UID: \"1490252c-df79-4687-a24f-9905c70efa9f\") " pod="openshift-controller-manager/controller-manager-59f47cbcfc-wr8cq" Mar 20 15:27:49 crc kubenswrapper[4779]: I0320 15:27:49.567128 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1490252c-df79-4687-a24f-9905c70efa9f-client-ca\") pod \"controller-manager-59f47cbcfc-wr8cq\" (UID: \"1490252c-df79-4687-a24f-9905c70efa9f\") " pod="openshift-controller-manager/controller-manager-59f47cbcfc-wr8cq" Mar 20 15:27:49 crc kubenswrapper[4779]: I0320 15:27:49.567152 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1490252c-df79-4687-a24f-9905c70efa9f-serving-cert\") pod \"controller-manager-59f47cbcfc-wr8cq\" (UID: \"1490252c-df79-4687-a24f-9905c70efa9f\") " pod="openshift-controller-manager/controller-manager-59f47cbcfc-wr8cq" Mar 20 15:27:49 crc kubenswrapper[4779]: I0320 15:27:49.568069 4779 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1490252c-df79-4687-a24f-9905c70efa9f-client-ca\") pod \"controller-manager-59f47cbcfc-wr8cq\" (UID: \"1490252c-df79-4687-a24f-9905c70efa9f\") " pod="openshift-controller-manager/controller-manager-59f47cbcfc-wr8cq" Mar 20 15:27:49 crc kubenswrapper[4779]: I0320 15:27:49.568404 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1490252c-df79-4687-a24f-9905c70efa9f-config\") pod \"controller-manager-59f47cbcfc-wr8cq\" (UID: \"1490252c-df79-4687-a24f-9905c70efa9f\") " pod="openshift-controller-manager/controller-manager-59f47cbcfc-wr8cq" Mar 20 15:27:49 crc kubenswrapper[4779]: I0320 15:27:49.569065 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1490252c-df79-4687-a24f-9905c70efa9f-proxy-ca-bundles\") pod \"controller-manager-59f47cbcfc-wr8cq\" (UID: \"1490252c-df79-4687-a24f-9905c70efa9f\") " pod="openshift-controller-manager/controller-manager-59f47cbcfc-wr8cq" Mar 20 15:27:49 crc kubenswrapper[4779]: I0320 15:27:49.579182 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1490252c-df79-4687-a24f-9905c70efa9f-serving-cert\") pod \"controller-manager-59f47cbcfc-wr8cq\" (UID: \"1490252c-df79-4687-a24f-9905c70efa9f\") " pod="openshift-controller-manager/controller-manager-59f47cbcfc-wr8cq" Mar 20 15:27:49 crc kubenswrapper[4779]: I0320 15:27:49.584655 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhrnl\" (UniqueName: \"kubernetes.io/projected/1490252c-df79-4687-a24f-9905c70efa9f-kube-api-access-mhrnl\") pod \"controller-manager-59f47cbcfc-wr8cq\" (UID: \"1490252c-df79-4687-a24f-9905c70efa9f\") " pod="openshift-controller-manager/controller-manager-59f47cbcfc-wr8cq" Mar 20 
15:27:49 crc kubenswrapper[4779]: I0320 15:27:49.756853 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-59f47cbcfc-wr8cq" Mar 20 15:27:49 crc kubenswrapper[4779]: I0320 15:27:49.814865 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="981a3e67-301c-4415-ac4d-195543bf3d16" path="/var/lib/kubelet/pods/981a3e67-301c-4415-ac4d-195543bf3d16/volumes" Mar 20 15:27:49 crc kubenswrapper[4779]: I0320 15:27:49.896513 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-77cff4bd67-8jcsj" Mar 20 15:27:49 crc kubenswrapper[4779]: I0320 15:27:49.900880 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-77cff4bd67-8jcsj" Mar 20 15:27:50 crc kubenswrapper[4779]: I0320 15:27:50.123476 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-khpjc"] Mar 20 15:27:50 crc kubenswrapper[4779]: I0320 15:27:50.124854 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-khpjc" podUID="6039db56-150f-48d0-84a3-511c6a67daad" containerName="registry-server" containerID="cri-o://4f992b1a34798d656a474db83cc8db7a9a8f47941390345ffd266d6c6483827d" gracePeriod=2 Mar 20 15:27:50 crc kubenswrapper[4779]: I0320 15:27:50.149848 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-59f47cbcfc-wr8cq"] Mar 20 15:27:50 crc kubenswrapper[4779]: I0320 15:27:50.189771 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vmgpk" Mar 20 15:27:50 crc kubenswrapper[4779]: I0320 15:27:50.229339 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vmgpk" 
Mar 20 15:27:50 crc kubenswrapper[4779]: I0320 15:27:50.439716 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-khpjc" Mar 20 15:27:50 crc kubenswrapper[4779]: I0320 15:27:50.483757 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6039db56-150f-48d0-84a3-511c6a67daad-utilities\") pod \"6039db56-150f-48d0-84a3-511c6a67daad\" (UID: \"6039db56-150f-48d0-84a3-511c6a67daad\") " Mar 20 15:27:50 crc kubenswrapper[4779]: I0320 15:27:50.483823 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l28wf\" (UniqueName: \"kubernetes.io/projected/6039db56-150f-48d0-84a3-511c6a67daad-kube-api-access-l28wf\") pod \"6039db56-150f-48d0-84a3-511c6a67daad\" (UID: \"6039db56-150f-48d0-84a3-511c6a67daad\") " Mar 20 15:27:50 crc kubenswrapper[4779]: I0320 15:27:50.483881 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6039db56-150f-48d0-84a3-511c6a67daad-catalog-content\") pod \"6039db56-150f-48d0-84a3-511c6a67daad\" (UID: \"6039db56-150f-48d0-84a3-511c6a67daad\") " Mar 20 15:27:50 crc kubenswrapper[4779]: I0320 15:27:50.485426 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6039db56-150f-48d0-84a3-511c6a67daad-utilities" (OuterVolumeSpecName: "utilities") pod "6039db56-150f-48d0-84a3-511c6a67daad" (UID: "6039db56-150f-48d0-84a3-511c6a67daad"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:27:50 crc kubenswrapper[4779]: I0320 15:27:50.492362 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6039db56-150f-48d0-84a3-511c6a67daad-kube-api-access-l28wf" (OuterVolumeSpecName: "kube-api-access-l28wf") pod "6039db56-150f-48d0-84a3-511c6a67daad" (UID: "6039db56-150f-48d0-84a3-511c6a67daad"). InnerVolumeSpecName "kube-api-access-l28wf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:27:50 crc kubenswrapper[4779]: I0320 15:27:50.534001 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6039db56-150f-48d0-84a3-511c6a67daad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6039db56-150f-48d0-84a3-511c6a67daad" (UID: "6039db56-150f-48d0-84a3-511c6a67daad"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:27:50 crc kubenswrapper[4779]: I0320 15:27:50.586629 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l28wf\" (UniqueName: \"kubernetes.io/projected/6039db56-150f-48d0-84a3-511c6a67daad-kube-api-access-l28wf\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:50 crc kubenswrapper[4779]: I0320 15:27:50.586917 4779 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6039db56-150f-48d0-84a3-511c6a67daad-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:50 crc kubenswrapper[4779]: I0320 15:27:50.586974 4779 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6039db56-150f-48d0-84a3-511c6a67daad-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:50 crc kubenswrapper[4779]: I0320 15:27:50.903850 4779 generic.go:334] "Generic (PLEG): container finished" podID="6039db56-150f-48d0-84a3-511c6a67daad" 
containerID="4f992b1a34798d656a474db83cc8db7a9a8f47941390345ffd266d6c6483827d" exitCode=0 Mar 20 15:27:50 crc kubenswrapper[4779]: I0320 15:27:50.903903 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-khpjc" Mar 20 15:27:50 crc kubenswrapper[4779]: I0320 15:27:50.903923 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khpjc" event={"ID":"6039db56-150f-48d0-84a3-511c6a67daad","Type":"ContainerDied","Data":"4f992b1a34798d656a474db83cc8db7a9a8f47941390345ffd266d6c6483827d"} Mar 20 15:27:50 crc kubenswrapper[4779]: I0320 15:27:50.903956 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khpjc" event={"ID":"6039db56-150f-48d0-84a3-511c6a67daad","Type":"ContainerDied","Data":"e4ffdf29691847c9a1697ed0a2920818c09d6ff0c415cdd2b4ee07c15ae21ee2"} Mar 20 15:27:50 crc kubenswrapper[4779]: I0320 15:27:50.903975 4779 scope.go:117] "RemoveContainer" containerID="4f992b1a34798d656a474db83cc8db7a9a8f47941390345ffd266d6c6483827d" Mar 20 15:27:50 crc kubenswrapper[4779]: I0320 15:27:50.909814 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59f47cbcfc-wr8cq" event={"ID":"1490252c-df79-4687-a24f-9905c70efa9f","Type":"ContainerStarted","Data":"b0b0e8010f6fe3617196dc034766693ef41ab57da0e4e702b8cbde157ad7ce47"} Mar 20 15:27:50 crc kubenswrapper[4779]: I0320 15:27:50.909850 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59f47cbcfc-wr8cq" event={"ID":"1490252c-df79-4687-a24f-9905c70efa9f","Type":"ContainerStarted","Data":"acdb199d8c6d32d2fe07762255220815695e762e448d962cbfbdae39f1f88396"} Mar 20 15:27:50 crc kubenswrapper[4779]: I0320 15:27:50.909865 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-59f47cbcfc-wr8cq" Mar 
20 15:27:50 crc kubenswrapper[4779]: I0320 15:27:50.916470 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-59f47cbcfc-wr8cq" Mar 20 15:27:50 crc kubenswrapper[4779]: I0320 15:27:50.924304 4779 scope.go:117] "RemoveContainer" containerID="d8bab074df61e899a1d8e57c27121db87db8e2c56a0c26f1118fc6fee4825f2e" Mar 20 15:27:50 crc kubenswrapper[4779]: I0320 15:27:50.940960 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-59f47cbcfc-wr8cq" podStartSLOduration=6.940942145 podStartE2EDuration="6.940942145s" podCreationTimestamp="2026-03-20 15:27:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:27:50.938172495 +0000 UTC m=+287.900688295" watchObservedRunningTime="2026-03-20 15:27:50.940942145 +0000 UTC m=+287.903457945" Mar 20 15:27:50 crc kubenswrapper[4779]: I0320 15:27:50.972780 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-khpjc"] Mar 20 15:27:50 crc kubenswrapper[4779]: I0320 15:27:50.974536 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-khpjc"] Mar 20 15:27:50 crc kubenswrapper[4779]: I0320 15:27:50.977871 4779 scope.go:117] "RemoveContainer" containerID="2b5678ba31bc5c2bd89864ab1552b5413952a09c65a779efc6b798524c6fefce" Mar 20 15:27:51 crc kubenswrapper[4779]: I0320 15:27:51.003144 4779 scope.go:117] "RemoveContainer" containerID="4f992b1a34798d656a474db83cc8db7a9a8f47941390345ffd266d6c6483827d" Mar 20 15:27:51 crc kubenswrapper[4779]: E0320 15:27:51.003608 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f992b1a34798d656a474db83cc8db7a9a8f47941390345ffd266d6c6483827d\": container with ID starting with 
4f992b1a34798d656a474db83cc8db7a9a8f47941390345ffd266d6c6483827d not found: ID does not exist" containerID="4f992b1a34798d656a474db83cc8db7a9a8f47941390345ffd266d6c6483827d" Mar 20 15:27:51 crc kubenswrapper[4779]: I0320 15:27:51.003655 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f992b1a34798d656a474db83cc8db7a9a8f47941390345ffd266d6c6483827d"} err="failed to get container status \"4f992b1a34798d656a474db83cc8db7a9a8f47941390345ffd266d6c6483827d\": rpc error: code = NotFound desc = could not find container \"4f992b1a34798d656a474db83cc8db7a9a8f47941390345ffd266d6c6483827d\": container with ID starting with 4f992b1a34798d656a474db83cc8db7a9a8f47941390345ffd266d6c6483827d not found: ID does not exist" Mar 20 15:27:51 crc kubenswrapper[4779]: I0320 15:27:51.003684 4779 scope.go:117] "RemoveContainer" containerID="d8bab074df61e899a1d8e57c27121db87db8e2c56a0c26f1118fc6fee4825f2e" Mar 20 15:27:51 crc kubenswrapper[4779]: E0320 15:27:51.003982 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8bab074df61e899a1d8e57c27121db87db8e2c56a0c26f1118fc6fee4825f2e\": container with ID starting with d8bab074df61e899a1d8e57c27121db87db8e2c56a0c26f1118fc6fee4825f2e not found: ID does not exist" containerID="d8bab074df61e899a1d8e57c27121db87db8e2c56a0c26f1118fc6fee4825f2e" Mar 20 15:27:51 crc kubenswrapper[4779]: I0320 15:27:51.004013 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8bab074df61e899a1d8e57c27121db87db8e2c56a0c26f1118fc6fee4825f2e"} err="failed to get container status \"d8bab074df61e899a1d8e57c27121db87db8e2c56a0c26f1118fc6fee4825f2e\": rpc error: code = NotFound desc = could not find container \"d8bab074df61e899a1d8e57c27121db87db8e2c56a0c26f1118fc6fee4825f2e\": container with ID starting with d8bab074df61e899a1d8e57c27121db87db8e2c56a0c26f1118fc6fee4825f2e not found: ID does not 
exist" Mar 20 15:27:51 crc kubenswrapper[4779]: I0320 15:27:51.004032 4779 scope.go:117] "RemoveContainer" containerID="2b5678ba31bc5c2bd89864ab1552b5413952a09c65a779efc6b798524c6fefce" Mar 20 15:27:51 crc kubenswrapper[4779]: E0320 15:27:51.004450 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b5678ba31bc5c2bd89864ab1552b5413952a09c65a779efc6b798524c6fefce\": container with ID starting with 2b5678ba31bc5c2bd89864ab1552b5413952a09c65a779efc6b798524c6fefce not found: ID does not exist" containerID="2b5678ba31bc5c2bd89864ab1552b5413952a09c65a779efc6b798524c6fefce" Mar 20 15:27:51 crc kubenswrapper[4779]: I0320 15:27:51.004475 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b5678ba31bc5c2bd89864ab1552b5413952a09c65a779efc6b798524c6fefce"} err="failed to get container status \"2b5678ba31bc5c2bd89864ab1552b5413952a09c65a779efc6b798524c6fefce\": rpc error: code = NotFound desc = could not find container \"2b5678ba31bc5c2bd89864ab1552b5413952a09c65a779efc6b798524c6fefce\": container with ID starting with 2b5678ba31bc5c2bd89864ab1552b5413952a09c65a779efc6b798524c6fefce not found: ID does not exist" Mar 20 15:27:51 crc kubenswrapper[4779]: I0320 15:27:51.119954 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h9gg9"] Mar 20 15:27:51 crc kubenswrapper[4779]: I0320 15:27:51.120202 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-h9gg9" podUID="626d4bed-62b8-47e0-9599-1c6f949a3d22" containerName="registry-server" containerID="cri-o://bfeef86b49d95ca40acf81349931229a1b6e85ff8033c825370f01d408d47bad" gracePeriod=2 Mar 20 15:27:51 crc kubenswrapper[4779]: I0320 15:27:51.493410 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h9gg9" Mar 20 15:27:51 crc kubenswrapper[4779]: I0320 15:27:51.600924 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jldv4\" (UniqueName: \"kubernetes.io/projected/626d4bed-62b8-47e0-9599-1c6f949a3d22-kube-api-access-jldv4\") pod \"626d4bed-62b8-47e0-9599-1c6f949a3d22\" (UID: \"626d4bed-62b8-47e0-9599-1c6f949a3d22\") " Mar 20 15:27:51 crc kubenswrapper[4779]: I0320 15:27:51.601098 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/626d4bed-62b8-47e0-9599-1c6f949a3d22-catalog-content\") pod \"626d4bed-62b8-47e0-9599-1c6f949a3d22\" (UID: \"626d4bed-62b8-47e0-9599-1c6f949a3d22\") " Mar 20 15:27:51 crc kubenswrapper[4779]: I0320 15:27:51.601160 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/626d4bed-62b8-47e0-9599-1c6f949a3d22-utilities\") pod \"626d4bed-62b8-47e0-9599-1c6f949a3d22\" (UID: \"626d4bed-62b8-47e0-9599-1c6f949a3d22\") " Mar 20 15:27:51 crc kubenswrapper[4779]: I0320 15:27:51.602223 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/626d4bed-62b8-47e0-9599-1c6f949a3d22-utilities" (OuterVolumeSpecName: "utilities") pod "626d4bed-62b8-47e0-9599-1c6f949a3d22" (UID: "626d4bed-62b8-47e0-9599-1c6f949a3d22"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:27:51 crc kubenswrapper[4779]: I0320 15:27:51.604882 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/626d4bed-62b8-47e0-9599-1c6f949a3d22-kube-api-access-jldv4" (OuterVolumeSpecName: "kube-api-access-jldv4") pod "626d4bed-62b8-47e0-9599-1c6f949a3d22" (UID: "626d4bed-62b8-47e0-9599-1c6f949a3d22"). InnerVolumeSpecName "kube-api-access-jldv4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:27:51 crc kubenswrapper[4779]: I0320 15:27:51.658760 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/626d4bed-62b8-47e0-9599-1c6f949a3d22-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "626d4bed-62b8-47e0-9599-1c6f949a3d22" (UID: "626d4bed-62b8-47e0-9599-1c6f949a3d22"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:27:51 crc kubenswrapper[4779]: I0320 15:27:51.702588 4779 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/626d4bed-62b8-47e0-9599-1c6f949a3d22-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:51 crc kubenswrapper[4779]: I0320 15:27:51.702629 4779 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/626d4bed-62b8-47e0-9599-1c6f949a3d22-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:51 crc kubenswrapper[4779]: I0320 15:27:51.702641 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jldv4\" (UniqueName: \"kubernetes.io/projected/626d4bed-62b8-47e0-9599-1c6f949a3d22-kube-api-access-jldv4\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:51 crc kubenswrapper[4779]: I0320 15:27:51.815605 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6039db56-150f-48d0-84a3-511c6a67daad" path="/var/lib/kubelet/pods/6039db56-150f-48d0-84a3-511c6a67daad/volumes" Mar 20 15:27:51 crc kubenswrapper[4779]: I0320 15:27:51.920603 4779 generic.go:334] "Generic (PLEG): container finished" podID="626d4bed-62b8-47e0-9599-1c6f949a3d22" containerID="bfeef86b49d95ca40acf81349931229a1b6e85ff8033c825370f01d408d47bad" exitCode=0 Mar 20 15:27:51 crc kubenswrapper[4779]: I0320 15:27:51.920686 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h9gg9" Mar 20 15:27:51 crc kubenswrapper[4779]: I0320 15:27:51.920734 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9gg9" event={"ID":"626d4bed-62b8-47e0-9599-1c6f949a3d22","Type":"ContainerDied","Data":"bfeef86b49d95ca40acf81349931229a1b6e85ff8033c825370f01d408d47bad"} Mar 20 15:27:51 crc kubenswrapper[4779]: I0320 15:27:51.920919 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9gg9" event={"ID":"626d4bed-62b8-47e0-9599-1c6f949a3d22","Type":"ContainerDied","Data":"fd27923b763297b68441ad63f2aff2c9d7d858b01308f390df575ab51ac585c5"} Mar 20 15:27:51 crc kubenswrapper[4779]: I0320 15:27:51.920949 4779 scope.go:117] "RemoveContainer" containerID="bfeef86b49d95ca40acf81349931229a1b6e85ff8033c825370f01d408d47bad" Mar 20 15:27:51 crc kubenswrapper[4779]: I0320 15:27:51.938092 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h9gg9"] Mar 20 15:27:51 crc kubenswrapper[4779]: I0320 15:27:51.940734 4779 scope.go:117] "RemoveContainer" containerID="16efb269896cc764d66818140019bafa54f6d683bb06c70d66dfadb752acc996" Mar 20 15:27:51 crc kubenswrapper[4779]: I0320 15:27:51.941520 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-h9gg9"] Mar 20 15:27:51 crc kubenswrapper[4779]: I0320 15:27:51.956489 4779 scope.go:117] "RemoveContainer" containerID="91b751ebdcb7f17806491b8f4cbae5674029a2a63baacac718f8e9e9cfcbf396" Mar 20 15:27:51 crc kubenswrapper[4779]: I0320 15:27:51.969511 4779 scope.go:117] "RemoveContainer" containerID="bfeef86b49d95ca40acf81349931229a1b6e85ff8033c825370f01d408d47bad" Mar 20 15:27:51 crc kubenswrapper[4779]: E0320 15:27:51.970032 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"bfeef86b49d95ca40acf81349931229a1b6e85ff8033c825370f01d408d47bad\": container with ID starting with bfeef86b49d95ca40acf81349931229a1b6e85ff8033c825370f01d408d47bad not found: ID does not exist" containerID="bfeef86b49d95ca40acf81349931229a1b6e85ff8033c825370f01d408d47bad" Mar 20 15:27:51 crc kubenswrapper[4779]: I0320 15:27:51.970076 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfeef86b49d95ca40acf81349931229a1b6e85ff8033c825370f01d408d47bad"} err="failed to get container status \"bfeef86b49d95ca40acf81349931229a1b6e85ff8033c825370f01d408d47bad\": rpc error: code = NotFound desc = could not find container \"bfeef86b49d95ca40acf81349931229a1b6e85ff8033c825370f01d408d47bad\": container with ID starting with bfeef86b49d95ca40acf81349931229a1b6e85ff8033c825370f01d408d47bad not found: ID does not exist" Mar 20 15:27:51 crc kubenswrapper[4779]: I0320 15:27:51.970103 4779 scope.go:117] "RemoveContainer" containerID="16efb269896cc764d66818140019bafa54f6d683bb06c70d66dfadb752acc996" Mar 20 15:27:51 crc kubenswrapper[4779]: E0320 15:27:51.970957 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16efb269896cc764d66818140019bafa54f6d683bb06c70d66dfadb752acc996\": container with ID starting with 16efb269896cc764d66818140019bafa54f6d683bb06c70d66dfadb752acc996 not found: ID does not exist" containerID="16efb269896cc764d66818140019bafa54f6d683bb06c70d66dfadb752acc996" Mar 20 15:27:51 crc kubenswrapper[4779]: I0320 15:27:51.971003 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16efb269896cc764d66818140019bafa54f6d683bb06c70d66dfadb752acc996"} err="failed to get container status \"16efb269896cc764d66818140019bafa54f6d683bb06c70d66dfadb752acc996\": rpc error: code = NotFound desc = could not find container \"16efb269896cc764d66818140019bafa54f6d683bb06c70d66dfadb752acc996\": container with ID 
starting with 16efb269896cc764d66818140019bafa54f6d683bb06c70d66dfadb752acc996 not found: ID does not exist" Mar 20 15:27:51 crc kubenswrapper[4779]: I0320 15:27:51.971030 4779 scope.go:117] "RemoveContainer" containerID="91b751ebdcb7f17806491b8f4cbae5674029a2a63baacac718f8e9e9cfcbf396" Mar 20 15:27:51 crc kubenswrapper[4779]: E0320 15:27:51.971446 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91b751ebdcb7f17806491b8f4cbae5674029a2a63baacac718f8e9e9cfcbf396\": container with ID starting with 91b751ebdcb7f17806491b8f4cbae5674029a2a63baacac718f8e9e9cfcbf396 not found: ID does not exist" containerID="91b751ebdcb7f17806491b8f4cbae5674029a2a63baacac718f8e9e9cfcbf396" Mar 20 15:27:51 crc kubenswrapper[4779]: I0320 15:27:51.971468 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91b751ebdcb7f17806491b8f4cbae5674029a2a63baacac718f8e9e9cfcbf396"} err="failed to get container status \"91b751ebdcb7f17806491b8f4cbae5674029a2a63baacac718f8e9e9cfcbf396\": rpc error: code = NotFound desc = could not find container \"91b751ebdcb7f17806491b8f4cbae5674029a2a63baacac718f8e9e9cfcbf396\": container with ID starting with 91b751ebdcb7f17806491b8f4cbae5674029a2a63baacac718f8e9e9cfcbf396 not found: ID does not exist" Mar 20 15:27:52 crc kubenswrapper[4779]: I0320 15:27:52.807571 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" podUID="723d98da-617e-496c-94fe-5b49f9e8ac13" containerName="oauth-openshift" containerID="cri-o://3dfaa164a290c7848c2d94a71ab71b9315d019c44d58949c18d984bb18c687f4" gracePeriod=15 Mar 20 15:27:52 crc kubenswrapper[4779]: I0320 15:27:52.930731 4779 generic.go:334] "Generic (PLEG): container finished" podID="723d98da-617e-496c-94fe-5b49f9e8ac13" containerID="3dfaa164a290c7848c2d94a71ab71b9315d019c44d58949c18d984bb18c687f4" exitCode=0 Mar 20 15:27:52 
crc kubenswrapper[4779]: I0320 15:27:52.931215 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" event={"ID":"723d98da-617e-496c-94fe-5b49f9e8ac13","Type":"ContainerDied","Data":"3dfaa164a290c7848c2d94a71ab71b9315d019c44d58949c18d984bb18c687f4"} Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.189277 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.330394 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-service-ca\") pod \"723d98da-617e-496c-94fe-5b49f9e8ac13\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.330453 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-user-template-provider-selection\") pod \"723d98da-617e-496c-94fe-5b49f9e8ac13\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.330483 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-trusted-ca-bundle\") pod \"723d98da-617e-496c-94fe-5b49f9e8ac13\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.330532 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-cliconfig\") pod \"723d98da-617e-496c-94fe-5b49f9e8ac13\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.330573 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-user-idp-0-file-data\") pod \"723d98da-617e-496c-94fe-5b49f9e8ac13\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.330598 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/723d98da-617e-496c-94fe-5b49f9e8ac13-audit-dir\") pod \"723d98da-617e-496c-94fe-5b49f9e8ac13\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.331259 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-user-template-login\") pod \"723d98da-617e-496c-94fe-5b49f9e8ac13\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.331314 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-serving-cert\") pod \"723d98da-617e-496c-94fe-5b49f9e8ac13\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.331352 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-user-template-error\") pod 
\"723d98da-617e-496c-94fe-5b49f9e8ac13\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.331392 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-router-certs\") pod \"723d98da-617e-496c-94fe-5b49f9e8ac13\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.331465 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-ocp-branding-template\") pod \"723d98da-617e-496c-94fe-5b49f9e8ac13\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.331500 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/723d98da-617e-496c-94fe-5b49f9e8ac13-audit-policies\") pod \"723d98da-617e-496c-94fe-5b49f9e8ac13\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.331525 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-session\") pod \"723d98da-617e-496c-94fe-5b49f9e8ac13\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.331556 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn8gd\" (UniqueName: \"kubernetes.io/projected/723d98da-617e-496c-94fe-5b49f9e8ac13-kube-api-access-bn8gd\") pod \"723d98da-617e-496c-94fe-5b49f9e8ac13\" (UID: \"723d98da-617e-496c-94fe-5b49f9e8ac13\") " Mar 20 15:27:53 crc 
kubenswrapper[4779]: I0320 15:27:53.331604 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "723d98da-617e-496c-94fe-5b49f9e8ac13" (UID: "723d98da-617e-496c-94fe-5b49f9e8ac13"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.331676 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/723d98da-617e-496c-94fe-5b49f9e8ac13-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "723d98da-617e-496c-94fe-5b49f9e8ac13" (UID: "723d98da-617e-496c-94fe-5b49f9e8ac13"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.331688 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "723d98da-617e-496c-94fe-5b49f9e8ac13" (UID: "723d98da-617e-496c-94fe-5b49f9e8ac13"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.331865 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "723d98da-617e-496c-94fe-5b49f9e8ac13" (UID: "723d98da-617e-496c-94fe-5b49f9e8ac13"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.332366 4779 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/723d98da-617e-496c-94fe-5b49f9e8ac13-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.332389 4779 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.332400 4779 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.332410 4779 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.332864 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/723d98da-617e-496c-94fe-5b49f9e8ac13-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "723d98da-617e-496c-94fe-5b49f9e8ac13" (UID: "723d98da-617e-496c-94fe-5b49f9e8ac13"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.335241 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/723d98da-617e-496c-94fe-5b49f9e8ac13-kube-api-access-bn8gd" (OuterVolumeSpecName: "kube-api-access-bn8gd") pod "723d98da-617e-496c-94fe-5b49f9e8ac13" (UID: "723d98da-617e-496c-94fe-5b49f9e8ac13"). InnerVolumeSpecName "kube-api-access-bn8gd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.335413 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "723d98da-617e-496c-94fe-5b49f9e8ac13" (UID: "723d98da-617e-496c-94fe-5b49f9e8ac13"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.335844 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "723d98da-617e-496c-94fe-5b49f9e8ac13" (UID: "723d98da-617e-496c-94fe-5b49f9e8ac13"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.335876 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "723d98da-617e-496c-94fe-5b49f9e8ac13" (UID: "723d98da-617e-496c-94fe-5b49f9e8ac13"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.336304 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "723d98da-617e-496c-94fe-5b49f9e8ac13" (UID: "723d98da-617e-496c-94fe-5b49f9e8ac13"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.336392 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "723d98da-617e-496c-94fe-5b49f9e8ac13" (UID: "723d98da-617e-496c-94fe-5b49f9e8ac13"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.336615 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "723d98da-617e-496c-94fe-5b49f9e8ac13" (UID: "723d98da-617e-496c-94fe-5b49f9e8ac13"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.351280 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "723d98da-617e-496c-94fe-5b49f9e8ac13" (UID: "723d98da-617e-496c-94fe-5b49f9e8ac13"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.351433 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "723d98da-617e-496c-94fe-5b49f9e8ac13" (UID: "723d98da-617e-496c-94fe-5b49f9e8ac13"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.434210 4779 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.434242 4779 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.434252 4779 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.434261 4779 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.434271 4779 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.434282 4779 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.434294 4779 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/723d98da-617e-496c-94fe-5b49f9e8ac13-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.434302 4779 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.434312 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bn8gd\" (UniqueName: \"kubernetes.io/projected/723d98da-617e-496c-94fe-5b49f9e8ac13-kube-api-access-bn8gd\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.434321 4779 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/723d98da-617e-496c-94fe-5b49f9e8ac13-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.520401 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vmgpk"] Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.521048 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vmgpk" 
podUID="cef14e53-c107-4bb0-85bc-40b8189fd346" containerName="registry-server" containerID="cri-o://aa857f70e1a12e37c3ca7e0b61f6c1caa689db6bc117550a8f447958039d6727" gracePeriod=2 Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.819318 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="626d4bed-62b8-47e0-9599-1c6f949a3d22" path="/var/lib/kubelet/pods/626d4bed-62b8-47e0-9599-1c6f949a3d22/volumes" Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.939712 4779 generic.go:334] "Generic (PLEG): container finished" podID="cef14e53-c107-4bb0-85bc-40b8189fd346" containerID="aa857f70e1a12e37c3ca7e0b61f6c1caa689db6bc117550a8f447958039d6727" exitCode=0 Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.939777 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vmgpk" event={"ID":"cef14e53-c107-4bb0-85bc-40b8189fd346","Type":"ContainerDied","Data":"aa857f70e1a12e37c3ca7e0b61f6c1caa689db6bc117550a8f447958039d6727"} Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.941264 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" event={"ID":"723d98da-617e-496c-94fe-5b49f9e8ac13","Type":"ContainerDied","Data":"2458700043c23792a2782debf64a8858cacebfd1b0961698aabc24797f9bbbc8"} Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.941289 4779 scope.go:117] "RemoveContainer" containerID="3dfaa164a290c7848c2d94a71ab71b9315d019c44d58949c18d984bb18c687f4" Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.941396 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-h48bp" Mar 20 15:27:53 crc kubenswrapper[4779]: I0320 15:27:53.986205 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vmgpk" Mar 20 15:27:54 crc kubenswrapper[4779]: I0320 15:27:53.999956 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-h48bp"] Mar 20 15:27:54 crc kubenswrapper[4779]: I0320 15:27:54.003991 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-h48bp"] Mar 20 15:27:54 crc kubenswrapper[4779]: I0320 15:27:54.046614 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vc7xm\" (UniqueName: \"kubernetes.io/projected/cef14e53-c107-4bb0-85bc-40b8189fd346-kube-api-access-vc7xm\") pod \"cef14e53-c107-4bb0-85bc-40b8189fd346\" (UID: \"cef14e53-c107-4bb0-85bc-40b8189fd346\") " Mar 20 15:27:54 crc kubenswrapper[4779]: I0320 15:27:54.046720 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cef14e53-c107-4bb0-85bc-40b8189fd346-catalog-content\") pod \"cef14e53-c107-4bb0-85bc-40b8189fd346\" (UID: \"cef14e53-c107-4bb0-85bc-40b8189fd346\") " Mar 20 15:27:54 crc kubenswrapper[4779]: I0320 15:27:54.046783 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cef14e53-c107-4bb0-85bc-40b8189fd346-utilities\") pod \"cef14e53-c107-4bb0-85bc-40b8189fd346\" (UID: \"cef14e53-c107-4bb0-85bc-40b8189fd346\") " Mar 20 15:27:54 crc kubenswrapper[4779]: I0320 15:27:54.047742 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cef14e53-c107-4bb0-85bc-40b8189fd346-utilities" (OuterVolumeSpecName: "utilities") pod "cef14e53-c107-4bb0-85bc-40b8189fd346" (UID: "cef14e53-c107-4bb0-85bc-40b8189fd346"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:27:54 crc kubenswrapper[4779]: I0320 15:27:54.060502 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cef14e53-c107-4bb0-85bc-40b8189fd346-kube-api-access-vc7xm" (OuterVolumeSpecName: "kube-api-access-vc7xm") pod "cef14e53-c107-4bb0-85bc-40b8189fd346" (UID: "cef14e53-c107-4bb0-85bc-40b8189fd346"). InnerVolumeSpecName "kube-api-access-vc7xm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:27:54 crc kubenswrapper[4779]: I0320 15:27:54.148321 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vc7xm\" (UniqueName: \"kubernetes.io/projected/cef14e53-c107-4bb0-85bc-40b8189fd346-kube-api-access-vc7xm\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:54 crc kubenswrapper[4779]: I0320 15:27:54.148355 4779 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cef14e53-c107-4bb0-85bc-40b8189fd346-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:54 crc kubenswrapper[4779]: I0320 15:27:54.179507 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cef14e53-c107-4bb0-85bc-40b8189fd346-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cef14e53-c107-4bb0-85bc-40b8189fd346" (UID: "cef14e53-c107-4bb0-85bc-40b8189fd346"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:27:54 crc kubenswrapper[4779]: I0320 15:27:54.250497 4779 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cef14e53-c107-4bb0-85bc-40b8189fd346-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:54 crc kubenswrapper[4779]: I0320 15:27:54.948051 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vmgpk" event={"ID":"cef14e53-c107-4bb0-85bc-40b8189fd346","Type":"ContainerDied","Data":"6d01c127b86ab0150c7d3704c209c5d86a1f19658ab2a86666da82a803d4e164"} Mar 20 15:27:54 crc kubenswrapper[4779]: I0320 15:27:54.948136 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vmgpk" Mar 20 15:27:54 crc kubenswrapper[4779]: I0320 15:27:54.948447 4779 scope.go:117] "RemoveContainer" containerID="aa857f70e1a12e37c3ca7e0b61f6c1caa689db6bc117550a8f447958039d6727" Mar 20 15:27:54 crc kubenswrapper[4779]: I0320 15:27:54.975310 4779 scope.go:117] "RemoveContainer" containerID="f8f46bb86240065bb1cb0d10530da1e591e113375bda52618382ceaa4b54eca2" Mar 20 15:27:54 crc kubenswrapper[4779]: I0320 15:27:54.980829 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vmgpk"] Mar 20 15:27:54 crc kubenswrapper[4779]: I0320 15:27:54.986143 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vmgpk"] Mar 20 15:27:54 crc kubenswrapper[4779]: I0320 15:27:54.997365 4779 scope.go:117] "RemoveContainer" containerID="f35cf6101b336d0e7cb5d44a7aa0ccfafe5100e7a2b403053e942a94ac990153" Mar 20 15:27:55 crc kubenswrapper[4779]: I0320 15:27:55.149931 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:27:55 crc kubenswrapper[4779]: I0320 15:27:55.149996 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:27:55 crc kubenswrapper[4779]: I0320 15:27:55.150038 4779 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" Mar 20 15:27:55 crc kubenswrapper[4779]: I0320 15:27:55.150684 4779 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"498dce321546a5d876986ec92f47d53655ef83f600c0498878bbe473c526f01f"} pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 15:27:55 crc kubenswrapper[4779]: I0320 15:27:55.150747 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" containerID="cri-o://498dce321546a5d876986ec92f47d53655ef83f600c0498878bbe473c526f01f" gracePeriod=600 Mar 20 15:27:55 crc kubenswrapper[4779]: E0320 15:27:55.575186 4779 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod451fc579_db57_4b36_a775_6d2986de3efc.slice/crio-conmon-498dce321546a5d876986ec92f47d53655ef83f600c0498878bbe473c526f01f.scope\": RecentStats: unable to find data in memory cache]" Mar 20 15:27:55 crc kubenswrapper[4779]: I0320 
15:27:55.815434 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="723d98da-617e-496c-94fe-5b49f9e8ac13" path="/var/lib/kubelet/pods/723d98da-617e-496c-94fe-5b49f9e8ac13/volumes" Mar 20 15:27:55 crc kubenswrapper[4779]: I0320 15:27:55.816315 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cef14e53-c107-4bb0-85bc-40b8189fd346" path="/var/lib/kubelet/pods/cef14e53-c107-4bb0-85bc-40b8189fd346/volumes" Mar 20 15:27:55 crc kubenswrapper[4779]: I0320 15:27:55.954801 4779 generic.go:334] "Generic (PLEG): container finished" podID="451fc579-db57-4b36-a775-6d2986de3efc" containerID="498dce321546a5d876986ec92f47d53655ef83f600c0498878bbe473c526f01f" exitCode=0 Mar 20 15:27:55 crc kubenswrapper[4779]: I0320 15:27:55.954857 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" event={"ID":"451fc579-db57-4b36-a775-6d2986de3efc","Type":"ContainerDied","Data":"498dce321546a5d876986ec92f47d53655ef83f600c0498878bbe473c526f01f"} Mar 20 15:27:56 crc kubenswrapper[4779]: I0320 15:27:56.962904 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" event={"ID":"451fc579-db57-4b36-a775-6d2986de3efc","Type":"ContainerStarted","Data":"36c8a2a60794cd5ada94d0dde0b6b6073b2ecf6d648323c7aec9dfc0a0b52837"} Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.439036 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6d584df96b-pw8fn"] Mar 20 15:27:59 crc kubenswrapper[4779]: E0320 15:27:59.439243 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="723d98da-617e-496c-94fe-5b49f9e8ac13" containerName="oauth-openshift" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.439254 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="723d98da-617e-496c-94fe-5b49f9e8ac13" containerName="oauth-openshift" Mar 20 15:27:59 crc kubenswrapper[4779]: 
E0320 15:27:59.439264 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6039db56-150f-48d0-84a3-511c6a67daad" containerName="registry-server" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.439269 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="6039db56-150f-48d0-84a3-511c6a67daad" containerName="registry-server" Mar 20 15:27:59 crc kubenswrapper[4779]: E0320 15:27:59.439278 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6039db56-150f-48d0-84a3-511c6a67daad" containerName="extract-content" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.439286 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="6039db56-150f-48d0-84a3-511c6a67daad" containerName="extract-content" Mar 20 15:27:59 crc kubenswrapper[4779]: E0320 15:27:59.439293 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="626d4bed-62b8-47e0-9599-1c6f949a3d22" containerName="extract-content" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.439299 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="626d4bed-62b8-47e0-9599-1c6f949a3d22" containerName="extract-content" Mar 20 15:27:59 crc kubenswrapper[4779]: E0320 15:27:59.439306 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cef14e53-c107-4bb0-85bc-40b8189fd346" containerName="extract-content" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.439311 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef14e53-c107-4bb0-85bc-40b8189fd346" containerName="extract-content" Mar 20 15:27:59 crc kubenswrapper[4779]: E0320 15:27:59.439325 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6039db56-150f-48d0-84a3-511c6a67daad" containerName="extract-utilities" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.439330 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="6039db56-150f-48d0-84a3-511c6a67daad" containerName="extract-utilities" Mar 20 15:27:59 crc kubenswrapper[4779]: E0320 
15:27:59.439338 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="626d4bed-62b8-47e0-9599-1c6f949a3d22" containerName="registry-server" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.439343 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="626d4bed-62b8-47e0-9599-1c6f949a3d22" containerName="registry-server" Mar 20 15:27:59 crc kubenswrapper[4779]: E0320 15:27:59.439352 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cef14e53-c107-4bb0-85bc-40b8189fd346" containerName="extract-utilities" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.439357 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef14e53-c107-4bb0-85bc-40b8189fd346" containerName="extract-utilities" Mar 20 15:27:59 crc kubenswrapper[4779]: E0320 15:27:59.439364 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="626d4bed-62b8-47e0-9599-1c6f949a3d22" containerName="extract-utilities" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.439369 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="626d4bed-62b8-47e0-9599-1c6f949a3d22" containerName="extract-utilities" Mar 20 15:27:59 crc kubenswrapper[4779]: E0320 15:27:59.439379 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cef14e53-c107-4bb0-85bc-40b8189fd346" containerName="registry-server" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.439384 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef14e53-c107-4bb0-85bc-40b8189fd346" containerName="registry-server" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.439462 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="626d4bed-62b8-47e0-9599-1c6f949a3d22" containerName="registry-server" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.439481 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="cef14e53-c107-4bb0-85bc-40b8189fd346" containerName="registry-server" Mar 20 15:27:59 crc kubenswrapper[4779]: 
I0320 15:27:59.439499 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="6039db56-150f-48d0-84a3-511c6a67daad" containerName="registry-server" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.439509 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="723d98da-617e-496c-94fe-5b49f9e8ac13" containerName="oauth-openshift" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.439950 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.446228 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.446717 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.446851 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.446965 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.447142 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.447245 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.447356 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.447475 4779 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.452892 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.452933 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.453291 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.453545 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.459004 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.464287 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6d584df96b-pw8fn"] Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.464620 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.479699 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.522719 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-v4-0-config-system-router-certs\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") 
" pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.522765 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-v4-0-config-user-template-error\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.522814 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-audit-dir\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.522835 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-v4-0-config-user-template-login\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.522855 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-v4-0-config-system-service-ca\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.522876 4779 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.522897 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.523000 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-audit-policies\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.523081 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.523134 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.523218 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.523247 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.523318 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-v4-0-config-system-session\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.523374 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx6gs\" (UniqueName: \"kubernetes.io/projected/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-kube-api-access-rx6gs\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: 
\"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.624735 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-v4-0-config-system-service-ca\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.624790 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.624819 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-audit-policies\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.624836 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.624862 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.624879 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.624912 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.624928 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.624948 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-v4-0-config-system-session\") pod \"oauth-openshift-6d584df96b-pw8fn\" 
(UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.624974 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx6gs\" (UniqueName: \"kubernetes.io/projected/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-kube-api-access-rx6gs\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.624994 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-v4-0-config-system-router-certs\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.625009 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-v4-0-config-user-template-error\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.625040 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-audit-dir\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.625060 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-v4-0-config-user-template-login\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.627175 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-audit-dir\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.627311 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-audit-policies\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.627961 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-v4-0-config-system-service-ca\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.628424 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc 
kubenswrapper[4779]: I0320 15:27:59.628650 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.630769 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-v4-0-config-system-router-certs\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.631042 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.631208 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.631343 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.631619 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-v4-0-config-system-session\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.631848 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.632138 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-v4-0-config-user-template-login\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.632475 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-v4-0-config-user-template-error\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" 
Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.641097 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx6gs\" (UniqueName: \"kubernetes.io/projected/cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6-kube-api-access-rx6gs\") pod \"oauth-openshift-6d584df96b-pw8fn\" (UID: \"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6\") " pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:27:59 crc kubenswrapper[4779]: I0320 15:27:59.754511 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:28:00 crc kubenswrapper[4779]: I0320 15:28:00.132873 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567008-wtptq"] Mar 20 15:28:00 crc kubenswrapper[4779]: I0320 15:28:00.133872 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567008-wtptq" Mar 20 15:28:00 crc kubenswrapper[4779]: I0320 15:28:00.135940 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:28:00 crc kubenswrapper[4779]: I0320 15:28:00.136159 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:28:00 crc kubenswrapper[4779]: I0320 15:28:00.136458 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-92t9d" Mar 20 15:28:00 crc kubenswrapper[4779]: I0320 15:28:00.145898 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567008-wtptq"] Mar 20 15:28:00 crc kubenswrapper[4779]: I0320 15:28:00.170740 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6d584df96b-pw8fn"] Mar 20 15:28:00 crc kubenswrapper[4779]: W0320 15:28:00.177069 4779 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc126fb0_9e2e_4ec0_b04a_3c8f9e464fd6.slice/crio-b816fcadd092f518a01ca8b492750d4d51a7c99d08e81c47e0cb738acbbb49a0 WatchSource:0}: Error finding container b816fcadd092f518a01ca8b492750d4d51a7c99d08e81c47e0cb738acbbb49a0: Status 404 returned error can't find the container with id b816fcadd092f518a01ca8b492750d4d51a7c99d08e81c47e0cb738acbbb49a0 Mar 20 15:28:00 crc kubenswrapper[4779]: I0320 15:28:00.233792 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbm49\" (UniqueName: \"kubernetes.io/projected/d55c02de-f3fa-44bf-94c7-8879acc79040-kube-api-access-bbm49\") pod \"auto-csr-approver-29567008-wtptq\" (UID: \"d55c02de-f3fa-44bf-94c7-8879acc79040\") " pod="openshift-infra/auto-csr-approver-29567008-wtptq" Mar 20 15:28:00 crc kubenswrapper[4779]: I0320 15:28:00.335172 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbm49\" (UniqueName: \"kubernetes.io/projected/d55c02de-f3fa-44bf-94c7-8879acc79040-kube-api-access-bbm49\") pod \"auto-csr-approver-29567008-wtptq\" (UID: \"d55c02de-f3fa-44bf-94c7-8879acc79040\") " pod="openshift-infra/auto-csr-approver-29567008-wtptq" Mar 20 15:28:00 crc kubenswrapper[4779]: I0320 15:28:00.354212 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbm49\" (UniqueName: \"kubernetes.io/projected/d55c02de-f3fa-44bf-94c7-8879acc79040-kube-api-access-bbm49\") pod \"auto-csr-approver-29567008-wtptq\" (UID: \"d55c02de-f3fa-44bf-94c7-8879acc79040\") " pod="openshift-infra/auto-csr-approver-29567008-wtptq" Mar 20 15:28:00 crc kubenswrapper[4779]: I0320 15:28:00.449337 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567008-wtptq" Mar 20 15:28:00 crc kubenswrapper[4779]: I0320 15:28:00.882458 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567008-wtptq"] Mar 20 15:28:00 crc kubenswrapper[4779]: I0320 15:28:00.983861 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567008-wtptq" event={"ID":"d55c02de-f3fa-44bf-94c7-8879acc79040","Type":"ContainerStarted","Data":"98c548f09eb7ede6db80178945bcefc3d01f4e6d801bb1e3fca6ec7528028a80"} Mar 20 15:28:00 crc kubenswrapper[4779]: I0320 15:28:00.985307 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" event={"ID":"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6","Type":"ContainerStarted","Data":"33c9076ec1b254987787aa69c878d8a1dc2832cce4c4b58e71b040ed76cc82f9"} Mar 20 15:28:00 crc kubenswrapper[4779]: I0320 15:28:00.985348 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" event={"ID":"cc126fb0-9e2e-4ec0-b04a-3c8f9e464fd6","Type":"ContainerStarted","Data":"b816fcadd092f518a01ca8b492750d4d51a7c99d08e81c47e0cb738acbbb49a0"} Mar 20 15:28:00 crc kubenswrapper[4779]: I0320 15:28:00.985555 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:28:00 crc kubenswrapper[4779]: I0320 15:28:00.991186 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" Mar 20 15:28:01 crc kubenswrapper[4779]: I0320 15:28:01.006849 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6d584df96b-pw8fn" podStartSLOduration=34.006828244 podStartE2EDuration="34.006828244s" podCreationTimestamp="2026-03-20 15:27:27 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:28:01.00428965 +0000 UTC m=+297.966805470" watchObservedRunningTime="2026-03-20 15:28:01.006828244 +0000 UTC m=+297.969344044" Mar 20 15:28:02 crc kubenswrapper[4779]: I0320 15:28:02.999128 4779 generic.go:334] "Generic (PLEG): container finished" podID="d55c02de-f3fa-44bf-94c7-8879acc79040" containerID="195e0938b182a51a2c488b5b59059335c49c7135aeef7732aa7b01c4742f7f5e" exitCode=0 Mar 20 15:28:03 crc kubenswrapper[4779]: I0320 15:28:02.999463 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567008-wtptq" event={"ID":"d55c02de-f3fa-44bf-94c7-8879acc79040","Type":"ContainerDied","Data":"195e0938b182a51a2c488b5b59059335c49c7135aeef7732aa7b01c4742f7f5e"} Mar 20 15:28:04 crc kubenswrapper[4779]: I0320 15:28:04.366763 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567008-wtptq" Mar 20 15:28:04 crc kubenswrapper[4779]: I0320 15:28:04.421500 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbm49\" (UniqueName: \"kubernetes.io/projected/d55c02de-f3fa-44bf-94c7-8879acc79040-kube-api-access-bbm49\") pod \"d55c02de-f3fa-44bf-94c7-8879acc79040\" (UID: \"d55c02de-f3fa-44bf-94c7-8879acc79040\") " Mar 20 15:28:04 crc kubenswrapper[4779]: I0320 15:28:04.429306 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d55c02de-f3fa-44bf-94c7-8879acc79040-kube-api-access-bbm49" (OuterVolumeSpecName: "kube-api-access-bbm49") pod "d55c02de-f3fa-44bf-94c7-8879acc79040" (UID: "d55c02de-f3fa-44bf-94c7-8879acc79040"). InnerVolumeSpecName "kube-api-access-bbm49". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:28:04 crc kubenswrapper[4779]: I0320 15:28:04.522795 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbm49\" (UniqueName: \"kubernetes.io/projected/d55c02de-f3fa-44bf-94c7-8879acc79040-kube-api-access-bbm49\") on node \"crc\" DevicePath \"\"" Mar 20 15:28:04 crc kubenswrapper[4779]: I0320 15:28:04.954730 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-59f47cbcfc-wr8cq"] Mar 20 15:28:04 crc kubenswrapper[4779]: I0320 15:28:04.954978 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-59f47cbcfc-wr8cq" podUID="1490252c-df79-4687-a24f-9905c70efa9f" containerName="controller-manager" containerID="cri-o://b0b0e8010f6fe3617196dc034766693ef41ab57da0e4e702b8cbde157ad7ce47" gracePeriod=30 Mar 20 15:28:05 crc kubenswrapper[4779]: I0320 15:28:05.010473 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567008-wtptq" event={"ID":"d55c02de-f3fa-44bf-94c7-8879acc79040","Type":"ContainerDied","Data":"98c548f09eb7ede6db80178945bcefc3d01f4e6d801bb1e3fca6ec7528028a80"} Mar 20 15:28:05 crc kubenswrapper[4779]: I0320 15:28:05.010514 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98c548f09eb7ede6db80178945bcefc3d01f4e6d801bb1e3fca6ec7528028a80" Mar 20 15:28:05 crc kubenswrapper[4779]: I0320 15:28:05.010556 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567008-wtptq" Mar 20 15:28:05 crc kubenswrapper[4779]: I0320 15:28:05.059788 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77cff4bd67-8jcsj"] Mar 20 15:28:05 crc kubenswrapper[4779]: I0320 15:28:05.059988 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-77cff4bd67-8jcsj" podUID="8e74ef8b-bcb6-4a39-a2b0-892343298556" containerName="route-controller-manager" containerID="cri-o://eea69623e2655271ba7a26e62ab6fb1925f4fdfccdacb662f3b806b07044c64f" gracePeriod=30 Mar 20 15:28:05 crc kubenswrapper[4779]: I0320 15:28:05.494279 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77cff4bd67-8jcsj" Mar 20 15:28:05 crc kubenswrapper[4779]: I0320 15:28:05.503266 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-59f47cbcfc-wr8cq" Mar 20 15:28:05 crc kubenswrapper[4779]: I0320 15:28:05.635658 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99xn6\" (UniqueName: \"kubernetes.io/projected/8e74ef8b-bcb6-4a39-a2b0-892343298556-kube-api-access-99xn6\") pod \"8e74ef8b-bcb6-4a39-a2b0-892343298556\" (UID: \"8e74ef8b-bcb6-4a39-a2b0-892343298556\") " Mar 20 15:28:05 crc kubenswrapper[4779]: I0320 15:28:05.635725 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e74ef8b-bcb6-4a39-a2b0-892343298556-serving-cert\") pod \"8e74ef8b-bcb6-4a39-a2b0-892343298556\" (UID: \"8e74ef8b-bcb6-4a39-a2b0-892343298556\") " Mar 20 15:28:05 crc kubenswrapper[4779]: I0320 15:28:05.635775 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e74ef8b-bcb6-4a39-a2b0-892343298556-client-ca\") pod \"8e74ef8b-bcb6-4a39-a2b0-892343298556\" (UID: \"8e74ef8b-bcb6-4a39-a2b0-892343298556\") " Mar 20 15:28:05 crc kubenswrapper[4779]: I0320 15:28:05.635796 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1490252c-df79-4687-a24f-9905c70efa9f-proxy-ca-bundles\") pod \"1490252c-df79-4687-a24f-9905c70efa9f\" (UID: \"1490252c-df79-4687-a24f-9905c70efa9f\") " Mar 20 15:28:05 crc kubenswrapper[4779]: I0320 15:28:05.635820 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1490252c-df79-4687-a24f-9905c70efa9f-config\") pod \"1490252c-df79-4687-a24f-9905c70efa9f\" (UID: \"1490252c-df79-4687-a24f-9905c70efa9f\") " Mar 20 15:28:05 crc kubenswrapper[4779]: I0320 15:28:05.635842 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/8e74ef8b-bcb6-4a39-a2b0-892343298556-config\") pod \"8e74ef8b-bcb6-4a39-a2b0-892343298556\" (UID: \"8e74ef8b-bcb6-4a39-a2b0-892343298556\") " Mar 20 15:28:05 crc kubenswrapper[4779]: I0320 15:28:05.635862 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1490252c-df79-4687-a24f-9905c70efa9f-serving-cert\") pod \"1490252c-df79-4687-a24f-9905c70efa9f\" (UID: \"1490252c-df79-4687-a24f-9905c70efa9f\") " Mar 20 15:28:05 crc kubenswrapper[4779]: I0320 15:28:05.635877 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1490252c-df79-4687-a24f-9905c70efa9f-client-ca\") pod \"1490252c-df79-4687-a24f-9905c70efa9f\" (UID: \"1490252c-df79-4687-a24f-9905c70efa9f\") " Mar 20 15:28:05 crc kubenswrapper[4779]: I0320 15:28:05.635893 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhrnl\" (UniqueName: \"kubernetes.io/projected/1490252c-df79-4687-a24f-9905c70efa9f-kube-api-access-mhrnl\") pod \"1490252c-df79-4687-a24f-9905c70efa9f\" (UID: \"1490252c-df79-4687-a24f-9905c70efa9f\") " Mar 20 15:28:05 crc kubenswrapper[4779]: I0320 15:28:05.636715 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e74ef8b-bcb6-4a39-a2b0-892343298556-client-ca" (OuterVolumeSpecName: "client-ca") pod "8e74ef8b-bcb6-4a39-a2b0-892343298556" (UID: "8e74ef8b-bcb6-4a39-a2b0-892343298556"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:28:05 crc kubenswrapper[4779]: I0320 15:28:05.636786 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e74ef8b-bcb6-4a39-a2b0-892343298556-config" (OuterVolumeSpecName: "config") pod "8e74ef8b-bcb6-4a39-a2b0-892343298556" (UID: "8e74ef8b-bcb6-4a39-a2b0-892343298556"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:28:05 crc kubenswrapper[4779]: I0320 15:28:05.637106 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1490252c-df79-4687-a24f-9905c70efa9f-client-ca" (OuterVolumeSpecName: "client-ca") pod "1490252c-df79-4687-a24f-9905c70efa9f" (UID: "1490252c-df79-4687-a24f-9905c70efa9f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:28:05 crc kubenswrapper[4779]: I0320 15:28:05.637142 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1490252c-df79-4687-a24f-9905c70efa9f-config" (OuterVolumeSpecName: "config") pod "1490252c-df79-4687-a24f-9905c70efa9f" (UID: "1490252c-df79-4687-a24f-9905c70efa9f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:28:05 crc kubenswrapper[4779]: I0320 15:28:05.637532 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1490252c-df79-4687-a24f-9905c70efa9f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1490252c-df79-4687-a24f-9905c70efa9f" (UID: "1490252c-df79-4687-a24f-9905c70efa9f"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:28:05 crc kubenswrapper[4779]: I0320 15:28:05.640095 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1490252c-df79-4687-a24f-9905c70efa9f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1490252c-df79-4687-a24f-9905c70efa9f" (UID: "1490252c-df79-4687-a24f-9905c70efa9f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:28:05 crc kubenswrapper[4779]: I0320 15:28:05.640236 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e74ef8b-bcb6-4a39-a2b0-892343298556-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8e74ef8b-bcb6-4a39-a2b0-892343298556" (UID: "8e74ef8b-bcb6-4a39-a2b0-892343298556"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:28:05 crc kubenswrapper[4779]: I0320 15:28:05.640491 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1490252c-df79-4687-a24f-9905c70efa9f-kube-api-access-mhrnl" (OuterVolumeSpecName: "kube-api-access-mhrnl") pod "1490252c-df79-4687-a24f-9905c70efa9f" (UID: "1490252c-df79-4687-a24f-9905c70efa9f"). InnerVolumeSpecName "kube-api-access-mhrnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:28:05 crc kubenswrapper[4779]: I0320 15:28:05.646242 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e74ef8b-bcb6-4a39-a2b0-892343298556-kube-api-access-99xn6" (OuterVolumeSpecName: "kube-api-access-99xn6") pod "8e74ef8b-bcb6-4a39-a2b0-892343298556" (UID: "8e74ef8b-bcb6-4a39-a2b0-892343298556"). InnerVolumeSpecName "kube-api-access-99xn6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:28:05 crc kubenswrapper[4779]: I0320 15:28:05.737436 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e74ef8b-bcb6-4a39-a2b0-892343298556-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:28:05 crc kubenswrapper[4779]: I0320 15:28:05.737474 4779 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1490252c-df79-4687-a24f-9905c70efa9f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 15:28:05 crc kubenswrapper[4779]: I0320 15:28:05.737483 4779 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1490252c-df79-4687-a24f-9905c70efa9f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 15:28:05 crc kubenswrapper[4779]: I0320 15:28:05.737491 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhrnl\" (UniqueName: \"kubernetes.io/projected/1490252c-df79-4687-a24f-9905c70efa9f-kube-api-access-mhrnl\") on node \"crc\" DevicePath \"\"" Mar 20 15:28:05 crc kubenswrapper[4779]: I0320 15:28:05.737501 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99xn6\" (UniqueName: \"kubernetes.io/projected/8e74ef8b-bcb6-4a39-a2b0-892343298556-kube-api-access-99xn6\") on node \"crc\" DevicePath \"\"" Mar 20 15:28:05 crc kubenswrapper[4779]: I0320 15:28:05.737511 4779 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e74ef8b-bcb6-4a39-a2b0-892343298556-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 15:28:05 crc kubenswrapper[4779]: I0320 15:28:05.737518 4779 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e74ef8b-bcb6-4a39-a2b0-892343298556-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 15:28:05 crc kubenswrapper[4779]: I0320 15:28:05.737526 4779 
reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1490252c-df79-4687-a24f-9905c70efa9f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 15:28:05 crc kubenswrapper[4779]: I0320 15:28:05.737534 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1490252c-df79-4687-a24f-9905c70efa9f-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.016741 4779 generic.go:334] "Generic (PLEG): container finished" podID="8e74ef8b-bcb6-4a39-a2b0-892343298556" containerID="eea69623e2655271ba7a26e62ab6fb1925f4fdfccdacb662f3b806b07044c64f" exitCode=0 Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.016814 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77cff4bd67-8jcsj" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.016847 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77cff4bd67-8jcsj" event={"ID":"8e74ef8b-bcb6-4a39-a2b0-892343298556","Type":"ContainerDied","Data":"eea69623e2655271ba7a26e62ab6fb1925f4fdfccdacb662f3b806b07044c64f"} Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.016882 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77cff4bd67-8jcsj" event={"ID":"8e74ef8b-bcb6-4a39-a2b0-892343298556","Type":"ContainerDied","Data":"5de73962b14cd5e1ab06509d4bf686d033dd3b86cdfa73d7436a73f267cd9a17"} Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.016901 4779 scope.go:117] "RemoveContainer" containerID="eea69623e2655271ba7a26e62ab6fb1925f4fdfccdacb662f3b806b07044c64f" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.020160 4779 generic.go:334] "Generic (PLEG): container finished" podID="1490252c-df79-4687-a24f-9905c70efa9f" 
containerID="b0b0e8010f6fe3617196dc034766693ef41ab57da0e4e702b8cbde157ad7ce47" exitCode=0 Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.020202 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-59f47cbcfc-wr8cq" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.020205 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59f47cbcfc-wr8cq" event={"ID":"1490252c-df79-4687-a24f-9905c70efa9f","Type":"ContainerDied","Data":"b0b0e8010f6fe3617196dc034766693ef41ab57da0e4e702b8cbde157ad7ce47"} Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.020321 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59f47cbcfc-wr8cq" event={"ID":"1490252c-df79-4687-a24f-9905c70efa9f","Type":"ContainerDied","Data":"acdb199d8c6d32d2fe07762255220815695e762e448d962cbfbdae39f1f88396"} Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.035767 4779 scope.go:117] "RemoveContainer" containerID="eea69623e2655271ba7a26e62ab6fb1925f4fdfccdacb662f3b806b07044c64f" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.037736 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-59f47cbcfc-wr8cq"] Mar 20 15:28:06 crc kubenswrapper[4779]: E0320 15:28:06.038297 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eea69623e2655271ba7a26e62ab6fb1925f4fdfccdacb662f3b806b07044c64f\": container with ID starting with eea69623e2655271ba7a26e62ab6fb1925f4fdfccdacb662f3b806b07044c64f not found: ID does not exist" containerID="eea69623e2655271ba7a26e62ab6fb1925f4fdfccdacb662f3b806b07044c64f" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.038340 4779 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"eea69623e2655271ba7a26e62ab6fb1925f4fdfccdacb662f3b806b07044c64f"} err="failed to get container status \"eea69623e2655271ba7a26e62ab6fb1925f4fdfccdacb662f3b806b07044c64f\": rpc error: code = NotFound desc = could not find container \"eea69623e2655271ba7a26e62ab6fb1925f4fdfccdacb662f3b806b07044c64f\": container with ID starting with eea69623e2655271ba7a26e62ab6fb1925f4fdfccdacb662f3b806b07044c64f not found: ID does not exist" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.038362 4779 scope.go:117] "RemoveContainer" containerID="b0b0e8010f6fe3617196dc034766693ef41ab57da0e4e702b8cbde157ad7ce47" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.040595 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-59f47cbcfc-wr8cq"] Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.048925 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77cff4bd67-8jcsj"] Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.052948 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77cff4bd67-8jcsj"] Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.056270 4779 scope.go:117] "RemoveContainer" containerID="b0b0e8010f6fe3617196dc034766693ef41ab57da0e4e702b8cbde157ad7ce47" Mar 20 15:28:06 crc kubenswrapper[4779]: E0320 15:28:06.056640 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0b0e8010f6fe3617196dc034766693ef41ab57da0e4e702b8cbde157ad7ce47\": container with ID starting with b0b0e8010f6fe3617196dc034766693ef41ab57da0e4e702b8cbde157ad7ce47 not found: ID does not exist" containerID="b0b0e8010f6fe3617196dc034766693ef41ab57da0e4e702b8cbde157ad7ce47" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.056674 4779 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"b0b0e8010f6fe3617196dc034766693ef41ab57da0e4e702b8cbde157ad7ce47"} err="failed to get container status \"b0b0e8010f6fe3617196dc034766693ef41ab57da0e4e702b8cbde157ad7ce47\": rpc error: code = NotFound desc = could not find container \"b0b0e8010f6fe3617196dc034766693ef41ab57da0e4e702b8cbde157ad7ce47\": container with ID starting with b0b0e8010f6fe3617196dc034766693ef41ab57da0e4e702b8cbde157ad7ce47 not found: ID does not exist" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.449161 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-857b95fc8-rg8x4"] Mar 20 15:28:06 crc kubenswrapper[4779]: E0320 15:28:06.449370 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e74ef8b-bcb6-4a39-a2b0-892343298556" containerName="route-controller-manager" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.449381 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e74ef8b-bcb6-4a39-a2b0-892343298556" containerName="route-controller-manager" Mar 20 15:28:06 crc kubenswrapper[4779]: E0320 15:28:06.449389 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1490252c-df79-4687-a24f-9905c70efa9f" containerName="controller-manager" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.449394 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="1490252c-df79-4687-a24f-9905c70efa9f" containerName="controller-manager" Mar 20 15:28:06 crc kubenswrapper[4779]: E0320 15:28:06.449406 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d55c02de-f3fa-44bf-94c7-8879acc79040" containerName="oc" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.449413 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="d55c02de-f3fa-44bf-94c7-8879acc79040" containerName="oc" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.449503 4779 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d55c02de-f3fa-44bf-94c7-8879acc79040" containerName="oc" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.449511 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="1490252c-df79-4687-a24f-9905c70efa9f" containerName="controller-manager" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.449520 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e74ef8b-bcb6-4a39-a2b0-892343298556" containerName="route-controller-manager" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.449905 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-857b95fc8-rg8x4" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.451477 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.451654 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.451824 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.451845 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.451938 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.451939 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.453220 4779 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-controller-manager/controller-manager-58b48768f9-ldxwl"] Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.453889 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-58b48768f9-ldxwl" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.458152 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.458226 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.458276 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.458361 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.458284 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.458331 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.467017 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.468512 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-857b95fc8-rg8x4"] Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.483741 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-58b48768f9-ldxwl"] Mar 20 15:28:06 crc 
kubenswrapper[4779]: I0320 15:28:06.547660 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf2zc\" (UniqueName: \"kubernetes.io/projected/82727253-41e6-49e0-8175-6af707b258e5-kube-api-access-hf2zc\") pod \"controller-manager-58b48768f9-ldxwl\" (UID: \"82727253-41e6-49e0-8175-6af707b258e5\") " pod="openshift-controller-manager/controller-manager-58b48768f9-ldxwl" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.547721 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/424265a9-84c6-4064-ae0a-964f4a311e86-client-ca\") pod \"route-controller-manager-857b95fc8-rg8x4\" (UID: \"424265a9-84c6-4064-ae0a-964f4a311e86\") " pod="openshift-route-controller-manager/route-controller-manager-857b95fc8-rg8x4" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.547744 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/82727253-41e6-49e0-8175-6af707b258e5-proxy-ca-bundles\") pod \"controller-manager-58b48768f9-ldxwl\" (UID: \"82727253-41e6-49e0-8175-6af707b258e5\") " pod="openshift-controller-manager/controller-manager-58b48768f9-ldxwl" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.547768 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/82727253-41e6-49e0-8175-6af707b258e5-client-ca\") pod \"controller-manager-58b48768f9-ldxwl\" (UID: \"82727253-41e6-49e0-8175-6af707b258e5\") " pod="openshift-controller-manager/controller-manager-58b48768f9-ldxwl" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.547946 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/424265a9-84c6-4064-ae0a-964f4a311e86-serving-cert\") pod \"route-controller-manager-857b95fc8-rg8x4\" (UID: \"424265a9-84c6-4064-ae0a-964f4a311e86\") " pod="openshift-route-controller-manager/route-controller-manager-857b95fc8-rg8x4" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.548030 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvj7j\" (UniqueName: \"kubernetes.io/projected/424265a9-84c6-4064-ae0a-964f4a311e86-kube-api-access-qvj7j\") pod \"route-controller-manager-857b95fc8-rg8x4\" (UID: \"424265a9-84c6-4064-ae0a-964f4a311e86\") " pod="openshift-route-controller-manager/route-controller-manager-857b95fc8-rg8x4" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.548130 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82727253-41e6-49e0-8175-6af707b258e5-serving-cert\") pod \"controller-manager-58b48768f9-ldxwl\" (UID: \"82727253-41e6-49e0-8175-6af707b258e5\") " pod="openshift-controller-manager/controller-manager-58b48768f9-ldxwl" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.548159 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/424265a9-84c6-4064-ae0a-964f4a311e86-config\") pod \"route-controller-manager-857b95fc8-rg8x4\" (UID: \"424265a9-84c6-4064-ae0a-964f4a311e86\") " pod="openshift-route-controller-manager/route-controller-manager-857b95fc8-rg8x4" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.548190 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82727253-41e6-49e0-8175-6af707b258e5-config\") pod \"controller-manager-58b48768f9-ldxwl\" (UID: \"82727253-41e6-49e0-8175-6af707b258e5\") " 
pod="openshift-controller-manager/controller-manager-58b48768f9-ldxwl" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.649628 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82727253-41e6-49e0-8175-6af707b258e5-serving-cert\") pod \"controller-manager-58b48768f9-ldxwl\" (UID: \"82727253-41e6-49e0-8175-6af707b258e5\") " pod="openshift-controller-manager/controller-manager-58b48768f9-ldxwl" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.649690 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/424265a9-84c6-4064-ae0a-964f4a311e86-config\") pod \"route-controller-manager-857b95fc8-rg8x4\" (UID: \"424265a9-84c6-4064-ae0a-964f4a311e86\") " pod="openshift-route-controller-manager/route-controller-manager-857b95fc8-rg8x4" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.649714 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82727253-41e6-49e0-8175-6af707b258e5-config\") pod \"controller-manager-58b48768f9-ldxwl\" (UID: \"82727253-41e6-49e0-8175-6af707b258e5\") " pod="openshift-controller-manager/controller-manager-58b48768f9-ldxwl" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.649787 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf2zc\" (UniqueName: \"kubernetes.io/projected/82727253-41e6-49e0-8175-6af707b258e5-kube-api-access-hf2zc\") pod \"controller-manager-58b48768f9-ldxwl\" (UID: \"82727253-41e6-49e0-8175-6af707b258e5\") " pod="openshift-controller-manager/controller-manager-58b48768f9-ldxwl" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.649812 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/424265a9-84c6-4064-ae0a-964f4a311e86-client-ca\") 
pod \"route-controller-manager-857b95fc8-rg8x4\" (UID: \"424265a9-84c6-4064-ae0a-964f4a311e86\") " pod="openshift-route-controller-manager/route-controller-manager-857b95fc8-rg8x4" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.649844 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/82727253-41e6-49e0-8175-6af707b258e5-proxy-ca-bundles\") pod \"controller-manager-58b48768f9-ldxwl\" (UID: \"82727253-41e6-49e0-8175-6af707b258e5\") " pod="openshift-controller-manager/controller-manager-58b48768f9-ldxwl" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.649865 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/82727253-41e6-49e0-8175-6af707b258e5-client-ca\") pod \"controller-manager-58b48768f9-ldxwl\" (UID: \"82727253-41e6-49e0-8175-6af707b258e5\") " pod="openshift-controller-manager/controller-manager-58b48768f9-ldxwl" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.649912 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/424265a9-84c6-4064-ae0a-964f4a311e86-serving-cert\") pod \"route-controller-manager-857b95fc8-rg8x4\" (UID: \"424265a9-84c6-4064-ae0a-964f4a311e86\") " pod="openshift-route-controller-manager/route-controller-manager-857b95fc8-rg8x4" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.649943 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvj7j\" (UniqueName: \"kubernetes.io/projected/424265a9-84c6-4064-ae0a-964f4a311e86-kube-api-access-qvj7j\") pod \"route-controller-manager-857b95fc8-rg8x4\" (UID: \"424265a9-84c6-4064-ae0a-964f4a311e86\") " pod="openshift-route-controller-manager/route-controller-manager-857b95fc8-rg8x4" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.650980 4779 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/82727253-41e6-49e0-8175-6af707b258e5-client-ca\") pod \"controller-manager-58b48768f9-ldxwl\" (UID: \"82727253-41e6-49e0-8175-6af707b258e5\") " pod="openshift-controller-manager/controller-manager-58b48768f9-ldxwl" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.650994 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/424265a9-84c6-4064-ae0a-964f4a311e86-client-ca\") pod \"route-controller-manager-857b95fc8-rg8x4\" (UID: \"424265a9-84c6-4064-ae0a-964f4a311e86\") " pod="openshift-route-controller-manager/route-controller-manager-857b95fc8-rg8x4" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.651161 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/424265a9-84c6-4064-ae0a-964f4a311e86-config\") pod \"route-controller-manager-857b95fc8-rg8x4\" (UID: \"424265a9-84c6-4064-ae0a-964f4a311e86\") " pod="openshift-route-controller-manager/route-controller-manager-857b95fc8-rg8x4" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.651206 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/82727253-41e6-49e0-8175-6af707b258e5-proxy-ca-bundles\") pod \"controller-manager-58b48768f9-ldxwl\" (UID: \"82727253-41e6-49e0-8175-6af707b258e5\") " pod="openshift-controller-manager/controller-manager-58b48768f9-ldxwl" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.652097 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82727253-41e6-49e0-8175-6af707b258e5-config\") pod \"controller-manager-58b48768f9-ldxwl\" (UID: \"82727253-41e6-49e0-8175-6af707b258e5\") " pod="openshift-controller-manager/controller-manager-58b48768f9-ldxwl" Mar 20 15:28:06 crc kubenswrapper[4779]: 
I0320 15:28:06.655945 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/424265a9-84c6-4064-ae0a-964f4a311e86-serving-cert\") pod \"route-controller-manager-857b95fc8-rg8x4\" (UID: \"424265a9-84c6-4064-ae0a-964f4a311e86\") " pod="openshift-route-controller-manager/route-controller-manager-857b95fc8-rg8x4" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.659700 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82727253-41e6-49e0-8175-6af707b258e5-serving-cert\") pod \"controller-manager-58b48768f9-ldxwl\" (UID: \"82727253-41e6-49e0-8175-6af707b258e5\") " pod="openshift-controller-manager/controller-manager-58b48768f9-ldxwl" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.667674 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf2zc\" (UniqueName: \"kubernetes.io/projected/82727253-41e6-49e0-8175-6af707b258e5-kube-api-access-hf2zc\") pod \"controller-manager-58b48768f9-ldxwl\" (UID: \"82727253-41e6-49e0-8175-6af707b258e5\") " pod="openshift-controller-manager/controller-manager-58b48768f9-ldxwl" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.671892 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvj7j\" (UniqueName: \"kubernetes.io/projected/424265a9-84c6-4064-ae0a-964f4a311e86-kube-api-access-qvj7j\") pod \"route-controller-manager-857b95fc8-rg8x4\" (UID: \"424265a9-84c6-4064-ae0a-964f4a311e86\") " pod="openshift-route-controller-manager/route-controller-manager-857b95fc8-rg8x4" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.780462 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-857b95fc8-rg8x4" Mar 20 15:28:06 crc kubenswrapper[4779]: I0320 15:28:06.786671 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-58b48768f9-ldxwl" Mar 20 15:28:07 crc kubenswrapper[4779]: I0320 15:28:07.175655 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-58b48768f9-ldxwl"] Mar 20 15:28:07 crc kubenswrapper[4779]: W0320 15:28:07.178377 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82727253_41e6_49e0_8175_6af707b258e5.slice/crio-1df9e75540141df8907688451df27ddadefa14583daf03c46648f2d71ea63576 WatchSource:0}: Error finding container 1df9e75540141df8907688451df27ddadefa14583daf03c46648f2d71ea63576: Status 404 returned error can't find the container with id 1df9e75540141df8907688451df27ddadefa14583daf03c46648f2d71ea63576 Mar 20 15:28:07 crc kubenswrapper[4779]: I0320 15:28:07.218548 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-857b95fc8-rg8x4"] Mar 20 15:28:07 crc kubenswrapper[4779]: W0320 15:28:07.238154 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod424265a9_84c6_4064_ae0a_964f4a311e86.slice/crio-2f6ac3f20a85e9c8af1c47115ebcc243f2d580dc2f836367598cfba18fd53cd7 WatchSource:0}: Error finding container 2f6ac3f20a85e9c8af1c47115ebcc243f2d580dc2f836367598cfba18fd53cd7: Status 404 returned error can't find the container with id 2f6ac3f20a85e9c8af1c47115ebcc243f2d580dc2f836367598cfba18fd53cd7 Mar 20 15:28:07 crc kubenswrapper[4779]: I0320 15:28:07.813944 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1490252c-df79-4687-a24f-9905c70efa9f" path="/var/lib/kubelet/pods/1490252c-df79-4687-a24f-9905c70efa9f/volumes" Mar 20 15:28:07 crc kubenswrapper[4779]: I0320 15:28:07.814975 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e74ef8b-bcb6-4a39-a2b0-892343298556" 
path="/var/lib/kubelet/pods/8e74ef8b-bcb6-4a39-a2b0-892343298556/volumes" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.034737 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58b48768f9-ldxwl" event={"ID":"82727253-41e6-49e0-8175-6af707b258e5","Type":"ContainerStarted","Data":"93a60a2390d21fe62839beb7cbb32b3d732d343b13438dd557ab36081141f2aa"} Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.034789 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58b48768f9-ldxwl" event={"ID":"82727253-41e6-49e0-8175-6af707b258e5","Type":"ContainerStarted","Data":"1df9e75540141df8907688451df27ddadefa14583daf03c46648f2d71ea63576"} Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.034986 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-58b48768f9-ldxwl" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.036180 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-857b95fc8-rg8x4" event={"ID":"424265a9-84c6-4064-ae0a-964f4a311e86","Type":"ContainerStarted","Data":"5159f2d932e20c9250e115a28a4494f0468380a3672dec8cbc236eeb79c84f43"} Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.036223 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-857b95fc8-rg8x4" event={"ID":"424265a9-84c6-4064-ae0a-964f4a311e86","Type":"ContainerStarted","Data":"2f6ac3f20a85e9c8af1c47115ebcc243f2d580dc2f836367598cfba18fd53cd7"} Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.036239 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-857b95fc8-rg8x4" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.039803 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-controller-manager/controller-manager-58b48768f9-ldxwl" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.040836 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-857b95fc8-rg8x4" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.054601 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-58b48768f9-ldxwl" podStartSLOduration=4.054581096 podStartE2EDuration="4.054581096s" podCreationTimestamp="2026-03-20 15:28:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:28:08.050071813 +0000 UTC m=+305.012587603" watchObservedRunningTime="2026-03-20 15:28:08.054581096 +0000 UTC m=+305.017096896" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.091752 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-857b95fc8-rg8x4" podStartSLOduration=3.091737725 podStartE2EDuration="3.091737725s" podCreationTimestamp="2026-03-20 15:28:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:28:08.091461447 +0000 UTC m=+305.053977247" watchObservedRunningTime="2026-03-20 15:28:08.091737725 +0000 UTC m=+305.054253525" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.457867 4779 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.458534 4779 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.458713 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.458805 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540" gracePeriod=15 Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.458944 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8" gracePeriod=15 Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.458971 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890" gracePeriod=15 Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.458964 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577" gracePeriod=15 Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.459276 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://5a91684c6a1a65d270c4c57f78c24f285f72f56b52857058a1726329853f3eb5" gracePeriod=15 Mar 20 15:28:08 crc 
kubenswrapper[4779]: I0320 15:28:08.460032 4779 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 15:28:08 crc kubenswrapper[4779]: E0320 15:28:08.460209 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.460222 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 15:28:08 crc kubenswrapper[4779]: E0320 15:28:08.460235 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.460243 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 15:28:08 crc kubenswrapper[4779]: E0320 15:28:08.460252 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.460260 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 15:28:08 crc kubenswrapper[4779]: E0320 15:28:08.460272 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.460279 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 15:28:08 crc kubenswrapper[4779]: E0320 15:28:08.460288 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 15:28:08 crc 
kubenswrapper[4779]: I0320 15:28:08.460295 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 15:28:08 crc kubenswrapper[4779]: E0320 15:28:08.460306 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.460313 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 15:28:08 crc kubenswrapper[4779]: E0320 15:28:08.460321 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.460328 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 15:28:08 crc kubenswrapper[4779]: E0320 15:28:08.460337 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.460380 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 15:28:08 crc kubenswrapper[4779]: E0320 15:28:08.460392 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.460400 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.460514 4779 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.460530 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.460542 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.460551 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.460560 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.460570 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.460583 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 15:28:08 crc kubenswrapper[4779]: E0320 15:28:08.460705 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.460716 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.460832 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" 
Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.460843 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.476008 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.477079 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.478249 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.478292 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.478316 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.478448 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.478507 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.478548 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.579605 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.579977 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" 
(UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.579979 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.579816 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.580428 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.580532 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.580633 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.580639 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.580712 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.580768 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.580818 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.580858 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.580925 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.580964 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.580993 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.581181 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 15:28:08 crc kubenswrapper[4779]: E0320 15:28:08.952823 4779 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": 
dial tcp 38.102.83.219:6443: connect: connection refused" Mar 20 15:28:08 crc kubenswrapper[4779]: E0320 15:28:08.953596 4779 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.219:6443: connect: connection refused" Mar 20 15:28:08 crc kubenswrapper[4779]: E0320 15:28:08.954036 4779 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.219:6443: connect: connection refused" Mar 20 15:28:08 crc kubenswrapper[4779]: E0320 15:28:08.954678 4779 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.219:6443: connect: connection refused" Mar 20 15:28:08 crc kubenswrapper[4779]: E0320 15:28:08.955073 4779 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.219:6443: connect: connection refused" Mar 20 15:28:08 crc kubenswrapper[4779]: I0320 15:28:08.955120 4779 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 20 15:28:08 crc kubenswrapper[4779]: E0320 15:28:08.955366 4779 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.219:6443: connect: connection refused" interval="200ms" Mar 20 15:28:09 crc kubenswrapper[4779]: I0320 15:28:09.045008 4779 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 15:28:09 crc kubenswrapper[4779]: I0320 15:28:09.047362 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 15:28:09 crc kubenswrapper[4779]: I0320 15:28:09.048096 4779 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5a91684c6a1a65d270c4c57f78c24f285f72f56b52857058a1726329853f3eb5" exitCode=0 Mar 20 15:28:09 crc kubenswrapper[4779]: I0320 15:28:09.048146 4779 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577" exitCode=0 Mar 20 15:28:09 crc kubenswrapper[4779]: I0320 15:28:09.048158 4779 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8" exitCode=0 Mar 20 15:28:09 crc kubenswrapper[4779]: I0320 15:28:09.048166 4779 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890" exitCode=2 Mar 20 15:28:09 crc kubenswrapper[4779]: I0320 15:28:09.048226 4779 scope.go:117] "RemoveContainer" containerID="8a78b2fb29042e4c5d0a232f129e39d29dd98c0cf9140c7770b33d6db7a60fe9" Mar 20 15:28:09 crc kubenswrapper[4779]: I0320 15:28:09.050227 4779 generic.go:334] "Generic (PLEG): container finished" podID="2dc621f5-0ad3-4351-bd37-082c3846b156" containerID="4a9ae52f372c15e8e6f139c1707e2f2b6452a2cbb7ae24615481947dcf88952f" exitCode=0 Mar 20 15:28:09 crc kubenswrapper[4779]: I0320 15:28:09.050274 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"2dc621f5-0ad3-4351-bd37-082c3846b156","Type":"ContainerDied","Data":"4a9ae52f372c15e8e6f139c1707e2f2b6452a2cbb7ae24615481947dcf88952f"} Mar 20 15:28:09 crc kubenswrapper[4779]: I0320 15:28:09.051150 4779 status_manager.go:851] "Failed to get status for pod" podUID="2dc621f5-0ad3-4351-bd37-082c3846b156" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.219:6443: connect: connection refused" Mar 20 15:28:09 crc kubenswrapper[4779]: I0320 15:28:09.051584 4779 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.219:6443: connect: connection refused" Mar 20 15:28:09 crc kubenswrapper[4779]: E0320 15:28:09.156168 4779 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.219:6443: connect: connection refused" interval="400ms" Mar 20 15:28:09 crc kubenswrapper[4779]: E0320 15:28:09.557583 4779 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.219:6443: connect: connection refused" interval="800ms" Mar 20 15:28:10 crc kubenswrapper[4779]: I0320 15:28:10.059511 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 15:28:10 crc kubenswrapper[4779]: E0320 15:28:10.359178 4779 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.219:6443: connect: connection refused" interval="1.6s" Mar 20 15:28:10 crc kubenswrapper[4779]: I0320 15:28:10.402093 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 15:28:10 crc kubenswrapper[4779]: I0320 15:28:10.402851 4779 status_manager.go:851] "Failed to get status for pod" podUID="2dc621f5-0ad3-4351-bd37-082c3846b156" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.219:6443: connect: connection refused" Mar 20 15:28:10 crc kubenswrapper[4779]: I0320 15:28:10.602855 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2dc621f5-0ad3-4351-bd37-082c3846b156-kube-api-access\") pod \"2dc621f5-0ad3-4351-bd37-082c3846b156\" (UID: \"2dc621f5-0ad3-4351-bd37-082c3846b156\") " Mar 20 15:28:10 crc kubenswrapper[4779]: I0320 15:28:10.602911 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2dc621f5-0ad3-4351-bd37-082c3846b156-kubelet-dir\") pod \"2dc621f5-0ad3-4351-bd37-082c3846b156\" (UID: \"2dc621f5-0ad3-4351-bd37-082c3846b156\") " Mar 20 15:28:10 crc kubenswrapper[4779]: I0320 15:28:10.603023 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2dc621f5-0ad3-4351-bd37-082c3846b156-var-lock\") pod \"2dc621f5-0ad3-4351-bd37-082c3846b156\" (UID: \"2dc621f5-0ad3-4351-bd37-082c3846b156\") " Mar 20 15:28:10 crc kubenswrapper[4779]: I0320 15:28:10.603139 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/2dc621f5-0ad3-4351-bd37-082c3846b156-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2dc621f5-0ad3-4351-bd37-082c3846b156" (UID: "2dc621f5-0ad3-4351-bd37-082c3846b156"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:28:10 crc kubenswrapper[4779]: I0320 15:28:10.603167 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2dc621f5-0ad3-4351-bd37-082c3846b156-var-lock" (OuterVolumeSpecName: "var-lock") pod "2dc621f5-0ad3-4351-bd37-082c3846b156" (UID: "2dc621f5-0ad3-4351-bd37-082c3846b156"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:28:10 crc kubenswrapper[4779]: I0320 15:28:10.603388 4779 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2dc621f5-0ad3-4351-bd37-082c3846b156-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 15:28:10 crc kubenswrapper[4779]: I0320 15:28:10.603411 4779 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2dc621f5-0ad3-4351-bd37-082c3846b156-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 15:28:10 crc kubenswrapper[4779]: I0320 15:28:10.609340 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dc621f5-0ad3-4351-bd37-082c3846b156-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2dc621f5-0ad3-4351-bd37-082c3846b156" (UID: "2dc621f5-0ad3-4351-bd37-082c3846b156"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:28:10 crc kubenswrapper[4779]: I0320 15:28:10.704654 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2dc621f5-0ad3-4351-bd37-082c3846b156-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 15:28:10 crc kubenswrapper[4779]: I0320 15:28:10.813799 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 15:28:10 crc kubenswrapper[4779]: I0320 15:28:10.814564 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:28:10 crc kubenswrapper[4779]: I0320 15:28:10.815176 4779 status_manager.go:851] "Failed to get status for pod" podUID="2dc621f5-0ad3-4351-bd37-082c3846b156" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.219:6443: connect: connection refused" Mar 20 15:28:10 crc kubenswrapper[4779]: I0320 15:28:10.815638 4779 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.219:6443: connect: connection refused" Mar 20 15:28:10 crc kubenswrapper[4779]: E0320 15:28:10.906949 4779 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.219:6443: connect: connection refused" 
pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" volumeName="registry-storage" Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.008727 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.008794 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.008878 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.008874 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.008889 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.008984 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.009222 4779 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.009238 4779 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.009247 4779 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.068169 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"2dc621f5-0ad3-4351-bd37-082c3846b156","Type":"ContainerDied","Data":"7f6078875623f53ee11f377737a4bcf5843ca13df1ff1298e735300619999602"} Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.068219 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f6078875623f53ee11f377737a4bcf5843ca13df1ff1298e735300619999602" Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.068193 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.070901 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.071607 4779 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540" exitCode=0 Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.071663 4779 scope.go:117] "RemoveContainer" containerID="5a91684c6a1a65d270c4c57f78c24f285f72f56b52857058a1726329853f3eb5" Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.071772 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.088943 4779 scope.go:117] "RemoveContainer" containerID="cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577" Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.088943 4779 status_manager.go:851] "Failed to get status for pod" podUID="2dc621f5-0ad3-4351-bd37-082c3846b156" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.219:6443: connect: connection refused" Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.091360 4779 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.219:6443: connect: connection refused" Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.093267 4779 status_manager.go:851] "Failed to get status 
for pod" podUID="2dc621f5-0ad3-4351-bd37-082c3846b156" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.219:6443: connect: connection refused" Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.094005 4779 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.219:6443: connect: connection refused" Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.099836 4779 scope.go:117] "RemoveContainer" containerID="21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8" Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.110471 4779 scope.go:117] "RemoveContainer" containerID="d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890" Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.122323 4779 scope.go:117] "RemoveContainer" containerID="315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540" Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.136080 4779 scope.go:117] "RemoveContainer" containerID="cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059" Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.150099 4779 scope.go:117] "RemoveContainer" containerID="5a91684c6a1a65d270c4c57f78c24f285f72f56b52857058a1726329853f3eb5" Mar 20 15:28:11 crc kubenswrapper[4779]: E0320 15:28:11.150523 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a91684c6a1a65d270c4c57f78c24f285f72f56b52857058a1726329853f3eb5\": container with ID starting with 5a91684c6a1a65d270c4c57f78c24f285f72f56b52857058a1726329853f3eb5 not found: ID does not exist" 
containerID="5a91684c6a1a65d270c4c57f78c24f285f72f56b52857058a1726329853f3eb5" Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.150560 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a91684c6a1a65d270c4c57f78c24f285f72f56b52857058a1726329853f3eb5"} err="failed to get container status \"5a91684c6a1a65d270c4c57f78c24f285f72f56b52857058a1726329853f3eb5\": rpc error: code = NotFound desc = could not find container \"5a91684c6a1a65d270c4c57f78c24f285f72f56b52857058a1726329853f3eb5\": container with ID starting with 5a91684c6a1a65d270c4c57f78c24f285f72f56b52857058a1726329853f3eb5 not found: ID does not exist" Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.150585 4779 scope.go:117] "RemoveContainer" containerID="cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577" Mar 20 15:28:11 crc kubenswrapper[4779]: E0320 15:28:11.150832 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\": container with ID starting with cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577 not found: ID does not exist" containerID="cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577" Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.150931 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577"} err="failed to get container status \"cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\": rpc error: code = NotFound desc = could not find container \"cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577\": container with ID starting with cda00bc6e01561343640b6d9ee2a3f025339b952712b7070408d6dec6c4d5577 not found: ID does not exist" Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.151070 4779 scope.go:117] 
"RemoveContainer" containerID="21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8" Mar 20 15:28:11 crc kubenswrapper[4779]: E0320 15:28:11.151455 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\": container with ID starting with 21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8 not found: ID does not exist" containerID="21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8" Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.151480 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8"} err="failed to get container status \"21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\": rpc error: code = NotFound desc = could not find container \"21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8\": container with ID starting with 21e850b1ab3720a2916973256bec5f5587ccd0e9c876ad3704433f42b224c5d8 not found: ID does not exist" Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.151493 4779 scope.go:117] "RemoveContainer" containerID="d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890" Mar 20 15:28:11 crc kubenswrapper[4779]: E0320 15:28:11.151683 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\": container with ID starting with d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890 not found: ID does not exist" containerID="d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890" Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.151764 4779 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890"} err="failed to get container status \"d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\": rpc error: code = NotFound desc = could not find container \"d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890\": container with ID starting with d9afa9050abc63b3ffe9c0013fa2e63310536997e0b9fda07d28b92ba3bb3890 not found: ID does not exist" Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.151833 4779 scope.go:117] "RemoveContainer" containerID="315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540" Mar 20 15:28:11 crc kubenswrapper[4779]: E0320 15:28:11.152148 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\": container with ID starting with 315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540 not found: ID does not exist" containerID="315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540" Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.152222 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540"} err="failed to get container status \"315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\": rpc error: code = NotFound desc = could not find container \"315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540\": container with ID starting with 315bacd467728dfe637c99a292b0872c248a15551ecdcaca190e2ce531469540 not found: ID does not exist" Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.152284 4779 scope.go:117] "RemoveContainer" containerID="cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059" Mar 20 15:28:11 crc kubenswrapper[4779]: E0320 15:28:11.152538 4779 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\": container with ID starting with cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059 not found: ID does not exist" containerID="cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059" Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.152563 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059"} err="failed to get container status \"cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\": rpc error: code = NotFound desc = could not find container \"cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059\": container with ID starting with cb8e70d2cdf35565c7aea36e4031204e37116ce14cf4385a0bb480dd49737059 not found: ID does not exist" Mar 20 15:28:11 crc kubenswrapper[4779]: I0320 15:28:11.815479 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 20 15:28:11 crc kubenswrapper[4779]: E0320 15:28:11.960597 4779 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.219:6443: connect: connection refused" interval="3.2s" Mar 20 15:28:13 crc kubenswrapper[4779]: E0320 15:28:13.499282 4779 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.219:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 15:28:13 crc kubenswrapper[4779]: I0320 15:28:13.499698 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 15:28:13 crc kubenswrapper[4779]: E0320 15:28:13.521313 4779 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.219:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e963d5c8118d2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:28:13.519689938 +0000 UTC m=+310.482205758,LastTimestamp:2026-03-20 15:28:13.519689938 +0000 UTC m=+310.482205758,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:28:13 crc kubenswrapper[4779]: I0320 15:28:13.811236 4779 status_manager.go:851] "Failed to get status for pod" podUID="2dc621f5-0ad3-4351-bd37-082c3846b156" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.219:6443: connect: connection refused" Mar 20 15:28:14 crc kubenswrapper[4779]: I0320 15:28:14.092026 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f5273d292f1d71cf565f86ec244d442cfb22c9c6dbb6ed5a57ad3932ddb0a367"} Mar 20 15:28:14 crc kubenswrapper[4779]: I0320 15:28:14.092099 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"8fe360fe9b2f82f2cc0950e07875933aaf8e0b8d17d59a93148966fa4af4194e"} Mar 20 15:28:14 crc kubenswrapper[4779]: E0320 15:28:14.092841 4779 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.219:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 15:28:14 crc kubenswrapper[4779]: I0320 15:28:14.092851 4779 status_manager.go:851] "Failed to get status for pod" podUID="2dc621f5-0ad3-4351-bd37-082c3846b156" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.219:6443: connect: connection refused" Mar 20 15:28:15 crc kubenswrapper[4779]: E0320 15:28:15.162752 4779 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.219:6443: connect: connection refused" interval="6.4s" Mar 20 15:28:17 crc kubenswrapper[4779]: E0320 15:28:17.392425 4779 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.219:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e963d5c8118d2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:28:13.519689938 +0000 UTC m=+310.482205758,LastTimestamp:2026-03-20 15:28:13.519689938 +0000 UTC m=+310.482205758,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 15:28:21 crc kubenswrapper[4779]: E0320 15:28:21.563690 4779 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.219:6443: connect: connection refused" interval="7s" Mar 20 15:28:22 crc kubenswrapper[4779]: I0320 15:28:22.140424 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 15:28:22 crc kubenswrapper[4779]: I0320 15:28:22.141432 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 15:28:22 crc kubenswrapper[4779]: I0320 15:28:22.141489 4779 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="d1693a9d291973037577c1ddbdce083c7aa03b895f7260b45231b1563e1a97d6" exitCode=1 Mar 20 15:28:22 crc kubenswrapper[4779]: I0320 15:28:22.141526 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"d1693a9d291973037577c1ddbdce083c7aa03b895f7260b45231b1563e1a97d6"} Mar 20 15:28:22 crc kubenswrapper[4779]: I0320 15:28:22.142028 4779 scope.go:117] "RemoveContainer" containerID="d1693a9d291973037577c1ddbdce083c7aa03b895f7260b45231b1563e1a97d6" Mar 20 15:28:22 crc kubenswrapper[4779]: I0320 15:28:22.142633 4779 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.219:6443: connect: connection refused" Mar 20 15:28:22 crc kubenswrapper[4779]: I0320 15:28:22.143071 4779 status_manager.go:851] "Failed to get status for pod" podUID="2dc621f5-0ad3-4351-bd37-082c3846b156" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.219:6443: connect: connection refused" Mar 20 15:28:22 crc kubenswrapper[4779]: I0320 15:28:22.808145 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:28:22 crc kubenswrapper[4779]: I0320 15:28:22.809073 4779 status_manager.go:851] "Failed to get status for pod" podUID="2dc621f5-0ad3-4351-bd37-082c3846b156" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.219:6443: connect: connection refused" Mar 20 15:28:22 crc kubenswrapper[4779]: I0320 15:28:22.809418 4779 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.219:6443: connect: connection refused" Mar 20 15:28:22 crc kubenswrapper[4779]: I0320 15:28:22.821169 4779 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2042bcde-ea1f-4477-a4c2-d4f81621a660" Mar 20 15:28:22 crc kubenswrapper[4779]: I0320 15:28:22.821214 4779 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2042bcde-ea1f-4477-a4c2-d4f81621a660" Mar 20 15:28:22 crc kubenswrapper[4779]: E0320 15:28:22.821838 4779 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.219:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:28:22 crc kubenswrapper[4779]: I0320 15:28:22.822594 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:28:23 crc kubenswrapper[4779]: I0320 15:28:23.149808 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 15:28:23 crc kubenswrapper[4779]: I0320 15:28:23.150544 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 15:28:23 crc kubenswrapper[4779]: I0320 15:28:23.150680 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"aaeeeb75d3c3f19a3cbf0c05b325bc8388219984f11372147ba1c033bb6ef1e3"} Mar 20 15:28:23 crc kubenswrapper[4779]: I0320 15:28:23.152273 4779 status_manager.go:851] "Failed to get status for pod" podUID="2dc621f5-0ad3-4351-bd37-082c3846b156" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.219:6443: connect: connection refused" Mar 20 15:28:23 crc kubenswrapper[4779]: I0320 15:28:23.152993 4779 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.219:6443: connect: connection refused" Mar 20 15:28:23 crc kubenswrapper[4779]: I0320 15:28:23.153889 4779 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="08aad68534672f7b5b017133c592b786406ebabd71696836b16f50d7f82b5023" exitCode=0 Mar 20 15:28:23 crc 
kubenswrapper[4779]: I0320 15:28:23.153945 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"08aad68534672f7b5b017133c592b786406ebabd71696836b16f50d7f82b5023"} Mar 20 15:28:23 crc kubenswrapper[4779]: I0320 15:28:23.154009 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d5897ad550ed5fe4958fc0273e30254329e6cbc84628823a825f3b2e7a7978fe"} Mar 20 15:28:23 crc kubenswrapper[4779]: I0320 15:28:23.154431 4779 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2042bcde-ea1f-4477-a4c2-d4f81621a660" Mar 20 15:28:23 crc kubenswrapper[4779]: I0320 15:28:23.154455 4779 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2042bcde-ea1f-4477-a4c2-d4f81621a660" Mar 20 15:28:23 crc kubenswrapper[4779]: I0320 15:28:23.154833 4779 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.219:6443: connect: connection refused" Mar 20 15:28:23 crc kubenswrapper[4779]: E0320 15:28:23.154880 4779 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.219:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:28:23 crc kubenswrapper[4779]: I0320 15:28:23.155318 4779 status_manager.go:851] "Failed to get status for pod" podUID="2dc621f5-0ad3-4351-bd37-082c3846b156" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.219:6443: connect: connection refused" Mar 20 15:28:23 crc kubenswrapper[4779]: I0320 15:28:23.980056 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 15:28:23 crc kubenswrapper[4779]: I0320 15:28:23.980312 4779 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 20 15:28:23 crc kubenswrapper[4779]: I0320 15:28:23.980952 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 20 15:28:24 crc kubenswrapper[4779]: I0320 15:28:24.177578 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1e35847d3eaabebd416d2ba7806a0c28e4ae656c36b616e48ed9b9397b443954"} Mar 20 15:28:24 crc kubenswrapper[4779]: I0320 15:28:24.177877 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5c4c50f455e569f9915b7d41dbab272ddbc70461d127801261efea553c65feca"} Mar 20 15:28:24 crc kubenswrapper[4779]: I0320 15:28:24.177975 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"13a012fa8184df55bd225734dc440600c192eb52c0abf3d7d4eb33a186a35993"} Mar 20 15:28:24 crc kubenswrapper[4779]: I0320 15:28:24.178056 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"18bb8843a3370d289dcb1177315cc5ea801abd4832741dc46e7e4e25846277c7"} Mar 20 15:28:25 crc kubenswrapper[4779]: I0320 15:28:25.186332 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5d435b8a6a9490187d67359d9562af19aee3786f49257ff90efb31bdf0677bd5"} Mar 20 15:28:25 crc kubenswrapper[4779]: I0320 15:28:25.186637 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:28:25 crc kubenswrapper[4779]: I0320 15:28:25.186549 4779 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2042bcde-ea1f-4477-a4c2-d4f81621a660" Mar 20 15:28:25 crc kubenswrapper[4779]: I0320 15:28:25.186659 4779 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2042bcde-ea1f-4477-a4c2-d4f81621a660" Mar 20 15:28:27 crc kubenswrapper[4779]: I0320 15:28:27.823561 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:28:27 crc kubenswrapper[4779]: I0320 15:28:27.824527 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:28:27 crc kubenswrapper[4779]: I0320 15:28:27.829027 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:28:28 crc kubenswrapper[4779]: I0320 15:28:28.436178 
4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 15:28:30 crc kubenswrapper[4779]: I0320 15:28:30.198180 4779 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:28:31 crc kubenswrapper[4779]: I0320 15:28:31.213685 4779 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2042bcde-ea1f-4477-a4c2-d4f81621a660" Mar 20 15:28:31 crc kubenswrapper[4779]: I0320 15:28:31.213716 4779 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2042bcde-ea1f-4477-a4c2-d4f81621a660" Mar 20 15:28:31 crc kubenswrapper[4779]: I0320 15:28:31.217333 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:28:31 crc kubenswrapper[4779]: I0320 15:28:31.220115 4779 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="366542b9-b06a-48b5-8bfd-b0ccf1f01521" Mar 20 15:28:32 crc kubenswrapper[4779]: I0320 15:28:32.217828 4779 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2042bcde-ea1f-4477-a4c2-d4f81621a660" Mar 20 15:28:32 crc kubenswrapper[4779]: I0320 15:28:32.217856 4779 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2042bcde-ea1f-4477-a4c2-d4f81621a660" Mar 20 15:28:33 crc kubenswrapper[4779]: I0320 15:28:33.820099 4779 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="366542b9-b06a-48b5-8bfd-b0ccf1f01521" Mar 20 15:28:33 crc kubenswrapper[4779]: I0320 15:28:33.983359 4779 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 15:28:33 crc kubenswrapper[4779]: I0320 15:28:33.987610 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 15:28:34 crc kubenswrapper[4779]: I0320 15:28:34.911351 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:28:34 crc kubenswrapper[4779]: I0320 15:28:34.911462 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:28:34 crc kubenswrapper[4779]: I0320 15:28:34.911562 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:28:34 crc kubenswrapper[4779]: I0320 15:28:34.913473 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 15:28:34 crc kubenswrapper[4779]: I0320 15:28:34.913654 4779 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-console"/"networking-console-plugin" Mar 20 15:28:34 crc kubenswrapper[4779]: I0320 15:28:34.913654 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 15:28:34 crc kubenswrapper[4779]: I0320 15:28:34.923041 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:28:34 crc kubenswrapper[4779]: I0320 15:28:34.923517 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 15:28:34 crc kubenswrapper[4779]: I0320 15:28:34.927710 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:28:34 crc kubenswrapper[4779]: I0320 15:28:34.930589 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 15:28:34 crc kubenswrapper[4779]: I0320 15:28:34.935424 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:28:35 crc kubenswrapper[4779]: I0320 15:28:35.013011 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44bdd151-2a1e-4f14-a095-81b541307138-metrics-certs\") pod \"network-metrics-daemon-l4gtx\" (UID: \"44bdd151-2a1e-4f14-a095-81b541307138\") " pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:28:35 crc kubenswrapper[4779]: I0320 15:28:35.013295 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:28:35 crc kubenswrapper[4779]: I0320 15:28:35.016623 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 15:28:35 crc kubenswrapper[4779]: I0320 15:28:35.021088 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:28:35 crc kubenswrapper[4779]: I0320 
15:28:35.028681 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44bdd151-2a1e-4f14-a095-81b541307138-metrics-certs\") pod \"network-metrics-daemon-l4gtx\" (UID: \"44bdd151-2a1e-4f14-a095-81b541307138\") " pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:28:35 crc kubenswrapper[4779]: I0320 15:28:35.138495 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 15:28:35 crc kubenswrapper[4779]: I0320 15:28:35.147840 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 15:28:35 crc kubenswrapper[4779]: I0320 15:28:35.155933 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l4gtx" Mar 20 15:28:35 crc kubenswrapper[4779]: I0320 15:28:35.225382 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:28:35 crc kubenswrapper[4779]: W0320 15:28:35.337840 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-60b1506941081f4cd90db52c8682b738f396bd87ea49fb76b0c51def1364a28f WatchSource:0}: Error finding container 60b1506941081f4cd90db52c8682b738f396bd87ea49fb76b0c51def1364a28f: Status 404 returned error can't find the container with id 60b1506941081f4cd90db52c8682b738f396bd87ea49fb76b0c51def1364a28f Mar 20 15:28:35 crc kubenswrapper[4779]: W0320 15:28:35.525755 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-e4ff98cf4f26e3e7b47cc0aacd0cbc288581853340ea3e7261aebf003d9e0bce WatchSource:0}: Error finding container e4ff98cf4f26e3e7b47cc0aacd0cbc288581853340ea3e7261aebf003d9e0bce: Status 404 returned error can't find the container with id e4ff98cf4f26e3e7b47cc0aacd0cbc288581853340ea3e7261aebf003d9e0bce Mar 20 15:28:35 crc kubenswrapper[4779]: W0320 15:28:35.600037 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44bdd151_2a1e_4f14_a095_81b541307138.slice/crio-3fd185637cb62ea0caba8f292c1b9515230c88ef75c35e32afc1ecc8dc6b5904 WatchSource:0}: Error finding container 3fd185637cb62ea0caba8f292c1b9515230c88ef75c35e32afc1ecc8dc6b5904: Status 404 returned error can't find the container with id 3fd185637cb62ea0caba8f292c1b9515230c88ef75c35e32afc1ecc8dc6b5904 Mar 20 15:28:35 crc kubenswrapper[4779]: W0320 15:28:35.670543 4779 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-99a01d3a3223fac745c39599a0af131a9d6835569c27c627b94249a334ce6e55 WatchSource:0}: Error finding container 99a01d3a3223fac745c39599a0af131a9d6835569c27c627b94249a334ce6e55: Status 404 returned error can't find the container with id 99a01d3a3223fac745c39599a0af131a9d6835569c27c627b94249a334ce6e55 Mar 20 15:28:36 crc kubenswrapper[4779]: I0320 15:28:36.245567 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"f083395cf84bfa8e6fb178afb8566b219603c6cb1696df9a3545854c7fb6b7b1"} Mar 20 15:28:36 crc kubenswrapper[4779]: I0320 15:28:36.245969 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"60b1506941081f4cd90db52c8682b738f396bd87ea49fb76b0c51def1364a28f"} Mar 20 15:28:36 crc kubenswrapper[4779]: I0320 15:28:36.248188 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" event={"ID":"44bdd151-2a1e-4f14-a095-81b541307138","Type":"ContainerStarted","Data":"f24b4d49a335caf723845324cbbab21ee5a3e173fb1f99940bd031fe2d61dc5a"} Mar 20 15:28:36 crc kubenswrapper[4779]: I0320 15:28:36.248215 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" event={"ID":"44bdd151-2a1e-4f14-a095-81b541307138","Type":"ContainerStarted","Data":"b95dcf63cdec9bf57b4689941861066f0a3fc685ca95f3b414b38ebf4de768d3"} Mar 20 15:28:36 crc kubenswrapper[4779]: I0320 15:28:36.248227 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l4gtx" 
event={"ID":"44bdd151-2a1e-4f14-a095-81b541307138","Type":"ContainerStarted","Data":"3fd185637cb62ea0caba8f292c1b9515230c88ef75c35e32afc1ecc8dc6b5904"} Mar 20 15:28:36 crc kubenswrapper[4779]: I0320 15:28:36.249548 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"2594aa351defddfa3ad3f64007d90bc89f5ecc4f7a75d3d314b088f41738a4c9"} Mar 20 15:28:36 crc kubenswrapper[4779]: I0320 15:28:36.249573 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"e4ff98cf4f26e3e7b47cc0aacd0cbc288581853340ea3e7261aebf003d9e0bce"} Mar 20 15:28:36 crc kubenswrapper[4779]: I0320 15:28:36.250747 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"2381bbe4065996cddb172f93b4a3dc7687ec54d056f7da84259a9abba2332989"} Mar 20 15:28:36 crc kubenswrapper[4779]: I0320 15:28:36.250773 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"99a01d3a3223fac745c39599a0af131a9d6835569c27c627b94249a334ce6e55"} Mar 20 15:28:36 crc kubenswrapper[4779]: I0320 15:28:36.251119 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:28:37 crc kubenswrapper[4779]: I0320 15:28:37.257844 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Mar 20 15:28:37 crc kubenswrapper[4779]: 
I0320 15:28:37.258159 4779 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="2594aa351defddfa3ad3f64007d90bc89f5ecc4f7a75d3d314b088f41738a4c9" exitCode=255 Mar 20 15:28:37 crc kubenswrapper[4779]: I0320 15:28:37.258307 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"2594aa351defddfa3ad3f64007d90bc89f5ecc4f7a75d3d314b088f41738a4c9"} Mar 20 15:28:37 crc kubenswrapper[4779]: I0320 15:28:37.259021 4779 scope.go:117] "RemoveContainer" containerID="2594aa351defddfa3ad3f64007d90bc89f5ecc4f7a75d3d314b088f41738a4c9" Mar 20 15:28:38 crc kubenswrapper[4779]: I0320 15:28:38.264833 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 20 15:28:38 crc kubenswrapper[4779]: I0320 15:28:38.265689 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Mar 20 15:28:38 crc kubenswrapper[4779]: I0320 15:28:38.266059 4779 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="5891d7cdf7690712e20ee975ac09a7ba6d65902da018bb82fa30eeb5bb3ba4a9" exitCode=255 Mar 20 15:28:38 crc kubenswrapper[4779]: I0320 15:28:38.266138 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"5891d7cdf7690712e20ee975ac09a7ba6d65902da018bb82fa30eeb5bb3ba4a9"} Mar 20 15:28:38 crc kubenswrapper[4779]: I0320 15:28:38.266184 4779 scope.go:117] "RemoveContainer" 
containerID="2594aa351defddfa3ad3f64007d90bc89f5ecc4f7a75d3d314b088f41738a4c9" Mar 20 15:28:38 crc kubenswrapper[4779]: I0320 15:28:38.266898 4779 scope.go:117] "RemoveContainer" containerID="5891d7cdf7690712e20ee975ac09a7ba6d65902da018bb82fa30eeb5bb3ba4a9" Mar 20 15:28:38 crc kubenswrapper[4779]: E0320 15:28:38.267203 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 15:28:39 crc kubenswrapper[4779]: I0320 15:28:39.275822 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 20 15:28:39 crc kubenswrapper[4779]: I0320 15:28:39.800036 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 15:28:40 crc kubenswrapper[4779]: I0320 15:28:40.098807 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 15:28:40 crc kubenswrapper[4779]: I0320 15:28:40.584143 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 15:28:41 crc kubenswrapper[4779]: I0320 15:28:41.025596 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 15:28:41 crc kubenswrapper[4779]: I0320 15:28:41.195621 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 15:28:41 crc kubenswrapper[4779]: 
I0320 15:28:41.250996 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 15:28:41 crc kubenswrapper[4779]: I0320 15:28:41.504289 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 15:28:41 crc kubenswrapper[4779]: I0320 15:28:41.614427 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 15:28:41 crc kubenswrapper[4779]: I0320 15:28:41.882889 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 15:28:42 crc kubenswrapper[4779]: I0320 15:28:42.055364 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 15:28:42 crc kubenswrapper[4779]: I0320 15:28:42.128353 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 15:28:42 crc kubenswrapper[4779]: I0320 15:28:42.232698 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 15:28:42 crc kubenswrapper[4779]: I0320 15:28:42.288577 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 15:28:42 crc kubenswrapper[4779]: I0320 15:28:42.562972 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 15:28:42 crc kubenswrapper[4779]: I0320 15:28:42.584783 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 15:28:42 crc kubenswrapper[4779]: I0320 15:28:42.688020 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 15:28:42 crc 
kubenswrapper[4779]: I0320 15:28:42.771044 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 15:28:42 crc kubenswrapper[4779]: I0320 15:28:42.781694 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 15:28:42 crc kubenswrapper[4779]: I0320 15:28:42.819700 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 15:28:42 crc kubenswrapper[4779]: I0320 15:28:42.880528 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 20 15:28:42 crc kubenswrapper[4779]: I0320 15:28:42.933738 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 15:28:43 crc kubenswrapper[4779]: I0320 15:28:43.040434 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 15:28:43 crc kubenswrapper[4779]: I0320 15:28:43.101992 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 15:28:43 crc kubenswrapper[4779]: I0320 15:28:43.145552 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 15:28:43 crc kubenswrapper[4779]: I0320 15:28:43.197136 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 15:28:43 crc kubenswrapper[4779]: I0320 15:28:43.252163 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 15:28:43 crc kubenswrapper[4779]: I0320 15:28:43.384040 4779 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 15:28:43 crc kubenswrapper[4779]: I0320 15:28:43.420316 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 15:28:43 crc kubenswrapper[4779]: I0320 15:28:43.447532 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 15:28:43 crc kubenswrapper[4779]: I0320 15:28:43.527631 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 20 15:28:43 crc kubenswrapper[4779]: I0320 15:28:43.601100 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 15:28:43 crc kubenswrapper[4779]: I0320 15:28:43.628809 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 15:28:43 crc kubenswrapper[4779]: I0320 15:28:43.676626 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 15:28:43 crc kubenswrapper[4779]: I0320 15:28:43.730975 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 20 15:28:43 crc kubenswrapper[4779]: I0320 15:28:43.806505 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 15:28:43 crc kubenswrapper[4779]: I0320 15:28:43.823971 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 15:28:43 crc kubenswrapper[4779]: I0320 15:28:43.843654 4779 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 15:28:43 crc kubenswrapper[4779]: I0320 15:28:43.870276 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 15:28:43 crc kubenswrapper[4779]: I0320 15:28:43.893745 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 15:28:43 crc kubenswrapper[4779]: I0320 15:28:43.907003 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 15:28:43 crc kubenswrapper[4779]: I0320 15:28:43.958882 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 15:28:44 crc kubenswrapper[4779]: I0320 15:28:44.068591 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 15:28:44 crc kubenswrapper[4779]: I0320 15:28:44.082670 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 15:28:44 crc kubenswrapper[4779]: I0320 15:28:44.117771 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 15:28:44 crc kubenswrapper[4779]: I0320 15:28:44.169772 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 15:28:44 crc kubenswrapper[4779]: I0320 15:28:44.232831 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 15:28:44 crc kubenswrapper[4779]: I0320 15:28:44.259815 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 15:28:44 crc kubenswrapper[4779]: I0320 15:28:44.291096 
4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 15:28:44 crc kubenswrapper[4779]: I0320 15:28:44.319024 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 15:28:44 crc kubenswrapper[4779]: I0320 15:28:44.359249 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 15:28:44 crc kubenswrapper[4779]: I0320 15:28:44.384317 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 15:28:44 crc kubenswrapper[4779]: I0320 15:28:44.417628 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 15:28:44 crc kubenswrapper[4779]: I0320 15:28:44.586443 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 15:28:44 crc kubenswrapper[4779]: I0320 15:28:44.739170 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 15:28:44 crc kubenswrapper[4779]: I0320 15:28:44.960153 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 15:28:44 crc kubenswrapper[4779]: I0320 15:28:44.974887 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 15:28:45 crc kubenswrapper[4779]: I0320 15:28:45.013501 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 15:28:45 crc kubenswrapper[4779]: I0320 15:28:45.046078 4779 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 15:28:45 crc kubenswrapper[4779]: I0320 15:28:45.066873 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 15:28:45 crc kubenswrapper[4779]: I0320 15:28:45.316428 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 15:28:45 crc kubenswrapper[4779]: I0320 15:28:45.379924 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 15:28:45 crc kubenswrapper[4779]: I0320 15:28:45.388196 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 15:28:45 crc kubenswrapper[4779]: I0320 15:28:45.505319 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 15:28:45 crc kubenswrapper[4779]: I0320 15:28:45.541483 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 15:28:45 crc kubenswrapper[4779]: I0320 15:28:45.556648 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 15:28:45 crc kubenswrapper[4779]: I0320 15:28:45.612176 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 15:28:45 crc kubenswrapper[4779]: I0320 15:28:45.735474 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 15:28:45 crc kubenswrapper[4779]: I0320 15:28:45.752525 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 15:28:45 crc kubenswrapper[4779]: I0320 15:28:45.787789 4779 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 15:28:45 crc kubenswrapper[4779]: I0320 15:28:45.787925 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 15:28:45 crc kubenswrapper[4779]: I0320 15:28:45.860753 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 15:28:46 crc kubenswrapper[4779]: I0320 15:28:46.055168 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 15:28:46 crc kubenswrapper[4779]: I0320 15:28:46.087057 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 15:28:46 crc kubenswrapper[4779]: I0320 15:28:46.130726 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 15:28:46 crc kubenswrapper[4779]: I0320 15:28:46.174055 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 15:28:46 crc kubenswrapper[4779]: I0320 15:28:46.187087 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 15:28:46 crc kubenswrapper[4779]: I0320 15:28:46.369467 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 15:28:46 crc kubenswrapper[4779]: I0320 15:28:46.372909 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 15:28:46 crc kubenswrapper[4779]: I0320 15:28:46.407851 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 15:28:46 crc kubenswrapper[4779]: I0320 
15:28:46.559302 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 20 15:28:46 crc kubenswrapper[4779]: I0320 15:28:46.559501 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 15:28:46 crc kubenswrapper[4779]: I0320 15:28:46.578954 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 15:28:46 crc kubenswrapper[4779]: I0320 15:28:46.645633 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 15:28:46 crc kubenswrapper[4779]: I0320 15:28:46.652992 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 15:28:46 crc kubenswrapper[4779]: I0320 15:28:46.743925 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 15:28:46 crc kubenswrapper[4779]: I0320 15:28:46.884757 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 15:28:47 crc kubenswrapper[4779]: I0320 15:28:47.094668 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 15:28:47 crc kubenswrapper[4779]: I0320 15:28:47.142966 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 15:28:47 crc kubenswrapper[4779]: I0320 15:28:47.165343 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 15:28:47 crc kubenswrapper[4779]: I0320 15:28:47.170092 4779 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 15:28:47 crc kubenswrapper[4779]: I0320 15:28:47.275958 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 15:28:47 crc kubenswrapper[4779]: I0320 15:28:47.287367 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 15:28:47 crc kubenswrapper[4779]: I0320 15:28:47.296405 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 15:28:47 crc kubenswrapper[4779]: I0320 15:28:47.308459 4779 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 15:28:47 crc kubenswrapper[4779]: I0320 15:28:47.380808 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 15:28:47 crc kubenswrapper[4779]: I0320 15:28:47.383450 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 15:28:47 crc kubenswrapper[4779]: I0320 15:28:47.409562 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 15:28:47 crc kubenswrapper[4779]: I0320 15:28:47.446798 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 15:28:47 crc kubenswrapper[4779]: I0320 15:28:47.452795 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 15:28:47 crc kubenswrapper[4779]: I0320 15:28:47.525313 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 15:28:47 crc kubenswrapper[4779]: I0320 15:28:47.534718 4779 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 15:28:47 crc kubenswrapper[4779]: I0320 15:28:47.548851 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 15:28:47 crc kubenswrapper[4779]: I0320 15:28:47.600943 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 15:28:47 crc kubenswrapper[4779]: I0320 15:28:47.601753 4779 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 15:28:47 crc kubenswrapper[4779]: I0320 15:28:47.637589 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 15:28:47 crc kubenswrapper[4779]: I0320 15:28:47.756866 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 15:28:47 crc kubenswrapper[4779]: I0320 15:28:47.780989 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 15:28:47 crc kubenswrapper[4779]: I0320 15:28:47.851438 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 15:28:47 crc kubenswrapper[4779]: I0320 15:28:47.851479 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 15:28:47 crc kubenswrapper[4779]: I0320 15:28:47.999083 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 15:28:48 crc kubenswrapper[4779]: I0320 15:28:48.014176 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 15:28:48 crc 
kubenswrapper[4779]: I0320 15:28:48.208609 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 15:28:48 crc kubenswrapper[4779]: I0320 15:28:48.239986 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 15:28:48 crc kubenswrapper[4779]: I0320 15:28:48.385385 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 15:28:48 crc kubenswrapper[4779]: I0320 15:28:48.387487 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 15:28:48 crc kubenswrapper[4779]: I0320 15:28:48.428256 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 15:28:48 crc kubenswrapper[4779]: I0320 15:28:48.463792 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 20 15:28:48 crc kubenswrapper[4779]: I0320 15:28:48.534143 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 15:28:48 crc kubenswrapper[4779]: I0320 15:28:48.607726 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 15:28:48 crc kubenswrapper[4779]: I0320 15:28:48.686615 4779 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 15:28:48 crc kubenswrapper[4779]: I0320 15:28:48.703295 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 15:28:48 crc kubenswrapper[4779]: I0320 15:28:48.703512 4779 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 15:28:48 crc kubenswrapper[4779]: I0320 15:28:48.728490 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 15:28:48 crc kubenswrapper[4779]: I0320 15:28:48.769035 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 15:28:48 crc kubenswrapper[4779]: I0320 15:28:48.788945 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 15:28:48 crc kubenswrapper[4779]: I0320 15:28:48.789362 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 15:28:48 crc kubenswrapper[4779]: I0320 15:28:48.881453 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 15:28:48 crc kubenswrapper[4779]: I0320 15:28:48.942092 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 15:28:49 crc kubenswrapper[4779]: I0320 15:28:49.026679 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 15:28:49 crc kubenswrapper[4779]: I0320 15:28:49.061352 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 15:28:49 crc kubenswrapper[4779]: I0320 15:28:49.213222 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 15:28:49 crc kubenswrapper[4779]: I0320 15:28:49.233307 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 15:28:49 crc kubenswrapper[4779]: I0320 15:28:49.270298 4779 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 15:28:49 crc kubenswrapper[4779]: I0320 15:28:49.346486 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 15:28:49 crc kubenswrapper[4779]: I0320 15:28:49.433950 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 15:28:49 crc kubenswrapper[4779]: I0320 15:28:49.456821 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 15:28:49 crc kubenswrapper[4779]: I0320 15:28:49.483423 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 15:28:49 crc kubenswrapper[4779]: I0320 15:28:49.501181 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 15:28:49 crc kubenswrapper[4779]: I0320 15:28:49.525477 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 15:28:49 crc kubenswrapper[4779]: I0320 15:28:49.644298 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 15:28:49 crc kubenswrapper[4779]: I0320 15:28:49.647588 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 15:28:49 crc kubenswrapper[4779]: I0320 15:28:49.683551 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 15:28:49 crc kubenswrapper[4779]: I0320 15:28:49.784244 4779 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 15:28:49 crc kubenswrapper[4779]: I0320 15:28:49.843078 4779 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 15:28:49 crc kubenswrapper[4779]: I0320 15:28:49.909920 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 15:28:49 crc kubenswrapper[4779]: I0320 15:28:49.918307 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 15:28:49 crc kubenswrapper[4779]: I0320 15:28:49.973874 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 15:28:50 crc kubenswrapper[4779]: I0320 15:28:50.164561 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 15:28:50 crc kubenswrapper[4779]: I0320 15:28:50.199038 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 15:28:50 crc kubenswrapper[4779]: I0320 15:28:50.242960 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 15:28:50 crc kubenswrapper[4779]: I0320 15:28:50.265952 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 15:28:50 crc kubenswrapper[4779]: I0320 15:28:50.267292 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 15:28:50 crc kubenswrapper[4779]: I0320 15:28:50.289925 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 15:28:50 crc kubenswrapper[4779]: I0320 15:28:50.304212 4779 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 15:28:50 crc kubenswrapper[4779]: I0320 15:28:50.314512 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 15:28:50 crc kubenswrapper[4779]: I0320 15:28:50.352481 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 15:28:50 crc kubenswrapper[4779]: I0320 15:28:50.391909 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 15:28:50 crc kubenswrapper[4779]: I0320 15:28:50.395648 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 15:28:50 crc kubenswrapper[4779]: I0320 15:28:50.504051 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 15:28:50 crc kubenswrapper[4779]: I0320 15:28:50.567521 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 20 15:28:50 crc kubenswrapper[4779]: I0320 15:28:50.686668 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 15:28:50 crc kubenswrapper[4779]: I0320 15:28:50.714798 4779 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 15:28:50 crc kubenswrapper[4779]: I0320 15:28:50.759340 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 15:28:50 crc kubenswrapper[4779]: I0320 15:28:50.832967 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 15:28:50 crc kubenswrapper[4779]: I0320 15:28:50.848132 4779 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 15:28:50 crc kubenswrapper[4779]: I0320 15:28:50.857372 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 15:28:50 crc kubenswrapper[4779]: I0320 15:28:50.926368 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 15:28:50 crc kubenswrapper[4779]: I0320 15:28:50.958398 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 15:28:51 crc kubenswrapper[4779]: I0320 15:28:51.003198 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 15:28:51 crc kubenswrapper[4779]: I0320 15:28:51.055933 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 15:28:51 crc kubenswrapper[4779]: I0320 15:28:51.142000 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 15:28:51 crc kubenswrapper[4779]: I0320 15:28:51.224639 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 15:28:51 crc kubenswrapper[4779]: I0320 15:28:51.274364 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 15:28:51 crc kubenswrapper[4779]: I0320 15:28:51.337636 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 15:28:51 crc kubenswrapper[4779]: I0320 15:28:51.376430 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 20 15:28:51 crc kubenswrapper[4779]: 
I0320 15:28:51.411890 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 15:28:51 crc kubenswrapper[4779]: I0320 15:28:51.502475 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 15:28:51 crc kubenswrapper[4779]: I0320 15:28:51.606759 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 15:28:51 crc kubenswrapper[4779]: I0320 15:28:51.704660 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 15:28:51 crc kubenswrapper[4779]: I0320 15:28:51.724529 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 15:28:51 crc kubenswrapper[4779]: I0320 15:28:51.881931 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 15:28:51 crc kubenswrapper[4779]: I0320 15:28:51.887428 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 20 15:28:51 crc kubenswrapper[4779]: I0320 15:28:51.996387 4779 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 15:28:51 crc kubenswrapper[4779]: I0320 15:28:51.997176 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-l4gtx" podStartSLOduration=302.997155113 podStartE2EDuration="5m2.997155113s" podCreationTimestamp="2026-03-20 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:28:36.276717622 +0000 UTC m=+333.239233422" watchObservedRunningTime="2026-03-20 15:28:51.997155113 +0000 
UTC m=+348.959670933" Mar 20 15:28:52 crc kubenswrapper[4779]: I0320 15:28:52.001005 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 15:28:52 crc kubenswrapper[4779]: I0320 15:28:52.001051 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 15:28:52 crc kubenswrapper[4779]: I0320 15:28:52.001069 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-l4gtx"] Mar 20 15:28:52 crc kubenswrapper[4779]: I0320 15:28:52.005261 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 15:28:52 crc kubenswrapper[4779]: I0320 15:28:52.013524 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 15:28:52 crc kubenswrapper[4779]: I0320 15:28:52.019296 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=22.019276866 podStartE2EDuration="22.019276866s" podCreationTimestamp="2026-03-20 15:28:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:28:52.016165077 +0000 UTC m=+348.978680897" watchObservedRunningTime="2026-03-20 15:28:52.019276866 +0000 UTC m=+348.981792666" Mar 20 15:28:52 crc kubenswrapper[4779]: I0320 15:28:52.158090 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 15:28:52 crc kubenswrapper[4779]: I0320 15:28:52.167624 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 15:28:52 crc kubenswrapper[4779]: I0320 15:28:52.257513 4779 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-certs-default" Mar 20 15:28:52 crc kubenswrapper[4779]: I0320 15:28:52.298149 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 15:28:52 crc kubenswrapper[4779]: I0320 15:28:52.330449 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 15:28:52 crc kubenswrapper[4779]: I0320 15:28:52.368139 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 20 15:28:52 crc kubenswrapper[4779]: I0320 15:28:52.395573 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 15:28:52 crc kubenswrapper[4779]: I0320 15:28:52.444033 4779 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 15:28:52 crc kubenswrapper[4779]: I0320 15:28:52.444319 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://f5273d292f1d71cf565f86ec244d442cfb22c9c6dbb6ed5a57ad3932ddb0a367" gracePeriod=5 Mar 20 15:28:52 crc kubenswrapper[4779]: I0320 15:28:52.545363 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 15:28:52 crc kubenswrapper[4779]: I0320 15:28:52.604664 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 15:28:52 crc kubenswrapper[4779]: I0320 15:28:52.611397 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 15:28:52 crc kubenswrapper[4779]: I0320 15:28:52.612801 4779 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 15:28:52 crc kubenswrapper[4779]: I0320 15:28:52.640885 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 15:28:52 crc kubenswrapper[4779]: I0320 15:28:52.675787 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 15:28:52 crc kubenswrapper[4779]: I0320 15:28:52.684800 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 15:28:52 crc kubenswrapper[4779]: I0320 15:28:52.688942 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 15:28:52 crc kubenswrapper[4779]: I0320 15:28:52.696480 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 15:28:52 crc kubenswrapper[4779]: I0320 15:28:52.755473 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 15:28:52 crc kubenswrapper[4779]: I0320 15:28:52.755937 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 15:28:52 crc kubenswrapper[4779]: I0320 15:28:52.804719 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 15:28:52 crc kubenswrapper[4779]: I0320 15:28:52.908729 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 15:28:52 crc kubenswrapper[4779]: I0320 15:28:52.924991 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 15:28:52 crc 
kubenswrapper[4779]: I0320 15:28:52.940542 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 15:28:52 crc kubenswrapper[4779]: I0320 15:28:52.965884 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 15:28:52 crc kubenswrapper[4779]: I0320 15:28:52.968297 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 15:28:53 crc kubenswrapper[4779]: I0320 15:28:53.004018 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 15:28:53 crc kubenswrapper[4779]: I0320 15:28:53.012165 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 15:28:53 crc kubenswrapper[4779]: I0320 15:28:53.029246 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 15:28:53 crc kubenswrapper[4779]: I0320 15:28:53.166094 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 15:28:53 crc kubenswrapper[4779]: I0320 15:28:53.167304 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 15:28:53 crc kubenswrapper[4779]: I0320 15:28:53.168386 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 15:28:53 crc kubenswrapper[4779]: I0320 15:28:53.194159 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 15:28:53 crc kubenswrapper[4779]: I0320 
15:28:53.267323 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 15:28:53 crc kubenswrapper[4779]: I0320 15:28:53.271680 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 15:28:53 crc kubenswrapper[4779]: I0320 15:28:53.290754 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 15:28:53 crc kubenswrapper[4779]: I0320 15:28:53.770650 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 15:28:53 crc kubenswrapper[4779]: I0320 15:28:53.813530 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 15:28:53 crc kubenswrapper[4779]: I0320 15:28:53.814641 4779 scope.go:117] "RemoveContainer" containerID="5891d7cdf7690712e20ee975ac09a7ba6d65902da018bb82fa30eeb5bb3ba4a9" Mar 20 15:28:53 crc kubenswrapper[4779]: I0320 15:28:53.829582 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 15:28:53 crc kubenswrapper[4779]: I0320 15:28:53.894492 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 15:28:53 crc kubenswrapper[4779]: I0320 15:28:53.901255 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 15:28:54 crc kubenswrapper[4779]: I0320 15:28:54.037142 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 15:28:54 crc kubenswrapper[4779]: I0320 15:28:54.142423 4779 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 15:28:54 crc kubenswrapper[4779]: I0320 15:28:54.225776 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 15:28:54 crc kubenswrapper[4779]: I0320 15:28:54.335741 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 15:28:54 crc kubenswrapper[4779]: I0320 15:28:54.366450 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 20 15:28:54 crc kubenswrapper[4779]: I0320 15:28:54.366500 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"660ef097f830d51a27a576f9d4aa269b1300306f05c3704cda6b4a548b01c7c5"} Mar 20 15:28:54 crc kubenswrapper[4779]: I0320 15:28:54.392773 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 15:28:54 crc kubenswrapper[4779]: I0320 15:28:54.439169 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 15:28:54 crc kubenswrapper[4779]: I0320 15:28:54.448129 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 15:28:54 crc kubenswrapper[4779]: I0320 15:28:54.455250 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 15:28:54 crc kubenswrapper[4779]: I0320 15:28:54.673611 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 15:28:54 crc kubenswrapper[4779]: I0320 15:28:54.788421 
4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 15:28:54 crc kubenswrapper[4779]: I0320 15:28:54.810063 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 15:28:54 crc kubenswrapper[4779]: I0320 15:28:54.837727 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 15:28:54 crc kubenswrapper[4779]: I0320 15:28:54.935456 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 15:28:54 crc kubenswrapper[4779]: I0320 15:28:54.959767 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 15:28:55 crc kubenswrapper[4779]: I0320 15:28:55.053606 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 15:28:55 crc kubenswrapper[4779]: I0320 15:28:55.205592 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 15:28:55 crc kubenswrapper[4779]: I0320 15:28:55.830394 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 15:28:55 crc kubenswrapper[4779]: I0320 15:28:55.832301 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 15:28:56 crc kubenswrapper[4779]: I0320 15:28:56.083048 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 15:28:56 crc kubenswrapper[4779]: I0320 15:28:56.489939 4779 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 15:28:56 crc kubenswrapper[4779]: I0320 15:28:56.525378 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 15:28:56 crc kubenswrapper[4779]: I0320 15:28:56.933409 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 15:28:57 crc kubenswrapper[4779]: I0320 15:28:57.783366 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 15:28:58 crc kubenswrapper[4779]: I0320 15:28:58.011982 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 15:28:58 crc kubenswrapper[4779]: I0320 15:28:58.012056 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 15:28:58 crc kubenswrapper[4779]: I0320 15:28:58.103655 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 15:28:58 crc kubenswrapper[4779]: I0320 15:28:58.103721 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 15:28:58 crc kubenswrapper[4779]: I0320 15:28:58.103741 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 15:28:58 crc kubenswrapper[4779]: I0320 15:28:58.103828 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:28:58 crc kubenswrapper[4779]: I0320 15:28:58.103870 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 15:28:58 crc kubenswrapper[4779]: I0320 15:28:58.103892 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 15:28:58 crc kubenswrapper[4779]: I0320 15:28:58.103874 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:28:58 crc kubenswrapper[4779]: I0320 15:28:58.103913 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:28:58 crc kubenswrapper[4779]: I0320 15:28:58.103996 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:28:58 crc kubenswrapper[4779]: I0320 15:28:58.104330 4779 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 15:28:58 crc kubenswrapper[4779]: I0320 15:28:58.104350 4779 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 15:28:58 crc kubenswrapper[4779]: I0320 15:28:58.104360 4779 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 20 15:28:58 crc kubenswrapper[4779]: I0320 15:28:58.104368 4779 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 20 15:28:58 crc kubenswrapper[4779]: I0320 15:28:58.110841 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:28:58 crc kubenswrapper[4779]: I0320 15:28:58.205131 4779 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 15:28:58 crc kubenswrapper[4779]: I0320 15:28:58.386760 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 15:28:58 crc kubenswrapper[4779]: I0320 15:28:58.386814 4779 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="f5273d292f1d71cf565f86ec244d442cfb22c9c6dbb6ed5a57ad3932ddb0a367" exitCode=137 Mar 20 15:28:58 crc kubenswrapper[4779]: I0320 15:28:58.386862 4779 scope.go:117] "RemoveContainer" containerID="f5273d292f1d71cf565f86ec244d442cfb22c9c6dbb6ed5a57ad3932ddb0a367" Mar 20 15:28:58 crc kubenswrapper[4779]: I0320 15:28:58.386892 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 15:28:58 crc kubenswrapper[4779]: I0320 15:28:58.405045 4779 scope.go:117] "RemoveContainer" containerID="f5273d292f1d71cf565f86ec244d442cfb22c9c6dbb6ed5a57ad3932ddb0a367" Mar 20 15:28:58 crc kubenswrapper[4779]: E0320 15:28:58.405486 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5273d292f1d71cf565f86ec244d442cfb22c9c6dbb6ed5a57ad3932ddb0a367\": container with ID starting with f5273d292f1d71cf565f86ec244d442cfb22c9c6dbb6ed5a57ad3932ddb0a367 not found: ID does not exist" containerID="f5273d292f1d71cf565f86ec244d442cfb22c9c6dbb6ed5a57ad3932ddb0a367" Mar 20 15:28:58 crc kubenswrapper[4779]: I0320 15:28:58.405518 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5273d292f1d71cf565f86ec244d442cfb22c9c6dbb6ed5a57ad3932ddb0a367"} err="failed to get container status \"f5273d292f1d71cf565f86ec244d442cfb22c9c6dbb6ed5a57ad3932ddb0a367\": rpc error: code = NotFound desc = could not find container \"f5273d292f1d71cf565f86ec244d442cfb22c9c6dbb6ed5a57ad3932ddb0a367\": container with ID starting with f5273d292f1d71cf565f86ec244d442cfb22c9c6dbb6ed5a57ad3932ddb0a367 not found: ID does not exist" Mar 20 15:28:59 crc kubenswrapper[4779]: I0320 15:28:59.815486 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 20 15:29:05 crc kubenswrapper[4779]: I0320 15:29:05.234909 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 15:29:40 crc kubenswrapper[4779]: I0320 15:29:40.400884 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vjc2r"] Mar 20 15:29:40 crc kubenswrapper[4779]: E0320 
15:29:40.401642 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dc621f5-0ad3-4351-bd37-082c3846b156" containerName="installer" Mar 20 15:29:40 crc kubenswrapper[4779]: I0320 15:29:40.401654 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dc621f5-0ad3-4351-bd37-082c3846b156" containerName="installer" Mar 20 15:29:40 crc kubenswrapper[4779]: E0320 15:29:40.401665 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 15:29:40 crc kubenswrapper[4779]: I0320 15:29:40.401670 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 15:29:40 crc kubenswrapper[4779]: I0320 15:29:40.401750 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 15:29:40 crc kubenswrapper[4779]: I0320 15:29:40.401762 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dc621f5-0ad3-4351-bd37-082c3846b156" containerName="installer" Mar 20 15:29:40 crc kubenswrapper[4779]: I0320 15:29:40.402096 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-vjc2r" Mar 20 15:29:40 crc kubenswrapper[4779]: I0320 15:29:40.412301 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vjc2r"] Mar 20 15:29:40 crc kubenswrapper[4779]: I0320 15:29:40.575382 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1a73e351-ee6f-472f-8964-ebcb7de3ca15-trusted-ca\") pod \"image-registry-66df7c8f76-vjc2r\" (UID: \"1a73e351-ee6f-472f-8964-ebcb7de3ca15\") " pod="openshift-image-registry/image-registry-66df7c8f76-vjc2r" Mar 20 15:29:40 crc kubenswrapper[4779]: I0320 15:29:40.575801 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-vjc2r\" (UID: \"1a73e351-ee6f-472f-8964-ebcb7de3ca15\") " pod="openshift-image-registry/image-registry-66df7c8f76-vjc2r" Mar 20 15:29:40 crc kubenswrapper[4779]: I0320 15:29:40.575833 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1a73e351-ee6f-472f-8964-ebcb7de3ca15-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vjc2r\" (UID: \"1a73e351-ee6f-472f-8964-ebcb7de3ca15\") " pod="openshift-image-registry/image-registry-66df7c8f76-vjc2r" Mar 20 15:29:40 crc kubenswrapper[4779]: I0320 15:29:40.575860 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1a73e351-ee6f-472f-8964-ebcb7de3ca15-bound-sa-token\") pod \"image-registry-66df7c8f76-vjc2r\" (UID: \"1a73e351-ee6f-472f-8964-ebcb7de3ca15\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-vjc2r" Mar 20 15:29:40 crc kubenswrapper[4779]: I0320 15:29:40.575911 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1a73e351-ee6f-472f-8964-ebcb7de3ca15-registry-tls\") pod \"image-registry-66df7c8f76-vjc2r\" (UID: \"1a73e351-ee6f-472f-8964-ebcb7de3ca15\") " pod="openshift-image-registry/image-registry-66df7c8f76-vjc2r" Mar 20 15:29:40 crc kubenswrapper[4779]: I0320 15:29:40.575933 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1a73e351-ee6f-472f-8964-ebcb7de3ca15-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vjc2r\" (UID: \"1a73e351-ee6f-472f-8964-ebcb7de3ca15\") " pod="openshift-image-registry/image-registry-66df7c8f76-vjc2r" Mar 20 15:29:40 crc kubenswrapper[4779]: I0320 15:29:40.575976 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5898\" (UniqueName: \"kubernetes.io/projected/1a73e351-ee6f-472f-8964-ebcb7de3ca15-kube-api-access-l5898\") pod \"image-registry-66df7c8f76-vjc2r\" (UID: \"1a73e351-ee6f-472f-8964-ebcb7de3ca15\") " pod="openshift-image-registry/image-registry-66df7c8f76-vjc2r" Mar 20 15:29:40 crc kubenswrapper[4779]: I0320 15:29:40.575998 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1a73e351-ee6f-472f-8964-ebcb7de3ca15-registry-certificates\") pod \"image-registry-66df7c8f76-vjc2r\" (UID: \"1a73e351-ee6f-472f-8964-ebcb7de3ca15\") " pod="openshift-image-registry/image-registry-66df7c8f76-vjc2r" Mar 20 15:29:40 crc kubenswrapper[4779]: I0320 15:29:40.609619 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-vjc2r\" (UID: \"1a73e351-ee6f-472f-8964-ebcb7de3ca15\") " pod="openshift-image-registry/image-registry-66df7c8f76-vjc2r" Mar 20 15:29:40 crc kubenswrapper[4779]: I0320 15:29:40.680496 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1a73e351-ee6f-472f-8964-ebcb7de3ca15-registry-tls\") pod \"image-registry-66df7c8f76-vjc2r\" (UID: \"1a73e351-ee6f-472f-8964-ebcb7de3ca15\") " pod="openshift-image-registry/image-registry-66df7c8f76-vjc2r" Mar 20 15:29:40 crc kubenswrapper[4779]: I0320 15:29:40.680584 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1a73e351-ee6f-472f-8964-ebcb7de3ca15-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vjc2r\" (UID: \"1a73e351-ee6f-472f-8964-ebcb7de3ca15\") " pod="openshift-image-registry/image-registry-66df7c8f76-vjc2r" Mar 20 15:29:40 crc kubenswrapper[4779]: I0320 15:29:40.680632 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5898\" (UniqueName: \"kubernetes.io/projected/1a73e351-ee6f-472f-8964-ebcb7de3ca15-kube-api-access-l5898\") pod \"image-registry-66df7c8f76-vjc2r\" (UID: \"1a73e351-ee6f-472f-8964-ebcb7de3ca15\") " pod="openshift-image-registry/image-registry-66df7c8f76-vjc2r" Mar 20 15:29:40 crc kubenswrapper[4779]: I0320 15:29:40.680669 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1a73e351-ee6f-472f-8964-ebcb7de3ca15-registry-certificates\") pod \"image-registry-66df7c8f76-vjc2r\" (UID: \"1a73e351-ee6f-472f-8964-ebcb7de3ca15\") " pod="openshift-image-registry/image-registry-66df7c8f76-vjc2r" Mar 20 
15:29:40 crc kubenswrapper[4779]: I0320 15:29:40.680705 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1a73e351-ee6f-472f-8964-ebcb7de3ca15-trusted-ca\") pod \"image-registry-66df7c8f76-vjc2r\" (UID: \"1a73e351-ee6f-472f-8964-ebcb7de3ca15\") " pod="openshift-image-registry/image-registry-66df7c8f76-vjc2r" Mar 20 15:29:40 crc kubenswrapper[4779]: I0320 15:29:40.680777 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1a73e351-ee6f-472f-8964-ebcb7de3ca15-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vjc2r\" (UID: \"1a73e351-ee6f-472f-8964-ebcb7de3ca15\") " pod="openshift-image-registry/image-registry-66df7c8f76-vjc2r" Mar 20 15:29:40 crc kubenswrapper[4779]: I0320 15:29:40.680814 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1a73e351-ee6f-472f-8964-ebcb7de3ca15-bound-sa-token\") pod \"image-registry-66df7c8f76-vjc2r\" (UID: \"1a73e351-ee6f-472f-8964-ebcb7de3ca15\") " pod="openshift-image-registry/image-registry-66df7c8f76-vjc2r" Mar 20 15:29:40 crc kubenswrapper[4779]: I0320 15:29:40.683030 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1a73e351-ee6f-472f-8964-ebcb7de3ca15-registry-certificates\") pod \"image-registry-66df7c8f76-vjc2r\" (UID: \"1a73e351-ee6f-472f-8964-ebcb7de3ca15\") " pod="openshift-image-registry/image-registry-66df7c8f76-vjc2r" Mar 20 15:29:40 crc kubenswrapper[4779]: I0320 15:29:40.683037 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1a73e351-ee6f-472f-8964-ebcb7de3ca15-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vjc2r\" (UID: \"1a73e351-ee6f-472f-8964-ebcb7de3ca15\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-vjc2r" Mar 20 15:29:40 crc kubenswrapper[4779]: I0320 15:29:40.683318 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1a73e351-ee6f-472f-8964-ebcb7de3ca15-trusted-ca\") pod \"image-registry-66df7c8f76-vjc2r\" (UID: \"1a73e351-ee6f-472f-8964-ebcb7de3ca15\") " pod="openshift-image-registry/image-registry-66df7c8f76-vjc2r" Mar 20 15:29:40 crc kubenswrapper[4779]: I0320 15:29:40.701258 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1a73e351-ee6f-472f-8964-ebcb7de3ca15-registry-tls\") pod \"image-registry-66df7c8f76-vjc2r\" (UID: \"1a73e351-ee6f-472f-8964-ebcb7de3ca15\") " pod="openshift-image-registry/image-registry-66df7c8f76-vjc2r" Mar 20 15:29:40 crc kubenswrapper[4779]: I0320 15:29:40.701715 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1a73e351-ee6f-472f-8964-ebcb7de3ca15-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vjc2r\" (UID: \"1a73e351-ee6f-472f-8964-ebcb7de3ca15\") " pod="openshift-image-registry/image-registry-66df7c8f76-vjc2r" Mar 20 15:29:40 crc kubenswrapper[4779]: I0320 15:29:40.703348 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1a73e351-ee6f-472f-8964-ebcb7de3ca15-bound-sa-token\") pod \"image-registry-66df7c8f76-vjc2r\" (UID: \"1a73e351-ee6f-472f-8964-ebcb7de3ca15\") " pod="openshift-image-registry/image-registry-66df7c8f76-vjc2r" Mar 20 15:29:40 crc kubenswrapper[4779]: I0320 15:29:40.703786 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5898\" (UniqueName: \"kubernetes.io/projected/1a73e351-ee6f-472f-8964-ebcb7de3ca15-kube-api-access-l5898\") pod \"image-registry-66df7c8f76-vjc2r\" (UID: 
\"1a73e351-ee6f-472f-8964-ebcb7de3ca15\") " pod="openshift-image-registry/image-registry-66df7c8f76-vjc2r" Mar 20 15:29:40 crc kubenswrapper[4779]: I0320 15:29:40.716578 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-vjc2r" Mar 20 15:29:41 crc kubenswrapper[4779]: I0320 15:29:41.089714 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vjc2r"] Mar 20 15:29:41 crc kubenswrapper[4779]: I0320 15:29:41.630420 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-vjc2r" event={"ID":"1a73e351-ee6f-472f-8964-ebcb7de3ca15","Type":"ContainerStarted","Data":"4de4b5e78ea5a821c733d7d82a030e6d1d820018f263da93d7654b1efc8da5bd"} Mar 20 15:29:41 crc kubenswrapper[4779]: I0320 15:29:41.630471 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-vjc2r" event={"ID":"1a73e351-ee6f-472f-8964-ebcb7de3ca15","Type":"ContainerStarted","Data":"22aa67b48421e57e7226e360c1b67359dfb2d01236c8fcd4e05b252ad0625fdb"} Mar 20 15:29:41 crc kubenswrapper[4779]: I0320 15:29:41.631305 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-vjc2r" Mar 20 15:29:41 crc kubenswrapper[4779]: I0320 15:29:41.839987 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-vjc2r" podStartSLOduration=1.8399682240000002 podStartE2EDuration="1.839968224s" podCreationTimestamp="2026-03-20 15:29:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:29:41.835726747 +0000 UTC m=+398.798242557" watchObservedRunningTime="2026-03-20 15:29:41.839968224 +0000 UTC m=+398.802484024" Mar 20 15:30:00 crc kubenswrapper[4779]: I0320 
15:30:00.164505 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567010-hqddk"] Mar 20 15:30:00 crc kubenswrapper[4779]: I0320 15:30:00.165960 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567010-hqddk" Mar 20 15:30:00 crc kubenswrapper[4779]: I0320 15:30:00.167441 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567010-9g8lw"] Mar 20 15:30:00 crc kubenswrapper[4779]: I0320 15:30:00.168347 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-9g8lw" Mar 20 15:30:00 crc kubenswrapper[4779]: I0320 15:30:00.168693 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:30:00 crc kubenswrapper[4779]: I0320 15:30:00.171587 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-92t9d" Mar 20 15:30:00 crc kubenswrapper[4779]: I0320 15:30:00.173440 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:30:00 crc kubenswrapper[4779]: I0320 15:30:00.173589 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 15:30:00 crc kubenswrapper[4779]: I0320 15:30:00.183032 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 15:30:00 crc kubenswrapper[4779]: I0320 15:30:00.186624 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567010-hqddk"] Mar 20 15:30:00 crc kubenswrapper[4779]: I0320 15:30:00.195555 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29567010-9g8lw"] Mar 20 15:30:00 crc kubenswrapper[4779]: I0320 15:30:00.253273 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4lmk\" (UniqueName: \"kubernetes.io/projected/414c34b6-2c97-4039-8863-a2b245fa0316-kube-api-access-r4lmk\") pod \"auto-csr-approver-29567010-hqddk\" (UID: \"414c34b6-2c97-4039-8863-a2b245fa0316\") " pod="openshift-infra/auto-csr-approver-29567010-hqddk" Mar 20 15:30:00 crc kubenswrapper[4779]: I0320 15:30:00.253312 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d4981969-176f-47c6-b265-df9f5668838b-config-volume\") pod \"collect-profiles-29567010-9g8lw\" (UID: \"d4981969-176f-47c6-b265-df9f5668838b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-9g8lw" Mar 20 15:30:00 crc kubenswrapper[4779]: I0320 15:30:00.253358 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzn7r\" (UniqueName: \"kubernetes.io/projected/d4981969-176f-47c6-b265-df9f5668838b-kube-api-access-lzn7r\") pod \"collect-profiles-29567010-9g8lw\" (UID: \"d4981969-176f-47c6-b265-df9f5668838b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-9g8lw" Mar 20 15:30:00 crc kubenswrapper[4779]: I0320 15:30:00.253387 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d4981969-176f-47c6-b265-df9f5668838b-secret-volume\") pod \"collect-profiles-29567010-9g8lw\" (UID: \"d4981969-176f-47c6-b265-df9f5668838b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-9g8lw" Mar 20 15:30:00 crc kubenswrapper[4779]: I0320 15:30:00.354273 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-r4lmk\" (UniqueName: \"kubernetes.io/projected/414c34b6-2c97-4039-8863-a2b245fa0316-kube-api-access-r4lmk\") pod \"auto-csr-approver-29567010-hqddk\" (UID: \"414c34b6-2c97-4039-8863-a2b245fa0316\") " pod="openshift-infra/auto-csr-approver-29567010-hqddk" Mar 20 15:30:00 crc kubenswrapper[4779]: I0320 15:30:00.354320 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d4981969-176f-47c6-b265-df9f5668838b-config-volume\") pod \"collect-profiles-29567010-9g8lw\" (UID: \"d4981969-176f-47c6-b265-df9f5668838b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-9g8lw" Mar 20 15:30:00 crc kubenswrapper[4779]: I0320 15:30:00.354359 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzn7r\" (UniqueName: \"kubernetes.io/projected/d4981969-176f-47c6-b265-df9f5668838b-kube-api-access-lzn7r\") pod \"collect-profiles-29567010-9g8lw\" (UID: \"d4981969-176f-47c6-b265-df9f5668838b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-9g8lw" Mar 20 15:30:00 crc kubenswrapper[4779]: I0320 15:30:00.354389 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d4981969-176f-47c6-b265-df9f5668838b-secret-volume\") pod \"collect-profiles-29567010-9g8lw\" (UID: \"d4981969-176f-47c6-b265-df9f5668838b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-9g8lw" Mar 20 15:30:00 crc kubenswrapper[4779]: I0320 15:30:00.356426 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d4981969-176f-47c6-b265-df9f5668838b-config-volume\") pod \"collect-profiles-29567010-9g8lw\" (UID: \"d4981969-176f-47c6-b265-df9f5668838b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-9g8lw" Mar 20 15:30:00 crc kubenswrapper[4779]: 
I0320 15:30:00.365930 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d4981969-176f-47c6-b265-df9f5668838b-secret-volume\") pod \"collect-profiles-29567010-9g8lw\" (UID: \"d4981969-176f-47c6-b265-df9f5668838b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-9g8lw" Mar 20 15:30:00 crc kubenswrapper[4779]: I0320 15:30:00.371234 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzn7r\" (UniqueName: \"kubernetes.io/projected/d4981969-176f-47c6-b265-df9f5668838b-kube-api-access-lzn7r\") pod \"collect-profiles-29567010-9g8lw\" (UID: \"d4981969-176f-47c6-b265-df9f5668838b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-9g8lw" Mar 20 15:30:00 crc kubenswrapper[4779]: I0320 15:30:00.377459 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4lmk\" (UniqueName: \"kubernetes.io/projected/414c34b6-2c97-4039-8863-a2b245fa0316-kube-api-access-r4lmk\") pod \"auto-csr-approver-29567010-hqddk\" (UID: \"414c34b6-2c97-4039-8863-a2b245fa0316\") " pod="openshift-infra/auto-csr-approver-29567010-hqddk" Mar 20 15:30:00 crc kubenswrapper[4779]: I0320 15:30:00.501173 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567010-hqddk" Mar 20 15:30:00 crc kubenswrapper[4779]: I0320 15:30:00.507242 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-9g8lw" Mar 20 15:30:00 crc kubenswrapper[4779]: I0320 15:30:00.722723 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-vjc2r" Mar 20 15:30:00 crc kubenswrapper[4779]: I0320 15:30:00.772147 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tx4fd"] Mar 20 15:30:00 crc kubenswrapper[4779]: I0320 15:30:00.882665 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567010-hqddk"] Mar 20 15:30:00 crc kubenswrapper[4779]: W0320 15:30:00.888146 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod414c34b6_2c97_4039_8863_a2b245fa0316.slice/crio-9d79a6d85d46adb61480052cafac2876d6396864900a0c9dbb20dd10fb6baa80 WatchSource:0}: Error finding container 9d79a6d85d46adb61480052cafac2876d6396864900a0c9dbb20dd10fb6baa80: Status 404 returned error can't find the container with id 9d79a6d85d46adb61480052cafac2876d6396864900a0c9dbb20dd10fb6baa80 Mar 20 15:30:00 crc kubenswrapper[4779]: I0320 15:30:00.919614 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567010-9g8lw"] Mar 20 15:30:00 crc kubenswrapper[4779]: W0320 15:30:00.924941 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4981969_176f_47c6_b265_df9f5668838b.slice/crio-bc65fc9de1370218b107faee3fe24aab122d967fa59152b4680e9e43bbbdabae WatchSource:0}: Error finding container bc65fc9de1370218b107faee3fe24aab122d967fa59152b4680e9e43bbbdabae: Status 404 returned error can't find the container with id bc65fc9de1370218b107faee3fe24aab122d967fa59152b4680e9e43bbbdabae Mar 20 15:30:01 crc kubenswrapper[4779]: I0320 15:30:01.728550 4779 
generic.go:334] "Generic (PLEG): container finished" podID="d4981969-176f-47c6-b265-df9f5668838b" containerID="7e78b296ba2332d79bf0e80128d52c070bac2f3aaf8e414b43050ee046d02218" exitCode=0 Mar 20 15:30:01 crc kubenswrapper[4779]: I0320 15:30:01.728623 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-9g8lw" event={"ID":"d4981969-176f-47c6-b265-df9f5668838b","Type":"ContainerDied","Data":"7e78b296ba2332d79bf0e80128d52c070bac2f3aaf8e414b43050ee046d02218"} Mar 20 15:30:01 crc kubenswrapper[4779]: I0320 15:30:01.728656 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-9g8lw" event={"ID":"d4981969-176f-47c6-b265-df9f5668838b","Type":"ContainerStarted","Data":"bc65fc9de1370218b107faee3fe24aab122d967fa59152b4680e9e43bbbdabae"} Mar 20 15:30:01 crc kubenswrapper[4779]: I0320 15:30:01.730467 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567010-hqddk" event={"ID":"414c34b6-2c97-4039-8863-a2b245fa0316","Type":"ContainerStarted","Data":"9d79a6d85d46adb61480052cafac2876d6396864900a0c9dbb20dd10fb6baa80"} Mar 20 15:30:02 crc kubenswrapper[4779]: I0320 15:30:02.738443 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567010-hqddk" event={"ID":"414c34b6-2c97-4039-8863-a2b245fa0316","Type":"ContainerStarted","Data":"2f6a65df967b5b24cc5fcb2de3253103cb68509b3f98adc4f0bfdad6e99d53f3"} Mar 20 15:30:02 crc kubenswrapper[4779]: I0320 15:30:02.753728 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567010-hqddk" podStartSLOduration=1.271545755 podStartE2EDuration="2.75370993s" podCreationTimestamp="2026-03-20 15:30:00 +0000 UTC" firstStartedPulling="2026-03-20 15:30:00.890614573 +0000 UTC m=+417.853130373" lastFinishedPulling="2026-03-20 15:30:02.372778738 +0000 UTC m=+419.335294548" 
observedRunningTime="2026-03-20 15:30:02.75136441 +0000 UTC m=+419.713880210" watchObservedRunningTime="2026-03-20 15:30:02.75370993 +0000 UTC m=+419.716225720" Mar 20 15:30:02 crc kubenswrapper[4779]: I0320 15:30:02.960693 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-9g8lw" Mar 20 15:30:03 crc kubenswrapper[4779]: I0320 15:30:03.089068 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzn7r\" (UniqueName: \"kubernetes.io/projected/d4981969-176f-47c6-b265-df9f5668838b-kube-api-access-lzn7r\") pod \"d4981969-176f-47c6-b265-df9f5668838b\" (UID: \"d4981969-176f-47c6-b265-df9f5668838b\") " Mar 20 15:30:03 crc kubenswrapper[4779]: I0320 15:30:03.089228 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d4981969-176f-47c6-b265-df9f5668838b-config-volume\") pod \"d4981969-176f-47c6-b265-df9f5668838b\" (UID: \"d4981969-176f-47c6-b265-df9f5668838b\") " Mar 20 15:30:03 crc kubenswrapper[4779]: I0320 15:30:03.089264 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d4981969-176f-47c6-b265-df9f5668838b-secret-volume\") pod \"d4981969-176f-47c6-b265-df9f5668838b\" (UID: \"d4981969-176f-47c6-b265-df9f5668838b\") " Mar 20 15:30:03 crc kubenswrapper[4779]: I0320 15:30:03.089974 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4981969-176f-47c6-b265-df9f5668838b-config-volume" (OuterVolumeSpecName: "config-volume") pod "d4981969-176f-47c6-b265-df9f5668838b" (UID: "d4981969-176f-47c6-b265-df9f5668838b"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:30:03 crc kubenswrapper[4779]: I0320 15:30:03.094593 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4981969-176f-47c6-b265-df9f5668838b-kube-api-access-lzn7r" (OuterVolumeSpecName: "kube-api-access-lzn7r") pod "d4981969-176f-47c6-b265-df9f5668838b" (UID: "d4981969-176f-47c6-b265-df9f5668838b"). InnerVolumeSpecName "kube-api-access-lzn7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:30:03 crc kubenswrapper[4779]: I0320 15:30:03.094780 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4981969-176f-47c6-b265-df9f5668838b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d4981969-176f-47c6-b265-df9f5668838b" (UID: "d4981969-176f-47c6-b265-df9f5668838b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:30:03 crc kubenswrapper[4779]: I0320 15:30:03.191128 4779 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d4981969-176f-47c6-b265-df9f5668838b-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 15:30:03 crc kubenswrapper[4779]: I0320 15:30:03.191165 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzn7r\" (UniqueName: \"kubernetes.io/projected/d4981969-176f-47c6-b265-df9f5668838b-kube-api-access-lzn7r\") on node \"crc\" DevicePath \"\"" Mar 20 15:30:03 crc kubenswrapper[4779]: I0320 15:30:03.191174 4779 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d4981969-176f-47c6-b265-df9f5668838b-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 15:30:03 crc kubenswrapper[4779]: I0320 15:30:03.743098 4779 generic.go:334] "Generic (PLEG): container finished" podID="414c34b6-2c97-4039-8863-a2b245fa0316" 
containerID="2f6a65df967b5b24cc5fcb2de3253103cb68509b3f98adc4f0bfdad6e99d53f3" exitCode=0 Mar 20 15:30:03 crc kubenswrapper[4779]: I0320 15:30:03.743185 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567010-hqddk" event={"ID":"414c34b6-2c97-4039-8863-a2b245fa0316","Type":"ContainerDied","Data":"2f6a65df967b5b24cc5fcb2de3253103cb68509b3f98adc4f0bfdad6e99d53f3"} Mar 20 15:30:03 crc kubenswrapper[4779]: I0320 15:30:03.744824 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-9g8lw" event={"ID":"d4981969-176f-47c6-b265-df9f5668838b","Type":"ContainerDied","Data":"bc65fc9de1370218b107faee3fe24aab122d967fa59152b4680e9e43bbbdabae"} Mar 20 15:30:03 crc kubenswrapper[4779]: I0320 15:30:03.744858 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc65fc9de1370218b107faee3fe24aab122d967fa59152b4680e9e43bbbdabae" Mar 20 15:30:03 crc kubenswrapper[4779]: I0320 15:30:03.744905 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-9g8lw" Mar 20 15:30:06 crc kubenswrapper[4779]: I0320 15:30:04.985573 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567010-hqddk" Mar 20 15:30:06 crc kubenswrapper[4779]: I0320 15:30:05.114167 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4lmk\" (UniqueName: \"kubernetes.io/projected/414c34b6-2c97-4039-8863-a2b245fa0316-kube-api-access-r4lmk\") pod \"414c34b6-2c97-4039-8863-a2b245fa0316\" (UID: \"414c34b6-2c97-4039-8863-a2b245fa0316\") " Mar 20 15:30:06 crc kubenswrapper[4779]: I0320 15:30:05.119471 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/414c34b6-2c97-4039-8863-a2b245fa0316-kube-api-access-r4lmk" (OuterVolumeSpecName: "kube-api-access-r4lmk") pod "414c34b6-2c97-4039-8863-a2b245fa0316" (UID: "414c34b6-2c97-4039-8863-a2b245fa0316"). InnerVolumeSpecName "kube-api-access-r4lmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:30:06 crc kubenswrapper[4779]: I0320 15:30:05.215880 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4lmk\" (UniqueName: \"kubernetes.io/projected/414c34b6-2c97-4039-8863-a2b245fa0316-kube-api-access-r4lmk\") on node \"crc\" DevicePath \"\"" Mar 20 15:30:06 crc kubenswrapper[4779]: I0320 15:30:05.756170 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567010-hqddk" event={"ID":"414c34b6-2c97-4039-8863-a2b245fa0316","Type":"ContainerDied","Data":"9d79a6d85d46adb61480052cafac2876d6396864900a0c9dbb20dd10fb6baa80"} Mar 20 15:30:06 crc kubenswrapper[4779]: I0320 15:30:05.756216 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d79a6d85d46adb61480052cafac2876d6396864900a0c9dbb20dd10fb6baa80" Mar 20 15:30:06 crc kubenswrapper[4779]: I0320 15:30:05.756247 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567010-hqddk" Mar 20 15:30:18 crc kubenswrapper[4779]: I0320 15:30:18.988703 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mhgrq"] Mar 20 15:30:18 crc kubenswrapper[4779]: I0320 15:30:18.991608 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mhgrq" podUID="57ad9993-5b81-4487-8f70-37e41aca1678" containerName="registry-server" containerID="cri-o://bb3314fca7615109ddbc4c31d1a8def8d062194d0d7b6e9041b410613d457de8" gracePeriod=30 Mar 20 15:30:18 crc kubenswrapper[4779]: I0320 15:30:18.993426 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qtlz2"] Mar 20 15:30:18 crc kubenswrapper[4779]: I0320 15:30:18.993694 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qtlz2" podUID="f20db228-a34a-4734-ae22-53cd86de06ed" containerName="registry-server" containerID="cri-o://4d54ca13eb8028b8751952af0a994524eecf91147f1b70001e1c7bc8f805a25b" gracePeriod=30 Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.011589 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qkwpf"] Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.011854 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-qkwpf" podUID="b458c9c5-3878-42e1-995b-713f56d36b25" containerName="marketplace-operator" containerID="cri-o://66d01ed73fc5911ed1903d997f9271e89041f1a39ab7e2ab5a8997ae6e091bca" gracePeriod=30 Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.023857 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4fphd"] Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.024142 4779 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4fphd" podUID="23f0a6ab-6ded-4310-82ff-98ea45fa3a43" containerName="registry-server" containerID="cri-o://29c7eb606996f1f12664c7b3fa6b1a4244ae2de9775927b7ab019a27c02e9c5d" gracePeriod=30 Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.028459 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9wb77"] Mar 20 15:30:19 crc kubenswrapper[4779]: E0320 15:30:19.028737 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="414c34b6-2c97-4039-8863-a2b245fa0316" containerName="oc" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.028754 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="414c34b6-2c97-4039-8863-a2b245fa0316" containerName="oc" Mar 20 15:30:19 crc kubenswrapper[4779]: E0320 15:30:19.028761 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4981969-176f-47c6-b265-df9f5668838b" containerName="collect-profiles" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.028767 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4981969-176f-47c6-b265-df9f5668838b" containerName="collect-profiles" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.028852 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="414c34b6-2c97-4039-8863-a2b245fa0316" containerName="oc" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.028866 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4981969-176f-47c6-b265-df9f5668838b" containerName="collect-profiles" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.029273 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9wb77" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.032196 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tqlmj"] Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.032485 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tqlmj" podUID="437c3121-8b7a-48b1-b805-540a41e89b6a" containerName="registry-server" containerID="cri-o://9997352d2e68b034ad49e40cb699e2b25de024715fcda1345da55e7f1264c6c4" gracePeriod=30 Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.043853 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9wb77"] Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.083474 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk7md\" (UniqueName: \"kubernetes.io/projected/7231ee4d-4e95-48bd-8f2a-be66eccaf6a1-kube-api-access-tk7md\") pod \"marketplace-operator-79b997595-9wb77\" (UID: \"7231ee4d-4e95-48bd-8f2a-be66eccaf6a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-9wb77" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.083597 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7231ee4d-4e95-48bd-8f2a-be66eccaf6a1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9wb77\" (UID: \"7231ee4d-4e95-48bd-8f2a-be66eccaf6a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-9wb77" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.083652 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/7231ee4d-4e95-48bd-8f2a-be66eccaf6a1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9wb77\" (UID: \"7231ee4d-4e95-48bd-8f2a-be66eccaf6a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-9wb77" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.184420 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk7md\" (UniqueName: \"kubernetes.io/projected/7231ee4d-4e95-48bd-8f2a-be66eccaf6a1-kube-api-access-tk7md\") pod \"marketplace-operator-79b997595-9wb77\" (UID: \"7231ee4d-4e95-48bd-8f2a-be66eccaf6a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-9wb77" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.184613 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7231ee4d-4e95-48bd-8f2a-be66eccaf6a1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9wb77\" (UID: \"7231ee4d-4e95-48bd-8f2a-be66eccaf6a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-9wb77" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.184723 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7231ee4d-4e95-48bd-8f2a-be66eccaf6a1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9wb77\" (UID: \"7231ee4d-4e95-48bd-8f2a-be66eccaf6a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-9wb77" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.193944 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7231ee4d-4e95-48bd-8f2a-be66eccaf6a1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9wb77\" (UID: \"7231ee4d-4e95-48bd-8f2a-be66eccaf6a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-9wb77" Mar 20 
15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.195371 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7231ee4d-4e95-48bd-8f2a-be66eccaf6a1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9wb77\" (UID: \"7231ee4d-4e95-48bd-8f2a-be66eccaf6a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-9wb77" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.206040 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk7md\" (UniqueName: \"kubernetes.io/projected/7231ee4d-4e95-48bd-8f2a-be66eccaf6a1-kube-api-access-tk7md\") pod \"marketplace-operator-79b997595-9wb77\" (UID: \"7231ee4d-4e95-48bd-8f2a-be66eccaf6a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-9wb77" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.412713 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9wb77" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.414629 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qtlz2" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.422514 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4fphd" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.450005 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qkwpf" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.462161 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mhgrq" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.474534 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tqlmj" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.488548 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhsch\" (UniqueName: \"kubernetes.io/projected/57ad9993-5b81-4487-8f70-37e41aca1678-kube-api-access-zhsch\") pod \"57ad9993-5b81-4487-8f70-37e41aca1678\" (UID: \"57ad9993-5b81-4487-8f70-37e41aca1678\") " Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.488621 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f20db228-a34a-4734-ae22-53cd86de06ed-utilities\") pod \"f20db228-a34a-4734-ae22-53cd86de06ed\" (UID: \"f20db228-a34a-4734-ae22-53cd86de06ed\") " Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.488660 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57ad9993-5b81-4487-8f70-37e41aca1678-utilities\") pod \"57ad9993-5b81-4487-8f70-37e41aca1678\" (UID: \"57ad9993-5b81-4487-8f70-37e41aca1678\") " Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.488687 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57ad9993-5b81-4487-8f70-37e41aca1678-catalog-content\") pod \"57ad9993-5b81-4487-8f70-37e41aca1678\" (UID: \"57ad9993-5b81-4487-8f70-37e41aca1678\") " Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.488737 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b458c9c5-3878-42e1-995b-713f56d36b25-marketplace-trusted-ca\") pod \"b458c9c5-3878-42e1-995b-713f56d36b25\" (UID: \"b458c9c5-3878-42e1-995b-713f56d36b25\") " Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.488755 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b458c9c5-3878-42e1-995b-713f56d36b25-marketplace-operator-metrics\") pod \"b458c9c5-3878-42e1-995b-713f56d36b25\" (UID: \"b458c9c5-3878-42e1-995b-713f56d36b25\") " Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.488782 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23f0a6ab-6ded-4310-82ff-98ea45fa3a43-utilities\") pod \"23f0a6ab-6ded-4310-82ff-98ea45fa3a43\" (UID: \"23f0a6ab-6ded-4310-82ff-98ea45fa3a43\") " Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.488836 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s892x\" (UniqueName: \"kubernetes.io/projected/23f0a6ab-6ded-4310-82ff-98ea45fa3a43-kube-api-access-s892x\") pod \"23f0a6ab-6ded-4310-82ff-98ea45fa3a43\" (UID: \"23f0a6ab-6ded-4310-82ff-98ea45fa3a43\") " Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.488854 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f20db228-a34a-4734-ae22-53cd86de06ed-catalog-content\") pod \"f20db228-a34a-4734-ae22-53cd86de06ed\" (UID: \"f20db228-a34a-4734-ae22-53cd86de06ed\") " Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.488921 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hchf8\" (UniqueName: \"kubernetes.io/projected/b458c9c5-3878-42e1-995b-713f56d36b25-kube-api-access-hchf8\") pod \"b458c9c5-3878-42e1-995b-713f56d36b25\" (UID: \"b458c9c5-3878-42e1-995b-713f56d36b25\") " Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.488939 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qfpp\" (UniqueName: \"kubernetes.io/projected/f20db228-a34a-4734-ae22-53cd86de06ed-kube-api-access-7qfpp\") pod \"f20db228-a34a-4734-ae22-53cd86de06ed\" (UID: 
\"f20db228-a34a-4734-ae22-53cd86de06ed\") " Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.488976 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23f0a6ab-6ded-4310-82ff-98ea45fa3a43-catalog-content\") pod \"23f0a6ab-6ded-4310-82ff-98ea45fa3a43\" (UID: \"23f0a6ab-6ded-4310-82ff-98ea45fa3a43\") " Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.491171 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57ad9993-5b81-4487-8f70-37e41aca1678-utilities" (OuterVolumeSpecName: "utilities") pod "57ad9993-5b81-4487-8f70-37e41aca1678" (UID: "57ad9993-5b81-4487-8f70-37e41aca1678"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.500381 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f20db228-a34a-4734-ae22-53cd86de06ed-kube-api-access-7qfpp" (OuterVolumeSpecName: "kube-api-access-7qfpp") pod "f20db228-a34a-4734-ae22-53cd86de06ed" (UID: "f20db228-a34a-4734-ae22-53cd86de06ed"). InnerVolumeSpecName "kube-api-access-7qfpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.500436 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23f0a6ab-6ded-4310-82ff-98ea45fa3a43-kube-api-access-s892x" (OuterVolumeSpecName: "kube-api-access-s892x") pod "23f0a6ab-6ded-4310-82ff-98ea45fa3a43" (UID: "23f0a6ab-6ded-4310-82ff-98ea45fa3a43"). InnerVolumeSpecName "kube-api-access-s892x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.500456 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57ad9993-5b81-4487-8f70-37e41aca1678-kube-api-access-zhsch" (OuterVolumeSpecName: "kube-api-access-zhsch") pod "57ad9993-5b81-4487-8f70-37e41aca1678" (UID: "57ad9993-5b81-4487-8f70-37e41aca1678"). InnerVolumeSpecName "kube-api-access-zhsch". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.506461 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f20db228-a34a-4734-ae22-53cd86de06ed-utilities" (OuterVolumeSpecName: "utilities") pod "f20db228-a34a-4734-ae22-53cd86de06ed" (UID: "f20db228-a34a-4734-ae22-53cd86de06ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.506459 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b458c9c5-3878-42e1-995b-713f56d36b25-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b458c9c5-3878-42e1-995b-713f56d36b25" (UID: "b458c9c5-3878-42e1-995b-713f56d36b25"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.507768 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23f0a6ab-6ded-4310-82ff-98ea45fa3a43-utilities" (OuterVolumeSpecName: "utilities") pod "23f0a6ab-6ded-4310-82ff-98ea45fa3a43" (UID: "23f0a6ab-6ded-4310-82ff-98ea45fa3a43"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.512522 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b458c9c5-3878-42e1-995b-713f56d36b25-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b458c9c5-3878-42e1-995b-713f56d36b25" (UID: "b458c9c5-3878-42e1-995b-713f56d36b25"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.534203 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23f0a6ab-6ded-4310-82ff-98ea45fa3a43-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "23f0a6ab-6ded-4310-82ff-98ea45fa3a43" (UID: "23f0a6ab-6ded-4310-82ff-98ea45fa3a43"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.562390 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b458c9c5-3878-42e1-995b-713f56d36b25-kube-api-access-hchf8" (OuterVolumeSpecName: "kube-api-access-hchf8") pod "b458c9c5-3878-42e1-995b-713f56d36b25" (UID: "b458c9c5-3878-42e1-995b-713f56d36b25"). InnerVolumeSpecName "kube-api-access-hchf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.574800 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57ad9993-5b81-4487-8f70-37e41aca1678-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57ad9993-5b81-4487-8f70-37e41aca1678" (UID: "57ad9993-5b81-4487-8f70-37e41aca1678"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.588902 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f20db228-a34a-4734-ae22-53cd86de06ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f20db228-a34a-4734-ae22-53cd86de06ed" (UID: "f20db228-a34a-4734-ae22-53cd86de06ed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.593183 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/437c3121-8b7a-48b1-b805-540a41e89b6a-catalog-content\") pod \"437c3121-8b7a-48b1-b805-540a41e89b6a\" (UID: \"437c3121-8b7a-48b1-b805-540a41e89b6a\") " Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.593234 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlqhf\" (UniqueName: \"kubernetes.io/projected/437c3121-8b7a-48b1-b805-540a41e89b6a-kube-api-access-jlqhf\") pod \"437c3121-8b7a-48b1-b805-540a41e89b6a\" (UID: \"437c3121-8b7a-48b1-b805-540a41e89b6a\") " Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.593257 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/437c3121-8b7a-48b1-b805-540a41e89b6a-utilities\") pod \"437c3121-8b7a-48b1-b805-540a41e89b6a\" (UID: \"437c3121-8b7a-48b1-b805-540a41e89b6a\") " Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.593514 4779 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23f0a6ab-6ded-4310-82ff-98ea45fa3a43-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.593531 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhsch\" (UniqueName: 
\"kubernetes.io/projected/57ad9993-5b81-4487-8f70-37e41aca1678-kube-api-access-zhsch\") on node \"crc\" DevicePath \"\"" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.593542 4779 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f20db228-a34a-4734-ae22-53cd86de06ed-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.593550 4779 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57ad9993-5b81-4487-8f70-37e41aca1678-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.593558 4779 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57ad9993-5b81-4487-8f70-37e41aca1678-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.593565 4779 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b458c9c5-3878-42e1-995b-713f56d36b25-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.593574 4779 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b458c9c5-3878-42e1-995b-713f56d36b25-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.593583 4779 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23f0a6ab-6ded-4310-82ff-98ea45fa3a43-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.593592 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s892x\" (UniqueName: \"kubernetes.io/projected/23f0a6ab-6ded-4310-82ff-98ea45fa3a43-kube-api-access-s892x\") on node 
\"crc\" DevicePath \"\"" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.593601 4779 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f20db228-a34a-4734-ae22-53cd86de06ed-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.593609 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hchf8\" (UniqueName: \"kubernetes.io/projected/b458c9c5-3878-42e1-995b-713f56d36b25-kube-api-access-hchf8\") on node \"crc\" DevicePath \"\"" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.593617 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qfpp\" (UniqueName: \"kubernetes.io/projected/f20db228-a34a-4734-ae22-53cd86de06ed-kube-api-access-7qfpp\") on node \"crc\" DevicePath \"\"" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.594351 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/437c3121-8b7a-48b1-b805-540a41e89b6a-utilities" (OuterVolumeSpecName: "utilities") pod "437c3121-8b7a-48b1-b805-540a41e89b6a" (UID: "437c3121-8b7a-48b1-b805-540a41e89b6a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.595979 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/437c3121-8b7a-48b1-b805-540a41e89b6a-kube-api-access-jlqhf" (OuterVolumeSpecName: "kube-api-access-jlqhf") pod "437c3121-8b7a-48b1-b805-540a41e89b6a" (UID: "437c3121-8b7a-48b1-b805-540a41e89b6a"). InnerVolumeSpecName "kube-api-access-jlqhf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.646858 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9wb77"] Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.694293 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlqhf\" (UniqueName: \"kubernetes.io/projected/437c3121-8b7a-48b1-b805-540a41e89b6a-kube-api-access-jlqhf\") on node \"crc\" DevicePath \"\"" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.694333 4779 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/437c3121-8b7a-48b1-b805-540a41e89b6a-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.730983 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/437c3121-8b7a-48b1-b805-540a41e89b6a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "437c3121-8b7a-48b1-b805-540a41e89b6a" (UID: "437c3121-8b7a-48b1-b805-540a41e89b6a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.795470 4779 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/437c3121-8b7a-48b1-b805-540a41e89b6a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.836480 4779 generic.go:334] "Generic (PLEG): container finished" podID="437c3121-8b7a-48b1-b805-540a41e89b6a" containerID="9997352d2e68b034ad49e40cb699e2b25de024715fcda1345da55e7f1264c6c4" exitCode=0 Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.836539 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tqlmj" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.836558 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tqlmj" event={"ID":"437c3121-8b7a-48b1-b805-540a41e89b6a","Type":"ContainerDied","Data":"9997352d2e68b034ad49e40cb699e2b25de024715fcda1345da55e7f1264c6c4"} Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.837119 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tqlmj" event={"ID":"437c3121-8b7a-48b1-b805-540a41e89b6a","Type":"ContainerDied","Data":"5f5d53b386641d115a5646c5216a4aa9076519ffef47dec0b3690788b1312e77"} Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.837167 4779 scope.go:117] "RemoveContainer" containerID="9997352d2e68b034ad49e40cb699e2b25de024715fcda1345da55e7f1264c6c4" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.839412 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9wb77" event={"ID":"7231ee4d-4e95-48bd-8f2a-be66eccaf6a1","Type":"ContainerStarted","Data":"da654393077c340c04d5eeb539c8ab67b30c39fca09dfc58db841d6ae408d94d"} Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.839448 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9wb77" event={"ID":"7231ee4d-4e95-48bd-8f2a-be66eccaf6a1","Type":"ContainerStarted","Data":"1aa9d59e5ba37754bd79ae531608b819d292033f784139d097076a2897cde73e"} Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.841452 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-9wb77" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.842738 4779 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-9wb77 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure 
output="Get \"http://10.217.0.73:8080/healthz\": dial tcp 10.217.0.73:8080: connect: connection refused" start-of-body= Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.842840 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-9wb77" podUID="7231ee4d-4e95-48bd-8f2a-be66eccaf6a1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.73:8080/healthz\": dial tcp 10.217.0.73:8080: connect: connection refused" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.846917 4779 generic.go:334] "Generic (PLEG): container finished" podID="b458c9c5-3878-42e1-995b-713f56d36b25" containerID="66d01ed73fc5911ed1903d997f9271e89041f1a39ab7e2ab5a8997ae6e091bca" exitCode=0 Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.847127 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qkwpf" event={"ID":"b458c9c5-3878-42e1-995b-713f56d36b25","Type":"ContainerDied","Data":"66d01ed73fc5911ed1903d997f9271e89041f1a39ab7e2ab5a8997ae6e091bca"} Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.847164 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qkwpf" event={"ID":"b458c9c5-3878-42e1-995b-713f56d36b25","Type":"ContainerDied","Data":"bc800091d5b4f91c56b3c3976690879cbbe3c1470ec61046392b065384467807"} Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.847223 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qkwpf" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.853228 4779 generic.go:334] "Generic (PLEG): container finished" podID="23f0a6ab-6ded-4310-82ff-98ea45fa3a43" containerID="29c7eb606996f1f12664c7b3fa6b1a4244ae2de9775927b7ab019a27c02e9c5d" exitCode=0 Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.853299 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4fphd" event={"ID":"23f0a6ab-6ded-4310-82ff-98ea45fa3a43","Type":"ContainerDied","Data":"29c7eb606996f1f12664c7b3fa6b1a4244ae2de9775927b7ab019a27c02e9c5d"} Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.853324 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4fphd" event={"ID":"23f0a6ab-6ded-4310-82ff-98ea45fa3a43","Type":"ContainerDied","Data":"89b0cef7d5f67595008027bdd285a4ef0304697e00e955ca1b4ea04e08a67f28"} Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.853381 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4fphd" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.860642 4779 generic.go:334] "Generic (PLEG): container finished" podID="57ad9993-5b81-4487-8f70-37e41aca1678" containerID="bb3314fca7615109ddbc4c31d1a8def8d062194d0d7b6e9041b410613d457de8" exitCode=0 Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.860803 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhgrq" event={"ID":"57ad9993-5b81-4487-8f70-37e41aca1678","Type":"ContainerDied","Data":"bb3314fca7615109ddbc4c31d1a8def8d062194d0d7b6e9041b410613d457de8"} Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.860838 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhgrq" event={"ID":"57ad9993-5b81-4487-8f70-37e41aca1678","Type":"ContainerDied","Data":"8bb1200bac34434d2da5a2465ff8408513d7170e026189d72e473f6b410a4c5d"} Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.860877 4779 scope.go:117] "RemoveContainer" containerID="ce8a8cdbee3ec8bdd3afa257d2e95960b9efa892a5a3747351a1f054c8a041ca" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.861802 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mhgrq" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.870381 4779 generic.go:334] "Generic (PLEG): container finished" podID="f20db228-a34a-4734-ae22-53cd86de06ed" containerID="4d54ca13eb8028b8751952af0a994524eecf91147f1b70001e1c7bc8f805a25b" exitCode=0 Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.870473 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qtlz2" event={"ID":"f20db228-a34a-4734-ae22-53cd86de06ed","Type":"ContainerDied","Data":"4d54ca13eb8028b8751952af0a994524eecf91147f1b70001e1c7bc8f805a25b"} Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.870500 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qtlz2" event={"ID":"f20db228-a34a-4734-ae22-53cd86de06ed","Type":"ContainerDied","Data":"dc4ff41d4082d44bab2c4bcbb918f9d70361aa6e9326164e67e076c7be0f07a4"} Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.870554 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qtlz2" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.884395 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-9wb77" podStartSLOduration=0.884376526 podStartE2EDuration="884.376526ms" podCreationTimestamp="2026-03-20 15:30:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:30:19.864413359 +0000 UTC m=+436.826929179" watchObservedRunningTime="2026-03-20 15:30:19.884376526 +0000 UTC m=+436.846892326" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.899450 4779 scope.go:117] "RemoveContainer" containerID="857e7a7c0ea0f9c505163802912d203e03b38152573ab21887b690f58b81dff6" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.905139 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qkwpf"] Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.918533 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qkwpf"] Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.923701 4779 scope.go:117] "RemoveContainer" containerID="9997352d2e68b034ad49e40cb699e2b25de024715fcda1345da55e7f1264c6c4" Mar 20 15:30:19 crc kubenswrapper[4779]: E0320 15:30:19.924147 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9997352d2e68b034ad49e40cb699e2b25de024715fcda1345da55e7f1264c6c4\": container with ID starting with 9997352d2e68b034ad49e40cb699e2b25de024715fcda1345da55e7f1264c6c4 not found: ID does not exist" containerID="9997352d2e68b034ad49e40cb699e2b25de024715fcda1345da55e7f1264c6c4" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.924174 4779 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9997352d2e68b034ad49e40cb699e2b25de024715fcda1345da55e7f1264c6c4"} err="failed to get container status \"9997352d2e68b034ad49e40cb699e2b25de024715fcda1345da55e7f1264c6c4\": rpc error: code = NotFound desc = could not find container \"9997352d2e68b034ad49e40cb699e2b25de024715fcda1345da55e7f1264c6c4\": container with ID starting with 9997352d2e68b034ad49e40cb699e2b25de024715fcda1345da55e7f1264c6c4 not found: ID does not exist" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.924222 4779 scope.go:117] "RemoveContainer" containerID="ce8a8cdbee3ec8bdd3afa257d2e95960b9efa892a5a3747351a1f054c8a041ca" Mar 20 15:30:19 crc kubenswrapper[4779]: E0320 15:30:19.924486 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce8a8cdbee3ec8bdd3afa257d2e95960b9efa892a5a3747351a1f054c8a041ca\": container with ID starting with ce8a8cdbee3ec8bdd3afa257d2e95960b9efa892a5a3747351a1f054c8a041ca not found: ID does not exist" containerID="ce8a8cdbee3ec8bdd3afa257d2e95960b9efa892a5a3747351a1f054c8a041ca" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.924534 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce8a8cdbee3ec8bdd3afa257d2e95960b9efa892a5a3747351a1f054c8a041ca"} err="failed to get container status \"ce8a8cdbee3ec8bdd3afa257d2e95960b9efa892a5a3747351a1f054c8a041ca\": rpc error: code = NotFound desc = could not find container \"ce8a8cdbee3ec8bdd3afa257d2e95960b9efa892a5a3747351a1f054c8a041ca\": container with ID starting with ce8a8cdbee3ec8bdd3afa257d2e95960b9efa892a5a3747351a1f054c8a041ca not found: ID does not exist" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.924562 4779 scope.go:117] "RemoveContainer" containerID="857e7a7c0ea0f9c505163802912d203e03b38152573ab21887b690f58b81dff6" Mar 20 15:30:19 crc kubenswrapper[4779]: E0320 15:30:19.925213 4779 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"857e7a7c0ea0f9c505163802912d203e03b38152573ab21887b690f58b81dff6\": container with ID starting with 857e7a7c0ea0f9c505163802912d203e03b38152573ab21887b690f58b81dff6 not found: ID does not exist" containerID="857e7a7c0ea0f9c505163802912d203e03b38152573ab21887b690f58b81dff6" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.925241 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"857e7a7c0ea0f9c505163802912d203e03b38152573ab21887b690f58b81dff6"} err="failed to get container status \"857e7a7c0ea0f9c505163802912d203e03b38152573ab21887b690f58b81dff6\": rpc error: code = NotFound desc = could not find container \"857e7a7c0ea0f9c505163802912d203e03b38152573ab21887b690f58b81dff6\": container with ID starting with 857e7a7c0ea0f9c505163802912d203e03b38152573ab21887b690f58b81dff6 not found: ID does not exist" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.925256 4779 scope.go:117] "RemoveContainer" containerID="66d01ed73fc5911ed1903d997f9271e89041f1a39ab7e2ab5a8997ae6e091bca" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.926964 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4fphd"] Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.934997 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4fphd"] Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.938563 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tqlmj"] Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.941623 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tqlmj"] Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.944586 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mhgrq"] Mar 20 15:30:19 crc 
kubenswrapper[4779]: I0320 15:30:19.944800 4779 scope.go:117] "RemoveContainer" containerID="66d01ed73fc5911ed1903d997f9271e89041f1a39ab7e2ab5a8997ae6e091bca" Mar 20 15:30:19 crc kubenswrapper[4779]: E0320 15:30:19.945471 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66d01ed73fc5911ed1903d997f9271e89041f1a39ab7e2ab5a8997ae6e091bca\": container with ID starting with 66d01ed73fc5911ed1903d997f9271e89041f1a39ab7e2ab5a8997ae6e091bca not found: ID does not exist" containerID="66d01ed73fc5911ed1903d997f9271e89041f1a39ab7e2ab5a8997ae6e091bca" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.945502 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66d01ed73fc5911ed1903d997f9271e89041f1a39ab7e2ab5a8997ae6e091bca"} err="failed to get container status \"66d01ed73fc5911ed1903d997f9271e89041f1a39ab7e2ab5a8997ae6e091bca\": rpc error: code = NotFound desc = could not find container \"66d01ed73fc5911ed1903d997f9271e89041f1a39ab7e2ab5a8997ae6e091bca\": container with ID starting with 66d01ed73fc5911ed1903d997f9271e89041f1a39ab7e2ab5a8997ae6e091bca not found: ID does not exist" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.945527 4779 scope.go:117] "RemoveContainer" containerID="29c7eb606996f1f12664c7b3fa6b1a4244ae2de9775927b7ab019a27c02e9c5d" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.948143 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mhgrq"] Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.951378 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qtlz2"] Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.959487 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qtlz2"] Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.961676 4779 scope.go:117] 
"RemoveContainer" containerID="369cbc5469819bab07d373b2755027f35e52e9c0aaaf4710cad9bb52dcde306d" Mar 20 15:30:19 crc kubenswrapper[4779]: I0320 15:30:19.985361 4779 scope.go:117] "RemoveContainer" containerID="89020eebdeec3186af339fd2e72ea11348dc3b7f02ed93e64be9c22b245bcf06" Mar 20 15:30:20 crc kubenswrapper[4779]: I0320 15:30:20.002454 4779 scope.go:117] "RemoveContainer" containerID="29c7eb606996f1f12664c7b3fa6b1a4244ae2de9775927b7ab019a27c02e9c5d" Mar 20 15:30:20 crc kubenswrapper[4779]: E0320 15:30:20.004068 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29c7eb606996f1f12664c7b3fa6b1a4244ae2de9775927b7ab019a27c02e9c5d\": container with ID starting with 29c7eb606996f1f12664c7b3fa6b1a4244ae2de9775927b7ab019a27c02e9c5d not found: ID does not exist" containerID="29c7eb606996f1f12664c7b3fa6b1a4244ae2de9775927b7ab019a27c02e9c5d" Mar 20 15:30:20 crc kubenswrapper[4779]: I0320 15:30:20.004137 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29c7eb606996f1f12664c7b3fa6b1a4244ae2de9775927b7ab019a27c02e9c5d"} err="failed to get container status \"29c7eb606996f1f12664c7b3fa6b1a4244ae2de9775927b7ab019a27c02e9c5d\": rpc error: code = NotFound desc = could not find container \"29c7eb606996f1f12664c7b3fa6b1a4244ae2de9775927b7ab019a27c02e9c5d\": container with ID starting with 29c7eb606996f1f12664c7b3fa6b1a4244ae2de9775927b7ab019a27c02e9c5d not found: ID does not exist" Mar 20 15:30:20 crc kubenswrapper[4779]: I0320 15:30:20.004171 4779 scope.go:117] "RemoveContainer" containerID="369cbc5469819bab07d373b2755027f35e52e9c0aaaf4710cad9bb52dcde306d" Mar 20 15:30:20 crc kubenswrapper[4779]: E0320 15:30:20.004685 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"369cbc5469819bab07d373b2755027f35e52e9c0aaaf4710cad9bb52dcde306d\": container with ID starting with 
369cbc5469819bab07d373b2755027f35e52e9c0aaaf4710cad9bb52dcde306d not found: ID does not exist" containerID="369cbc5469819bab07d373b2755027f35e52e9c0aaaf4710cad9bb52dcde306d" Mar 20 15:30:20 crc kubenswrapper[4779]: I0320 15:30:20.004729 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"369cbc5469819bab07d373b2755027f35e52e9c0aaaf4710cad9bb52dcde306d"} err="failed to get container status \"369cbc5469819bab07d373b2755027f35e52e9c0aaaf4710cad9bb52dcde306d\": rpc error: code = NotFound desc = could not find container \"369cbc5469819bab07d373b2755027f35e52e9c0aaaf4710cad9bb52dcde306d\": container with ID starting with 369cbc5469819bab07d373b2755027f35e52e9c0aaaf4710cad9bb52dcde306d not found: ID does not exist" Mar 20 15:30:20 crc kubenswrapper[4779]: I0320 15:30:20.004745 4779 scope.go:117] "RemoveContainer" containerID="89020eebdeec3186af339fd2e72ea11348dc3b7f02ed93e64be9c22b245bcf06" Mar 20 15:30:20 crc kubenswrapper[4779]: E0320 15:30:20.005195 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89020eebdeec3186af339fd2e72ea11348dc3b7f02ed93e64be9c22b245bcf06\": container with ID starting with 89020eebdeec3186af339fd2e72ea11348dc3b7f02ed93e64be9c22b245bcf06 not found: ID does not exist" containerID="89020eebdeec3186af339fd2e72ea11348dc3b7f02ed93e64be9c22b245bcf06" Mar 20 15:30:20 crc kubenswrapper[4779]: I0320 15:30:20.005220 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89020eebdeec3186af339fd2e72ea11348dc3b7f02ed93e64be9c22b245bcf06"} err="failed to get container status \"89020eebdeec3186af339fd2e72ea11348dc3b7f02ed93e64be9c22b245bcf06\": rpc error: code = NotFound desc = could not find container \"89020eebdeec3186af339fd2e72ea11348dc3b7f02ed93e64be9c22b245bcf06\": container with ID starting with 89020eebdeec3186af339fd2e72ea11348dc3b7f02ed93e64be9c22b245bcf06 not found: ID does not 
exist" Mar 20 15:30:20 crc kubenswrapper[4779]: I0320 15:30:20.005236 4779 scope.go:117] "RemoveContainer" containerID="bb3314fca7615109ddbc4c31d1a8def8d062194d0d7b6e9041b410613d457de8" Mar 20 15:30:20 crc kubenswrapper[4779]: I0320 15:30:20.026496 4779 scope.go:117] "RemoveContainer" containerID="e2983b5c23a5652cb569f910fdab97a10f99d11758854148926f0a2e2762781c" Mar 20 15:30:20 crc kubenswrapper[4779]: I0320 15:30:20.040414 4779 scope.go:117] "RemoveContainer" containerID="c71c15c973609dfbbfa4a511d4d7ff71120b8c0e63ef06049f1bc629f4d0ab4b" Mar 20 15:30:20 crc kubenswrapper[4779]: I0320 15:30:20.053312 4779 scope.go:117] "RemoveContainer" containerID="bb3314fca7615109ddbc4c31d1a8def8d062194d0d7b6e9041b410613d457de8" Mar 20 15:30:20 crc kubenswrapper[4779]: E0320 15:30:20.054523 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb3314fca7615109ddbc4c31d1a8def8d062194d0d7b6e9041b410613d457de8\": container with ID starting with bb3314fca7615109ddbc4c31d1a8def8d062194d0d7b6e9041b410613d457de8 not found: ID does not exist" containerID="bb3314fca7615109ddbc4c31d1a8def8d062194d0d7b6e9041b410613d457de8" Mar 20 15:30:20 crc kubenswrapper[4779]: I0320 15:30:20.054611 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb3314fca7615109ddbc4c31d1a8def8d062194d0d7b6e9041b410613d457de8"} err="failed to get container status \"bb3314fca7615109ddbc4c31d1a8def8d062194d0d7b6e9041b410613d457de8\": rpc error: code = NotFound desc = could not find container \"bb3314fca7615109ddbc4c31d1a8def8d062194d0d7b6e9041b410613d457de8\": container with ID starting with bb3314fca7615109ddbc4c31d1a8def8d062194d0d7b6e9041b410613d457de8 not found: ID does not exist" Mar 20 15:30:20 crc kubenswrapper[4779]: I0320 15:30:20.054639 4779 scope.go:117] "RemoveContainer" containerID="e2983b5c23a5652cb569f910fdab97a10f99d11758854148926f0a2e2762781c" Mar 20 15:30:20 crc 
kubenswrapper[4779]: E0320 15:30:20.054909 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2983b5c23a5652cb569f910fdab97a10f99d11758854148926f0a2e2762781c\": container with ID starting with e2983b5c23a5652cb569f910fdab97a10f99d11758854148926f0a2e2762781c not found: ID does not exist" containerID="e2983b5c23a5652cb569f910fdab97a10f99d11758854148926f0a2e2762781c" Mar 20 15:30:20 crc kubenswrapper[4779]: I0320 15:30:20.054932 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2983b5c23a5652cb569f910fdab97a10f99d11758854148926f0a2e2762781c"} err="failed to get container status \"e2983b5c23a5652cb569f910fdab97a10f99d11758854148926f0a2e2762781c\": rpc error: code = NotFound desc = could not find container \"e2983b5c23a5652cb569f910fdab97a10f99d11758854148926f0a2e2762781c\": container with ID starting with e2983b5c23a5652cb569f910fdab97a10f99d11758854148926f0a2e2762781c not found: ID does not exist" Mar 20 15:30:20 crc kubenswrapper[4779]: I0320 15:30:20.054960 4779 scope.go:117] "RemoveContainer" containerID="c71c15c973609dfbbfa4a511d4d7ff71120b8c0e63ef06049f1bc629f4d0ab4b" Mar 20 15:30:20 crc kubenswrapper[4779]: E0320 15:30:20.055227 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c71c15c973609dfbbfa4a511d4d7ff71120b8c0e63ef06049f1bc629f4d0ab4b\": container with ID starting with c71c15c973609dfbbfa4a511d4d7ff71120b8c0e63ef06049f1bc629f4d0ab4b not found: ID does not exist" containerID="c71c15c973609dfbbfa4a511d4d7ff71120b8c0e63ef06049f1bc629f4d0ab4b" Mar 20 15:30:20 crc kubenswrapper[4779]: I0320 15:30:20.055261 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c71c15c973609dfbbfa4a511d4d7ff71120b8c0e63ef06049f1bc629f4d0ab4b"} err="failed to get container status 
\"c71c15c973609dfbbfa4a511d4d7ff71120b8c0e63ef06049f1bc629f4d0ab4b\": rpc error: code = NotFound desc = could not find container \"c71c15c973609dfbbfa4a511d4d7ff71120b8c0e63ef06049f1bc629f4d0ab4b\": container with ID starting with c71c15c973609dfbbfa4a511d4d7ff71120b8c0e63ef06049f1bc629f4d0ab4b not found: ID does not exist" Mar 20 15:30:20 crc kubenswrapper[4779]: I0320 15:30:20.055283 4779 scope.go:117] "RemoveContainer" containerID="4d54ca13eb8028b8751952af0a994524eecf91147f1b70001e1c7bc8f805a25b" Mar 20 15:30:20 crc kubenswrapper[4779]: I0320 15:30:20.067165 4779 scope.go:117] "RemoveContainer" containerID="ec5628f1fc79845ad6d4e028b0424bb230020610c56d9f1fdc132ea3f506e521" Mar 20 15:30:20 crc kubenswrapper[4779]: I0320 15:30:20.081420 4779 scope.go:117] "RemoveContainer" containerID="a27a9bf1aaf89b9bcffb19170b25c1f81b75a0dbf842d2da720c188a5afaf9ce" Mar 20 15:30:20 crc kubenswrapper[4779]: I0320 15:30:20.093850 4779 scope.go:117] "RemoveContainer" containerID="4d54ca13eb8028b8751952af0a994524eecf91147f1b70001e1c7bc8f805a25b" Mar 20 15:30:20 crc kubenswrapper[4779]: E0320 15:30:20.094273 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d54ca13eb8028b8751952af0a994524eecf91147f1b70001e1c7bc8f805a25b\": container with ID starting with 4d54ca13eb8028b8751952af0a994524eecf91147f1b70001e1c7bc8f805a25b not found: ID does not exist" containerID="4d54ca13eb8028b8751952af0a994524eecf91147f1b70001e1c7bc8f805a25b" Mar 20 15:30:20 crc kubenswrapper[4779]: I0320 15:30:20.094309 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d54ca13eb8028b8751952af0a994524eecf91147f1b70001e1c7bc8f805a25b"} err="failed to get container status \"4d54ca13eb8028b8751952af0a994524eecf91147f1b70001e1c7bc8f805a25b\": rpc error: code = NotFound desc = could not find container \"4d54ca13eb8028b8751952af0a994524eecf91147f1b70001e1c7bc8f805a25b\": container with ID starting 
with 4d54ca13eb8028b8751952af0a994524eecf91147f1b70001e1c7bc8f805a25b not found: ID does not exist" Mar 20 15:30:20 crc kubenswrapper[4779]: I0320 15:30:20.094332 4779 scope.go:117] "RemoveContainer" containerID="ec5628f1fc79845ad6d4e028b0424bb230020610c56d9f1fdc132ea3f506e521" Mar 20 15:30:20 crc kubenswrapper[4779]: E0320 15:30:20.095036 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec5628f1fc79845ad6d4e028b0424bb230020610c56d9f1fdc132ea3f506e521\": container with ID starting with ec5628f1fc79845ad6d4e028b0424bb230020610c56d9f1fdc132ea3f506e521 not found: ID does not exist" containerID="ec5628f1fc79845ad6d4e028b0424bb230020610c56d9f1fdc132ea3f506e521" Mar 20 15:30:20 crc kubenswrapper[4779]: I0320 15:30:20.095066 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec5628f1fc79845ad6d4e028b0424bb230020610c56d9f1fdc132ea3f506e521"} err="failed to get container status \"ec5628f1fc79845ad6d4e028b0424bb230020610c56d9f1fdc132ea3f506e521\": rpc error: code = NotFound desc = could not find container \"ec5628f1fc79845ad6d4e028b0424bb230020610c56d9f1fdc132ea3f506e521\": container with ID starting with ec5628f1fc79845ad6d4e028b0424bb230020610c56d9f1fdc132ea3f506e521 not found: ID does not exist" Mar 20 15:30:20 crc kubenswrapper[4779]: I0320 15:30:20.095087 4779 scope.go:117] "RemoveContainer" containerID="a27a9bf1aaf89b9bcffb19170b25c1f81b75a0dbf842d2da720c188a5afaf9ce" Mar 20 15:30:20 crc kubenswrapper[4779]: E0320 15:30:20.095403 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a27a9bf1aaf89b9bcffb19170b25c1f81b75a0dbf842d2da720c188a5afaf9ce\": container with ID starting with a27a9bf1aaf89b9bcffb19170b25c1f81b75a0dbf842d2da720c188a5afaf9ce not found: ID does not exist" containerID="a27a9bf1aaf89b9bcffb19170b25c1f81b75a0dbf842d2da720c188a5afaf9ce" Mar 20 15:30:20 
crc kubenswrapper[4779]: I0320 15:30:20.095434 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a27a9bf1aaf89b9bcffb19170b25c1f81b75a0dbf842d2da720c188a5afaf9ce"} err="failed to get container status \"a27a9bf1aaf89b9bcffb19170b25c1f81b75a0dbf842d2da720c188a5afaf9ce\": rpc error: code = NotFound desc = could not find container \"a27a9bf1aaf89b9bcffb19170b25c1f81b75a0dbf842d2da720c188a5afaf9ce\": container with ID starting with a27a9bf1aaf89b9bcffb19170b25c1f81b75a0dbf842d2da720c188a5afaf9ce not found: ID does not exist" Mar 20 15:30:20 crc kubenswrapper[4779]: I0320 15:30:20.884694 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-9wb77" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.200754 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-df7rk"] Mar 20 15:30:21 crc kubenswrapper[4779]: E0320 15:30:21.201013 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="437c3121-8b7a-48b1-b805-540a41e89b6a" containerName="extract-utilities" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.201028 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="437c3121-8b7a-48b1-b805-540a41e89b6a" containerName="extract-utilities" Mar 20 15:30:21 crc kubenswrapper[4779]: E0320 15:30:21.201038 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b458c9c5-3878-42e1-995b-713f56d36b25" containerName="marketplace-operator" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.201045 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="b458c9c5-3878-42e1-995b-713f56d36b25" containerName="marketplace-operator" Mar 20 15:30:21 crc kubenswrapper[4779]: E0320 15:30:21.201057 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57ad9993-5b81-4487-8f70-37e41aca1678" containerName="extract-utilities" Mar 20 15:30:21 crc 
kubenswrapper[4779]: I0320 15:30:21.201068 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="57ad9993-5b81-4487-8f70-37e41aca1678" containerName="extract-utilities" Mar 20 15:30:21 crc kubenswrapper[4779]: E0320 15:30:21.201079 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f20db228-a34a-4734-ae22-53cd86de06ed" containerName="extract-utilities" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.201088 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="f20db228-a34a-4734-ae22-53cd86de06ed" containerName="extract-utilities" Mar 20 15:30:21 crc kubenswrapper[4779]: E0320 15:30:21.201097 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="437c3121-8b7a-48b1-b805-540a41e89b6a" containerName="registry-server" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.201107 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="437c3121-8b7a-48b1-b805-540a41e89b6a" containerName="registry-server" Mar 20 15:30:21 crc kubenswrapper[4779]: E0320 15:30:21.201139 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23f0a6ab-6ded-4310-82ff-98ea45fa3a43" containerName="extract-utilities" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.201146 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="23f0a6ab-6ded-4310-82ff-98ea45fa3a43" containerName="extract-utilities" Mar 20 15:30:21 crc kubenswrapper[4779]: E0320 15:30:21.201156 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="437c3121-8b7a-48b1-b805-540a41e89b6a" containerName="extract-content" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.201162 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="437c3121-8b7a-48b1-b805-540a41e89b6a" containerName="extract-content" Mar 20 15:30:21 crc kubenswrapper[4779]: E0320 15:30:21.201175 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f20db228-a34a-4734-ae22-53cd86de06ed" containerName="extract-content" Mar 20 15:30:21 crc 
kubenswrapper[4779]: I0320 15:30:21.201182 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="f20db228-a34a-4734-ae22-53cd86de06ed" containerName="extract-content" Mar 20 15:30:21 crc kubenswrapper[4779]: E0320 15:30:21.201197 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23f0a6ab-6ded-4310-82ff-98ea45fa3a43" containerName="extract-content" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.201203 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="23f0a6ab-6ded-4310-82ff-98ea45fa3a43" containerName="extract-content" Mar 20 15:30:21 crc kubenswrapper[4779]: E0320 15:30:21.201214 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23f0a6ab-6ded-4310-82ff-98ea45fa3a43" containerName="registry-server" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.201222 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="23f0a6ab-6ded-4310-82ff-98ea45fa3a43" containerName="registry-server" Mar 20 15:30:21 crc kubenswrapper[4779]: E0320 15:30:21.201230 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f20db228-a34a-4734-ae22-53cd86de06ed" containerName="registry-server" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.201238 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="f20db228-a34a-4734-ae22-53cd86de06ed" containerName="registry-server" Mar 20 15:30:21 crc kubenswrapper[4779]: E0320 15:30:21.201248 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57ad9993-5b81-4487-8f70-37e41aca1678" containerName="extract-content" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.201255 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="57ad9993-5b81-4487-8f70-37e41aca1678" containerName="extract-content" Mar 20 15:30:21 crc kubenswrapper[4779]: E0320 15:30:21.201267 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57ad9993-5b81-4487-8f70-37e41aca1678" containerName="registry-server" Mar 20 15:30:21 crc 
kubenswrapper[4779]: I0320 15:30:21.201275 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="57ad9993-5b81-4487-8f70-37e41aca1678" containerName="registry-server" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.201393 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="f20db228-a34a-4734-ae22-53cd86de06ed" containerName="registry-server" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.201412 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="b458c9c5-3878-42e1-995b-713f56d36b25" containerName="marketplace-operator" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.201422 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="23f0a6ab-6ded-4310-82ff-98ea45fa3a43" containerName="registry-server" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.201432 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="57ad9993-5b81-4487-8f70-37e41aca1678" containerName="registry-server" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.201441 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="437c3121-8b7a-48b1-b805-540a41e89b6a" containerName="registry-server" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.202246 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-df7rk" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.203913 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.212962 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-df7rk"] Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.316352 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d3b26c1-8c12-4764-a415-0e7db617947b-utilities\") pod \"community-operators-df7rk\" (UID: \"9d3b26c1-8c12-4764-a415-0e7db617947b\") " pod="openshift-marketplace/community-operators-df7rk" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.316393 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d3b26c1-8c12-4764-a415-0e7db617947b-catalog-content\") pod \"community-operators-df7rk\" (UID: \"9d3b26c1-8c12-4764-a415-0e7db617947b\") " pod="openshift-marketplace/community-operators-df7rk" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.316414 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gtsc\" (UniqueName: \"kubernetes.io/projected/9d3b26c1-8c12-4764-a415-0e7db617947b-kube-api-access-2gtsc\") pod \"community-operators-df7rk\" (UID: \"9d3b26c1-8c12-4764-a415-0e7db617947b\") " pod="openshift-marketplace/community-operators-df7rk" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.403963 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cj6mh"] Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.405491 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cj6mh" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.409905 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.410465 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cj6mh"] Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.417590 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d3b26c1-8c12-4764-a415-0e7db617947b-utilities\") pod \"community-operators-df7rk\" (UID: \"9d3b26c1-8c12-4764-a415-0e7db617947b\") " pod="openshift-marketplace/community-operators-df7rk" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.417631 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d3b26c1-8c12-4764-a415-0e7db617947b-catalog-content\") pod \"community-operators-df7rk\" (UID: \"9d3b26c1-8c12-4764-a415-0e7db617947b\") " pod="openshift-marketplace/community-operators-df7rk" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.417649 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gtsc\" (UniqueName: \"kubernetes.io/projected/9d3b26c1-8c12-4764-a415-0e7db617947b-kube-api-access-2gtsc\") pod \"community-operators-df7rk\" (UID: \"9d3b26c1-8c12-4764-a415-0e7db617947b\") " pod="openshift-marketplace/community-operators-df7rk" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.418066 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d3b26c1-8c12-4764-a415-0e7db617947b-utilities\") pod \"community-operators-df7rk\" (UID: \"9d3b26c1-8c12-4764-a415-0e7db617947b\") " 
pod="openshift-marketplace/community-operators-df7rk" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.418146 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d3b26c1-8c12-4764-a415-0e7db617947b-catalog-content\") pod \"community-operators-df7rk\" (UID: \"9d3b26c1-8c12-4764-a415-0e7db617947b\") " pod="openshift-marketplace/community-operators-df7rk" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.439092 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gtsc\" (UniqueName: \"kubernetes.io/projected/9d3b26c1-8c12-4764-a415-0e7db617947b-kube-api-access-2gtsc\") pod \"community-operators-df7rk\" (UID: \"9d3b26c1-8c12-4764-a415-0e7db617947b\") " pod="openshift-marketplace/community-operators-df7rk" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.518239 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecd35186-bfd4-47da-b12b-8073d45e78f7-utilities\") pod \"certified-operators-cj6mh\" (UID: \"ecd35186-bfd4-47da-b12b-8073d45e78f7\") " pod="openshift-marketplace/certified-operators-cj6mh" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.518286 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz9sm\" (UniqueName: \"kubernetes.io/projected/ecd35186-bfd4-47da-b12b-8073d45e78f7-kube-api-access-jz9sm\") pod \"certified-operators-cj6mh\" (UID: \"ecd35186-bfd4-47da-b12b-8073d45e78f7\") " pod="openshift-marketplace/certified-operators-cj6mh" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.518368 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecd35186-bfd4-47da-b12b-8073d45e78f7-catalog-content\") pod \"certified-operators-cj6mh\" (UID: 
\"ecd35186-bfd4-47da-b12b-8073d45e78f7\") " pod="openshift-marketplace/certified-operators-cj6mh" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.519084 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-df7rk" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.619354 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecd35186-bfd4-47da-b12b-8073d45e78f7-catalog-content\") pod \"certified-operators-cj6mh\" (UID: \"ecd35186-bfd4-47da-b12b-8073d45e78f7\") " pod="openshift-marketplace/certified-operators-cj6mh" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.619424 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecd35186-bfd4-47da-b12b-8073d45e78f7-utilities\") pod \"certified-operators-cj6mh\" (UID: \"ecd35186-bfd4-47da-b12b-8073d45e78f7\") " pod="openshift-marketplace/certified-operators-cj6mh" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.619446 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz9sm\" (UniqueName: \"kubernetes.io/projected/ecd35186-bfd4-47da-b12b-8073d45e78f7-kube-api-access-jz9sm\") pod \"certified-operators-cj6mh\" (UID: \"ecd35186-bfd4-47da-b12b-8073d45e78f7\") " pod="openshift-marketplace/certified-operators-cj6mh" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.619977 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecd35186-bfd4-47da-b12b-8073d45e78f7-catalog-content\") pod \"certified-operators-cj6mh\" (UID: \"ecd35186-bfd4-47da-b12b-8073d45e78f7\") " pod="openshift-marketplace/certified-operators-cj6mh" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.619976 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecd35186-bfd4-47da-b12b-8073d45e78f7-utilities\") pod \"certified-operators-cj6mh\" (UID: \"ecd35186-bfd4-47da-b12b-8073d45e78f7\") " pod="openshift-marketplace/certified-operators-cj6mh" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.636417 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz9sm\" (UniqueName: \"kubernetes.io/projected/ecd35186-bfd4-47da-b12b-8073d45e78f7-kube-api-access-jz9sm\") pod \"certified-operators-cj6mh\" (UID: \"ecd35186-bfd4-47da-b12b-8073d45e78f7\") " pod="openshift-marketplace/certified-operators-cj6mh" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.697010 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-df7rk"] Mar 20 15:30:21 crc kubenswrapper[4779]: W0320 15:30:21.705668 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d3b26c1_8c12_4764_a415_0e7db617947b.slice/crio-1cafd93b7881925cf398d69841ca86c0e62f1b483fbdb799b17b1ca6cafac0fd WatchSource:0}: Error finding container 1cafd93b7881925cf398d69841ca86c0e62f1b483fbdb799b17b1ca6cafac0fd: Status 404 returned error can't find the container with id 1cafd93b7881925cf398d69841ca86c0e62f1b483fbdb799b17b1ca6cafac0fd Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.720891 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cj6mh" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.815340 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23f0a6ab-6ded-4310-82ff-98ea45fa3a43" path="/var/lib/kubelet/pods/23f0a6ab-6ded-4310-82ff-98ea45fa3a43/volumes" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.816420 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="437c3121-8b7a-48b1-b805-540a41e89b6a" path="/var/lib/kubelet/pods/437c3121-8b7a-48b1-b805-540a41e89b6a/volumes" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.817165 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57ad9993-5b81-4487-8f70-37e41aca1678" path="/var/lib/kubelet/pods/57ad9993-5b81-4487-8f70-37e41aca1678/volumes" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.818398 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b458c9c5-3878-42e1-995b-713f56d36b25" path="/var/lib/kubelet/pods/b458c9c5-3878-42e1-995b-713f56d36b25/volumes" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.818935 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f20db228-a34a-4734-ae22-53cd86de06ed" path="/var/lib/kubelet/pods/f20db228-a34a-4734-ae22-53cd86de06ed/volumes" Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.899846 4779 generic.go:334] "Generic (PLEG): container finished" podID="9d3b26c1-8c12-4764-a415-0e7db617947b" containerID="3926a975ec6b531fc8794845baf9e5626a50064ed182e31e8745bc9ac80521e2" exitCode=0 Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.902095 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-df7rk" event={"ID":"9d3b26c1-8c12-4764-a415-0e7db617947b","Type":"ContainerDied","Data":"3926a975ec6b531fc8794845baf9e5626a50064ed182e31e8745bc9ac80521e2"} Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.902235 4779 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-df7rk" event={"ID":"9d3b26c1-8c12-4764-a415-0e7db617947b","Type":"ContainerStarted","Data":"1cafd93b7881925cf398d69841ca86c0e62f1b483fbdb799b17b1ca6cafac0fd"} Mar 20 15:30:21 crc kubenswrapper[4779]: I0320 15:30:21.906232 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cj6mh"] Mar 20 15:30:22 crc kubenswrapper[4779]: I0320 15:30:22.905404 4779 generic.go:334] "Generic (PLEG): container finished" podID="ecd35186-bfd4-47da-b12b-8073d45e78f7" containerID="8329f4d3b656da86960f1d7211fa2b993e8da9faa2bfba18b10cb2a8804092ad" exitCode=0 Mar 20 15:30:22 crc kubenswrapper[4779]: I0320 15:30:22.905452 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cj6mh" event={"ID":"ecd35186-bfd4-47da-b12b-8073d45e78f7","Type":"ContainerDied","Data":"8329f4d3b656da86960f1d7211fa2b993e8da9faa2bfba18b10cb2a8804092ad"} Mar 20 15:30:22 crc kubenswrapper[4779]: I0320 15:30:22.905504 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cj6mh" event={"ID":"ecd35186-bfd4-47da-b12b-8073d45e78f7","Type":"ContainerStarted","Data":"2d7022892ef7520200b388ae68b649629a27b70b5e9f89e8757b23c28fbb8901"} Mar 20 15:30:22 crc kubenswrapper[4779]: I0320 15:30:22.908935 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-df7rk" event={"ID":"9d3b26c1-8c12-4764-a415-0e7db617947b","Type":"ContainerStarted","Data":"13d8d4738d348d3469af195a5f8b3d2d93bdd2846dc4b75cfbe529a1aa599408"} Mar 20 15:30:23 crc kubenswrapper[4779]: I0320 15:30:23.602929 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m8pg5"] Mar 20 15:30:23 crc kubenswrapper[4779]: I0320 15:30:23.607146 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m8pg5" Mar 20 15:30:23 crc kubenswrapper[4779]: I0320 15:30:23.608579 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m8pg5"] Mar 20 15:30:23 crc kubenswrapper[4779]: I0320 15:30:23.612077 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 15:30:23 crc kubenswrapper[4779]: I0320 15:30:23.645686 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11d8d242-c472-4367-9462-1d55253e19b0-utilities\") pod \"redhat-marketplace-m8pg5\" (UID: \"11d8d242-c472-4367-9462-1d55253e19b0\") " pod="openshift-marketplace/redhat-marketplace-m8pg5" Mar 20 15:30:23 crc kubenswrapper[4779]: I0320 15:30:23.645743 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59b9w\" (UniqueName: \"kubernetes.io/projected/11d8d242-c472-4367-9462-1d55253e19b0-kube-api-access-59b9w\") pod \"redhat-marketplace-m8pg5\" (UID: \"11d8d242-c472-4367-9462-1d55253e19b0\") " pod="openshift-marketplace/redhat-marketplace-m8pg5" Mar 20 15:30:23 crc kubenswrapper[4779]: I0320 15:30:23.645765 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11d8d242-c472-4367-9462-1d55253e19b0-catalog-content\") pod \"redhat-marketplace-m8pg5\" (UID: \"11d8d242-c472-4367-9462-1d55253e19b0\") " pod="openshift-marketplace/redhat-marketplace-m8pg5" Mar 20 15:30:23 crc kubenswrapper[4779]: I0320 15:30:23.746453 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11d8d242-c472-4367-9462-1d55253e19b0-utilities\") pod \"redhat-marketplace-m8pg5\" (UID: 
\"11d8d242-c472-4367-9462-1d55253e19b0\") " pod="openshift-marketplace/redhat-marketplace-m8pg5" Mar 20 15:30:23 crc kubenswrapper[4779]: I0320 15:30:23.746787 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59b9w\" (UniqueName: \"kubernetes.io/projected/11d8d242-c472-4367-9462-1d55253e19b0-kube-api-access-59b9w\") pod \"redhat-marketplace-m8pg5\" (UID: \"11d8d242-c472-4367-9462-1d55253e19b0\") " pod="openshift-marketplace/redhat-marketplace-m8pg5" Mar 20 15:30:23 crc kubenswrapper[4779]: I0320 15:30:23.746815 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11d8d242-c472-4367-9462-1d55253e19b0-catalog-content\") pod \"redhat-marketplace-m8pg5\" (UID: \"11d8d242-c472-4367-9462-1d55253e19b0\") " pod="openshift-marketplace/redhat-marketplace-m8pg5" Mar 20 15:30:23 crc kubenswrapper[4779]: I0320 15:30:23.747091 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11d8d242-c472-4367-9462-1d55253e19b0-utilities\") pod \"redhat-marketplace-m8pg5\" (UID: \"11d8d242-c472-4367-9462-1d55253e19b0\") " pod="openshift-marketplace/redhat-marketplace-m8pg5" Mar 20 15:30:23 crc kubenswrapper[4779]: I0320 15:30:23.747195 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11d8d242-c472-4367-9462-1d55253e19b0-catalog-content\") pod \"redhat-marketplace-m8pg5\" (UID: \"11d8d242-c472-4367-9462-1d55253e19b0\") " pod="openshift-marketplace/redhat-marketplace-m8pg5" Mar 20 15:30:23 crc kubenswrapper[4779]: I0320 15:30:23.770027 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59b9w\" (UniqueName: \"kubernetes.io/projected/11d8d242-c472-4367-9462-1d55253e19b0-kube-api-access-59b9w\") pod \"redhat-marketplace-m8pg5\" (UID: 
\"11d8d242-c472-4367-9462-1d55253e19b0\") " pod="openshift-marketplace/redhat-marketplace-m8pg5" Mar 20 15:30:23 crc kubenswrapper[4779]: I0320 15:30:23.798687 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dmv6d"] Mar 20 15:30:23 crc kubenswrapper[4779]: I0320 15:30:23.799657 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dmv6d" Mar 20 15:30:23 crc kubenswrapper[4779]: I0320 15:30:23.801707 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 15:30:23 crc kubenswrapper[4779]: I0320 15:30:23.814097 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dmv6d"] Mar 20 15:30:23 crc kubenswrapper[4779]: I0320 15:30:23.856891 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwnw6\" (UniqueName: \"kubernetes.io/projected/41c24f7d-d859-4eb5-b775-5893df11cf84-kube-api-access-lwnw6\") pod \"redhat-operators-dmv6d\" (UID: \"41c24f7d-d859-4eb5-b775-5893df11cf84\") " pod="openshift-marketplace/redhat-operators-dmv6d" Mar 20 15:30:23 crc kubenswrapper[4779]: I0320 15:30:23.857013 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41c24f7d-d859-4eb5-b775-5893df11cf84-catalog-content\") pod \"redhat-operators-dmv6d\" (UID: \"41c24f7d-d859-4eb5-b775-5893df11cf84\") " pod="openshift-marketplace/redhat-operators-dmv6d" Mar 20 15:30:23 crc kubenswrapper[4779]: I0320 15:30:23.857136 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41c24f7d-d859-4eb5-b775-5893df11cf84-utilities\") pod \"redhat-operators-dmv6d\" (UID: \"41c24f7d-d859-4eb5-b775-5893df11cf84\") " 
pod="openshift-marketplace/redhat-operators-dmv6d" Mar 20 15:30:23 crc kubenswrapper[4779]: I0320 15:30:23.915811 4779 generic.go:334] "Generic (PLEG): container finished" podID="9d3b26c1-8c12-4764-a415-0e7db617947b" containerID="13d8d4738d348d3469af195a5f8b3d2d93bdd2846dc4b75cfbe529a1aa599408" exitCode=0 Mar 20 15:30:23 crc kubenswrapper[4779]: I0320 15:30:23.915877 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-df7rk" event={"ID":"9d3b26c1-8c12-4764-a415-0e7db617947b","Type":"ContainerDied","Data":"13d8d4738d348d3469af195a5f8b3d2d93bdd2846dc4b75cfbe529a1aa599408"} Mar 20 15:30:23 crc kubenswrapper[4779]: I0320 15:30:23.920223 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cj6mh" event={"ID":"ecd35186-bfd4-47da-b12b-8073d45e78f7","Type":"ContainerStarted","Data":"4d6145f4c95bede6e48c56f994fa0aab5cbe6dd48ce90cbc8d90df9e328ac593"} Mar 20 15:30:23 crc kubenswrapper[4779]: I0320 15:30:23.927601 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m8pg5" Mar 20 15:30:23 crc kubenswrapper[4779]: I0320 15:30:23.959797 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41c24f7d-d859-4eb5-b775-5893df11cf84-catalog-content\") pod \"redhat-operators-dmv6d\" (UID: \"41c24f7d-d859-4eb5-b775-5893df11cf84\") " pod="openshift-marketplace/redhat-operators-dmv6d" Mar 20 15:30:23 crc kubenswrapper[4779]: I0320 15:30:23.959852 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41c24f7d-d859-4eb5-b775-5893df11cf84-utilities\") pod \"redhat-operators-dmv6d\" (UID: \"41c24f7d-d859-4eb5-b775-5893df11cf84\") " pod="openshift-marketplace/redhat-operators-dmv6d" Mar 20 15:30:23 crc kubenswrapper[4779]: I0320 15:30:23.959929 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwnw6\" (UniqueName: \"kubernetes.io/projected/41c24f7d-d859-4eb5-b775-5893df11cf84-kube-api-access-lwnw6\") pod \"redhat-operators-dmv6d\" (UID: \"41c24f7d-d859-4eb5-b775-5893df11cf84\") " pod="openshift-marketplace/redhat-operators-dmv6d" Mar 20 15:30:23 crc kubenswrapper[4779]: I0320 15:30:23.960754 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41c24f7d-d859-4eb5-b775-5893df11cf84-utilities\") pod \"redhat-operators-dmv6d\" (UID: \"41c24f7d-d859-4eb5-b775-5893df11cf84\") " pod="openshift-marketplace/redhat-operators-dmv6d" Mar 20 15:30:23 crc kubenswrapper[4779]: I0320 15:30:23.961051 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41c24f7d-d859-4eb5-b775-5893df11cf84-catalog-content\") pod \"redhat-operators-dmv6d\" (UID: \"41c24f7d-d859-4eb5-b775-5893df11cf84\") " 
pod="openshift-marketplace/redhat-operators-dmv6d" Mar 20 15:30:23 crc kubenswrapper[4779]: I0320 15:30:23.977477 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwnw6\" (UniqueName: \"kubernetes.io/projected/41c24f7d-d859-4eb5-b775-5893df11cf84-kube-api-access-lwnw6\") pod \"redhat-operators-dmv6d\" (UID: \"41c24f7d-d859-4eb5-b775-5893df11cf84\") " pod="openshift-marketplace/redhat-operators-dmv6d" Mar 20 15:30:24 crc kubenswrapper[4779]: I0320 15:30:24.127648 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dmv6d" Mar 20 15:30:24 crc kubenswrapper[4779]: I0320 15:30:24.289439 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dmv6d"] Mar 20 15:30:24 crc kubenswrapper[4779]: W0320 15:30:24.297638 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41c24f7d_d859_4eb5_b775_5893df11cf84.slice/crio-27ad64a7498d911630dacabf703f5388d2a47e0079d94127604e63d2ae998417 WatchSource:0}: Error finding container 27ad64a7498d911630dacabf703f5388d2a47e0079d94127604e63d2ae998417: Status 404 returned error can't find the container with id 27ad64a7498d911630dacabf703f5388d2a47e0079d94127604e63d2ae998417 Mar 20 15:30:24 crc kubenswrapper[4779]: I0320 15:30:24.316781 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m8pg5"] Mar 20 15:30:24 crc kubenswrapper[4779]: W0320 15:30:24.328291 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11d8d242_c472_4367_9462_1d55253e19b0.slice/crio-20f0a4108180e636cea691c9eea8d2d87499c4737d5cfb489ae532eea3d1be6a WatchSource:0}: Error finding container 20f0a4108180e636cea691c9eea8d2d87499c4737d5cfb489ae532eea3d1be6a: Status 404 returned error can't find the container with id 
20f0a4108180e636cea691c9eea8d2d87499c4737d5cfb489ae532eea3d1be6a Mar 20 15:30:24 crc kubenswrapper[4779]: I0320 15:30:24.929515 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-df7rk" event={"ID":"9d3b26c1-8c12-4764-a415-0e7db617947b","Type":"ContainerStarted","Data":"78f6dcf826849745c4a984663c28ebb2c78579c92ad3d99446ad219fee045f86"} Mar 20 15:30:24 crc kubenswrapper[4779]: I0320 15:30:24.931999 4779 generic.go:334] "Generic (PLEG): container finished" podID="11d8d242-c472-4367-9462-1d55253e19b0" containerID="ac5deecc000e63e50d7b94650e9a915b1888a069fd4b06092c49c99ffc659c65" exitCode=0 Mar 20 15:30:24 crc kubenswrapper[4779]: I0320 15:30:24.932094 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m8pg5" event={"ID":"11d8d242-c472-4367-9462-1d55253e19b0","Type":"ContainerDied","Data":"ac5deecc000e63e50d7b94650e9a915b1888a069fd4b06092c49c99ffc659c65"} Mar 20 15:30:24 crc kubenswrapper[4779]: I0320 15:30:24.932221 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m8pg5" event={"ID":"11d8d242-c472-4367-9462-1d55253e19b0","Type":"ContainerStarted","Data":"20f0a4108180e636cea691c9eea8d2d87499c4737d5cfb489ae532eea3d1be6a"} Mar 20 15:30:24 crc kubenswrapper[4779]: I0320 15:30:24.934574 4779 generic.go:334] "Generic (PLEG): container finished" podID="41c24f7d-d859-4eb5-b775-5893df11cf84" containerID="319101740cbc14b24079538d7deae4cf7bcea46cdc06a902a7d550d29e3f08a9" exitCode=0 Mar 20 15:30:24 crc kubenswrapper[4779]: I0320 15:30:24.935368 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dmv6d" event={"ID":"41c24f7d-d859-4eb5-b775-5893df11cf84","Type":"ContainerDied","Data":"319101740cbc14b24079538d7deae4cf7bcea46cdc06a902a7d550d29e3f08a9"} Mar 20 15:30:24 crc kubenswrapper[4779]: I0320 15:30:24.935404 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-dmv6d" event={"ID":"41c24f7d-d859-4eb5-b775-5893df11cf84","Type":"ContainerStarted","Data":"27ad64a7498d911630dacabf703f5388d2a47e0079d94127604e63d2ae998417"} Mar 20 15:30:24 crc kubenswrapper[4779]: I0320 15:30:24.938725 4779 generic.go:334] "Generic (PLEG): container finished" podID="ecd35186-bfd4-47da-b12b-8073d45e78f7" containerID="4d6145f4c95bede6e48c56f994fa0aab5cbe6dd48ce90cbc8d90df9e328ac593" exitCode=0 Mar 20 15:30:24 crc kubenswrapper[4779]: I0320 15:30:24.938776 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cj6mh" event={"ID":"ecd35186-bfd4-47da-b12b-8073d45e78f7","Type":"ContainerDied","Data":"4d6145f4c95bede6e48c56f994fa0aab5cbe6dd48ce90cbc8d90df9e328ac593"} Mar 20 15:30:24 crc kubenswrapper[4779]: I0320 15:30:24.970540 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-df7rk" podStartSLOduration=1.25875398 podStartE2EDuration="3.970520481s" podCreationTimestamp="2026-03-20 15:30:21 +0000 UTC" firstStartedPulling="2026-03-20 15:30:21.904583257 +0000 UTC m=+438.867099057" lastFinishedPulling="2026-03-20 15:30:24.616349758 +0000 UTC m=+441.578865558" observedRunningTime="2026-03-20 15:30:24.94839991 +0000 UTC m=+441.910915710" watchObservedRunningTime="2026-03-20 15:30:24.970520481 +0000 UTC m=+441.933036281" Mar 20 15:30:25 crc kubenswrapper[4779]: I0320 15:30:25.149890 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:30:25 crc kubenswrapper[4779]: I0320 15:30:25.149951 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:30:25 crc kubenswrapper[4779]: I0320 15:30:25.814009 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" podUID="424710af-f3c1-4cd2-9072-ae3cd248895d" containerName="registry" containerID="cri-o://dafe4701414127f20cd6d3454dbfafc466f734dff061dc8770414601ad5d2380" gracePeriod=30 Mar 20 15:30:25 crc kubenswrapper[4779]: I0320 15:30:25.945373 4779 generic.go:334] "Generic (PLEG): container finished" podID="11d8d242-c472-4367-9462-1d55253e19b0" containerID="2248ff1e79071fbfae94752b69fd876eb269e7bf6f18020a7ce1dfc2ce2587c2" exitCode=0 Mar 20 15:30:25 crc kubenswrapper[4779]: I0320 15:30:25.945615 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m8pg5" event={"ID":"11d8d242-c472-4367-9462-1d55253e19b0","Type":"ContainerDied","Data":"2248ff1e79071fbfae94752b69fd876eb269e7bf6f18020a7ce1dfc2ce2587c2"} Mar 20 15:30:25 crc kubenswrapper[4779]: I0320 15:30:25.948991 4779 generic.go:334] "Generic (PLEG): container finished" podID="424710af-f3c1-4cd2-9072-ae3cd248895d" containerID="dafe4701414127f20cd6d3454dbfafc466f734dff061dc8770414601ad5d2380" exitCode=0 Mar 20 15:30:25 crc kubenswrapper[4779]: I0320 15:30:25.949048 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" event={"ID":"424710af-f3c1-4cd2-9072-ae3cd248895d","Type":"ContainerDied","Data":"dafe4701414127f20cd6d3454dbfafc466f734dff061dc8770414601ad5d2380"} Mar 20 15:30:25 crc kubenswrapper[4779]: I0320 15:30:25.969638 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cj6mh" 
event={"ID":"ecd35186-bfd4-47da-b12b-8073d45e78f7","Type":"ContainerStarted","Data":"d1b0751020932ca6d8458bd824785d63d880951cd010f84475e4263b26511cd3"} Mar 20 15:30:25 crc kubenswrapper[4779]: I0320 15:30:25.992373 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cj6mh" podStartSLOduration=2.513700951 podStartE2EDuration="4.992354429s" podCreationTimestamp="2026-03-20 15:30:21 +0000 UTC" firstStartedPulling="2026-03-20 15:30:22.908402298 +0000 UTC m=+439.870918098" lastFinishedPulling="2026-03-20 15:30:25.387055776 +0000 UTC m=+442.349571576" observedRunningTime="2026-03-20 15:30:25.989358444 +0000 UTC m=+442.951874244" watchObservedRunningTime="2026-03-20 15:30:25.992354429 +0000 UTC m=+442.954870229" Mar 20 15:30:26 crc kubenswrapper[4779]: I0320 15:30:26.215714 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:30:26 crc kubenswrapper[4779]: I0320 15:30:26.286605 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"424710af-f3c1-4cd2-9072-ae3cd248895d\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " Mar 20 15:30:26 crc kubenswrapper[4779]: I0320 15:30:26.286673 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/424710af-f3c1-4cd2-9072-ae3cd248895d-ca-trust-extracted\") pod \"424710af-f3c1-4cd2-9072-ae3cd248895d\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " Mar 20 15:30:26 crc kubenswrapper[4779]: I0320 15:30:26.286695 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/424710af-f3c1-4cd2-9072-ae3cd248895d-bound-sa-token\") pod 
\"424710af-f3c1-4cd2-9072-ae3cd248895d\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " Mar 20 15:30:26 crc kubenswrapper[4779]: I0320 15:30:26.286730 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/424710af-f3c1-4cd2-9072-ae3cd248895d-registry-certificates\") pod \"424710af-f3c1-4cd2-9072-ae3cd248895d\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " Mar 20 15:30:26 crc kubenswrapper[4779]: I0320 15:30:26.286763 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/424710af-f3c1-4cd2-9072-ae3cd248895d-registry-tls\") pod \"424710af-f3c1-4cd2-9072-ae3cd248895d\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " Mar 20 15:30:26 crc kubenswrapper[4779]: I0320 15:30:26.286810 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/424710af-f3c1-4cd2-9072-ae3cd248895d-trusted-ca\") pod \"424710af-f3c1-4cd2-9072-ae3cd248895d\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " Mar 20 15:30:26 crc kubenswrapper[4779]: I0320 15:30:26.286853 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/424710af-f3c1-4cd2-9072-ae3cd248895d-installation-pull-secrets\") pod \"424710af-f3c1-4cd2-9072-ae3cd248895d\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " Mar 20 15:30:26 crc kubenswrapper[4779]: I0320 15:30:26.286894 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cn8bx\" (UniqueName: \"kubernetes.io/projected/424710af-f3c1-4cd2-9072-ae3cd248895d-kube-api-access-cn8bx\") pod \"424710af-f3c1-4cd2-9072-ae3cd248895d\" (UID: \"424710af-f3c1-4cd2-9072-ae3cd248895d\") " Mar 20 15:30:26 crc kubenswrapper[4779]: I0320 15:30:26.287567 4779 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/424710af-f3c1-4cd2-9072-ae3cd248895d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "424710af-f3c1-4cd2-9072-ae3cd248895d" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:30:26 crc kubenswrapper[4779]: I0320 15:30:26.288014 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/424710af-f3c1-4cd2-9072-ae3cd248895d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "424710af-f3c1-4cd2-9072-ae3cd248895d" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:30:26 crc kubenswrapper[4779]: I0320 15:30:26.303960 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/424710af-f3c1-4cd2-9072-ae3cd248895d-kube-api-access-cn8bx" (OuterVolumeSpecName: "kube-api-access-cn8bx") pod "424710af-f3c1-4cd2-9072-ae3cd248895d" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d"). InnerVolumeSpecName "kube-api-access-cn8bx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:30:26 crc kubenswrapper[4779]: I0320 15:30:26.303979 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "424710af-f3c1-4cd2-9072-ae3cd248895d" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 15:30:26 crc kubenswrapper[4779]: I0320 15:30:26.304183 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/424710af-f3c1-4cd2-9072-ae3cd248895d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "424710af-f3c1-4cd2-9072-ae3cd248895d" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:30:26 crc kubenswrapper[4779]: I0320 15:30:26.304248 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/424710af-f3c1-4cd2-9072-ae3cd248895d-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "424710af-f3c1-4cd2-9072-ae3cd248895d" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:30:26 crc kubenswrapper[4779]: I0320 15:30:26.304379 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/424710af-f3c1-4cd2-9072-ae3cd248895d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "424710af-f3c1-4cd2-9072-ae3cd248895d" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:30:26 crc kubenswrapper[4779]: I0320 15:30:26.312322 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/424710af-f3c1-4cd2-9072-ae3cd248895d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "424710af-f3c1-4cd2-9072-ae3cd248895d" (UID: "424710af-f3c1-4cd2-9072-ae3cd248895d"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:30:26 crc kubenswrapper[4779]: I0320 15:30:26.387985 4779 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/424710af-f3c1-4cd2-9072-ae3cd248895d-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 15:30:26 crc kubenswrapper[4779]: I0320 15:30:26.388028 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cn8bx\" (UniqueName: \"kubernetes.io/projected/424710af-f3c1-4cd2-9072-ae3cd248895d-kube-api-access-cn8bx\") on node \"crc\" DevicePath \"\"" Mar 20 15:30:26 crc kubenswrapper[4779]: I0320 15:30:26.388041 4779 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/424710af-f3c1-4cd2-9072-ae3cd248895d-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 15:30:26 crc kubenswrapper[4779]: I0320 15:30:26.388051 4779 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/424710af-f3c1-4cd2-9072-ae3cd248895d-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 15:30:26 crc kubenswrapper[4779]: I0320 15:30:26.388062 4779 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/424710af-f3c1-4cd2-9072-ae3cd248895d-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 15:30:26 crc kubenswrapper[4779]: I0320 15:30:26.388074 4779 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/424710af-f3c1-4cd2-9072-ae3cd248895d-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 15:30:26 crc kubenswrapper[4779]: I0320 15:30:26.388084 4779 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/424710af-f3c1-4cd2-9072-ae3cd248895d-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 15:30:26 crc 
kubenswrapper[4779]: I0320 15:30:26.974490 4779 generic.go:334] "Generic (PLEG): container finished" podID="41c24f7d-d859-4eb5-b775-5893df11cf84" containerID="2a50af375b0d152a956a7fa205d02acfeb7760facaad881891701005dfe58c94" exitCode=0 Mar 20 15:30:26 crc kubenswrapper[4779]: I0320 15:30:26.974546 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dmv6d" event={"ID":"41c24f7d-d859-4eb5-b775-5893df11cf84","Type":"ContainerDied","Data":"2a50af375b0d152a956a7fa205d02acfeb7760facaad881891701005dfe58c94"} Mar 20 15:30:26 crc kubenswrapper[4779]: I0320 15:30:26.976411 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" event={"ID":"424710af-f3c1-4cd2-9072-ae3cd248895d","Type":"ContainerDied","Data":"46f510fc45ec9fc2f4ca355d768a7029239ec5d4b898f93ac21562deb186ca33"} Mar 20 15:30:26 crc kubenswrapper[4779]: I0320 15:30:26.976445 4779 scope.go:117] "RemoveContainer" containerID="dafe4701414127f20cd6d3454dbfafc466f734dff061dc8770414601ad5d2380" Mar 20 15:30:26 crc kubenswrapper[4779]: I0320 15:30:26.976414 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tx4fd" Mar 20 15:30:27 crc kubenswrapper[4779]: I0320 15:30:27.016201 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tx4fd"] Mar 20 15:30:27 crc kubenswrapper[4779]: I0320 15:30:27.027849 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tx4fd"] Mar 20 15:30:27 crc kubenswrapper[4779]: I0320 15:30:27.820592 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="424710af-f3c1-4cd2-9072-ae3cd248895d" path="/var/lib/kubelet/pods/424710af-f3c1-4cd2-9072-ae3cd248895d/volumes" Mar 20 15:30:27 crc kubenswrapper[4779]: I0320 15:30:27.984677 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m8pg5" event={"ID":"11d8d242-c472-4367-9462-1d55253e19b0","Type":"ContainerStarted","Data":"a5d571ef8db35399b58354c94af47971d60f9eb891abf0b626d8c3efc6947d89"} Mar 20 15:30:28 crc kubenswrapper[4779]: I0320 15:30:28.009429 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m8pg5" podStartSLOduration=2.948286742 podStartE2EDuration="5.00939644s" podCreationTimestamp="2026-03-20 15:30:23 +0000 UTC" firstStartedPulling="2026-03-20 15:30:24.933305098 +0000 UTC m=+441.895820898" lastFinishedPulling="2026-03-20 15:30:26.994414796 +0000 UTC m=+443.956930596" observedRunningTime="2026-03-20 15:30:28.005234325 +0000 UTC m=+444.967750155" watchObservedRunningTime="2026-03-20 15:30:28.00939644 +0000 UTC m=+444.971912240" Mar 20 15:30:28 crc kubenswrapper[4779]: I0320 15:30:28.992417 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dmv6d" event={"ID":"41c24f7d-d859-4eb5-b775-5893df11cf84","Type":"ContainerStarted","Data":"e7ee58f5c3526dd98915636c9d13a76039c0fc23e015b4d1a5437e6b8ca6d645"} Mar 20 15:30:29 crc 
kubenswrapper[4779]: I0320 15:30:29.010716 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dmv6d" podStartSLOduration=2.182720964 podStartE2EDuration="6.010697247s" podCreationTimestamp="2026-03-20 15:30:23 +0000 UTC" firstStartedPulling="2026-03-20 15:30:24.937038772 +0000 UTC m=+441.899554562" lastFinishedPulling="2026-03-20 15:30:28.765015045 +0000 UTC m=+445.727530845" observedRunningTime="2026-03-20 15:30:29.008651325 +0000 UTC m=+445.971167125" watchObservedRunningTime="2026-03-20 15:30:29.010697247 +0000 UTC m=+445.973213047" Mar 20 15:30:31 crc kubenswrapper[4779]: I0320 15:30:31.519360 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-df7rk" Mar 20 15:30:31 crc kubenswrapper[4779]: I0320 15:30:31.519699 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-df7rk" Mar 20 15:30:31 crc kubenswrapper[4779]: I0320 15:30:31.557296 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-df7rk" Mar 20 15:30:31 crc kubenswrapper[4779]: I0320 15:30:31.721164 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cj6mh" Mar 20 15:30:31 crc kubenswrapper[4779]: I0320 15:30:31.721227 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cj6mh" Mar 20 15:30:31 crc kubenswrapper[4779]: I0320 15:30:31.759840 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cj6mh" Mar 20 15:30:32 crc kubenswrapper[4779]: I0320 15:30:32.044630 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-df7rk" Mar 20 15:30:32 crc kubenswrapper[4779]: I0320 
15:30:32.050992 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cj6mh" Mar 20 15:30:33 crc kubenswrapper[4779]: I0320 15:30:33.928663 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m8pg5" Mar 20 15:30:33 crc kubenswrapper[4779]: I0320 15:30:33.928723 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m8pg5" Mar 20 15:30:33 crc kubenswrapper[4779]: I0320 15:30:33.969524 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m8pg5" Mar 20 15:30:34 crc kubenswrapper[4779]: I0320 15:30:34.062685 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m8pg5" Mar 20 15:30:34 crc kubenswrapper[4779]: I0320 15:30:34.130516 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dmv6d" Mar 20 15:30:34 crc kubenswrapper[4779]: I0320 15:30:34.130567 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dmv6d" Mar 20 15:30:35 crc kubenswrapper[4779]: I0320 15:30:35.181657 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dmv6d" podUID="41c24f7d-d859-4eb5-b775-5893df11cf84" containerName="registry-server" probeResult="failure" output=< Mar 20 15:30:35 crc kubenswrapper[4779]: timeout: failed to connect service ":50051" within 1s Mar 20 15:30:35 crc kubenswrapper[4779]: > Mar 20 15:30:44 crc kubenswrapper[4779]: I0320 15:30:44.167616 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dmv6d" Mar 20 15:30:44 crc kubenswrapper[4779]: I0320 15:30:44.207792 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-operators-dmv6d" Mar 20 15:30:55 crc kubenswrapper[4779]: I0320 15:30:55.149816 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:30:55 crc kubenswrapper[4779]: I0320 15:30:55.150435 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:31:25 crc kubenswrapper[4779]: I0320 15:31:25.151198 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:31:25 crc kubenswrapper[4779]: I0320 15:31:25.152401 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:31:25 crc kubenswrapper[4779]: I0320 15:31:25.152495 4779 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" Mar 20 15:31:25 crc kubenswrapper[4779]: I0320 15:31:25.153829 4779 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"36c8a2a60794cd5ada94d0dde0b6b6073b2ecf6d648323c7aec9dfc0a0b52837"} pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 15:31:25 crc kubenswrapper[4779]: I0320 15:31:25.153978 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" containerID="cri-o://36c8a2a60794cd5ada94d0dde0b6b6073b2ecf6d648323c7aec9dfc0a0b52837" gracePeriod=600 Mar 20 15:31:25 crc kubenswrapper[4779]: I0320 15:31:25.282183 4779 generic.go:334] "Generic (PLEG): container finished" podID="451fc579-db57-4b36-a775-6d2986de3efc" containerID="36c8a2a60794cd5ada94d0dde0b6b6073b2ecf6d648323c7aec9dfc0a0b52837" exitCode=0 Mar 20 15:31:25 crc kubenswrapper[4779]: I0320 15:31:25.282227 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" event={"ID":"451fc579-db57-4b36-a775-6d2986de3efc","Type":"ContainerDied","Data":"36c8a2a60794cd5ada94d0dde0b6b6073b2ecf6d648323c7aec9dfc0a0b52837"} Mar 20 15:31:25 crc kubenswrapper[4779]: I0320 15:31:25.282260 4779 scope.go:117] "RemoveContainer" containerID="498dce321546a5d876986ec92f47d53655ef83f600c0498878bbe473c526f01f" Mar 20 15:31:26 crc kubenswrapper[4779]: I0320 15:31:26.288947 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" event={"ID":"451fc579-db57-4b36-a775-6d2986de3efc","Type":"ContainerStarted","Data":"4ac21e339673c7860531784a8ee71a61f14bdd346b0fb848a3ba1fb383a92288"} Mar 20 15:32:00 crc kubenswrapper[4779]: I0320 15:32:00.134777 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567012-hh498"] Mar 20 15:32:00 crc kubenswrapper[4779]: E0320 
15:32:00.135499 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="424710af-f3c1-4cd2-9072-ae3cd248895d" containerName="registry" Mar 20 15:32:00 crc kubenswrapper[4779]: I0320 15:32:00.135515 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="424710af-f3c1-4cd2-9072-ae3cd248895d" containerName="registry" Mar 20 15:32:00 crc kubenswrapper[4779]: I0320 15:32:00.135641 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="424710af-f3c1-4cd2-9072-ae3cd248895d" containerName="registry" Mar 20 15:32:00 crc kubenswrapper[4779]: I0320 15:32:00.136058 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567012-hh498" Mar 20 15:32:00 crc kubenswrapper[4779]: I0320 15:32:00.141836 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:32:00 crc kubenswrapper[4779]: I0320 15:32:00.141917 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:32:00 crc kubenswrapper[4779]: I0320 15:32:00.141935 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-92t9d" Mar 20 15:32:00 crc kubenswrapper[4779]: I0320 15:32:00.143210 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567012-hh498"] Mar 20 15:32:00 crc kubenswrapper[4779]: I0320 15:32:00.273206 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rlcr\" (UniqueName: \"kubernetes.io/projected/6fb42fe5-65a0-4287-a631-c15e961ce57c-kube-api-access-7rlcr\") pod \"auto-csr-approver-29567012-hh498\" (UID: \"6fb42fe5-65a0-4287-a631-c15e961ce57c\") " pod="openshift-infra/auto-csr-approver-29567012-hh498" Mar 20 15:32:00 crc kubenswrapper[4779]: I0320 15:32:00.374946 4779 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-7rlcr\" (UniqueName: \"kubernetes.io/projected/6fb42fe5-65a0-4287-a631-c15e961ce57c-kube-api-access-7rlcr\") pod \"auto-csr-approver-29567012-hh498\" (UID: \"6fb42fe5-65a0-4287-a631-c15e961ce57c\") " pod="openshift-infra/auto-csr-approver-29567012-hh498" Mar 20 15:32:00 crc kubenswrapper[4779]: I0320 15:32:00.393077 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rlcr\" (UniqueName: \"kubernetes.io/projected/6fb42fe5-65a0-4287-a631-c15e961ce57c-kube-api-access-7rlcr\") pod \"auto-csr-approver-29567012-hh498\" (UID: \"6fb42fe5-65a0-4287-a631-c15e961ce57c\") " pod="openshift-infra/auto-csr-approver-29567012-hh498" Mar 20 15:32:00 crc kubenswrapper[4779]: I0320 15:32:00.451598 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567012-hh498" Mar 20 15:32:00 crc kubenswrapper[4779]: I0320 15:32:00.832703 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567012-hh498"] Mar 20 15:32:00 crc kubenswrapper[4779]: I0320 15:32:00.845969 4779 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 15:32:01 crc kubenswrapper[4779]: I0320 15:32:01.456064 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567012-hh498" event={"ID":"6fb42fe5-65a0-4287-a631-c15e961ce57c","Type":"ContainerStarted","Data":"9ac4f84cd98c4c9f5e5a9c5698ec1308c711f9b989eeb7ca3bd04784811c1e38"} Mar 20 15:32:02 crc kubenswrapper[4779]: I0320 15:32:02.461928 4779 generic.go:334] "Generic (PLEG): container finished" podID="6fb42fe5-65a0-4287-a631-c15e961ce57c" containerID="668be50bc3945e74c969f7c7f93df91c9773291d395ea440647525a2bbc791b2" exitCode=0 Mar 20 15:32:02 crc kubenswrapper[4779]: I0320 15:32:02.462006 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567012-hh498" 
event={"ID":"6fb42fe5-65a0-4287-a631-c15e961ce57c","Type":"ContainerDied","Data":"668be50bc3945e74c969f7c7f93df91c9773291d395ea440647525a2bbc791b2"} Mar 20 15:32:03 crc kubenswrapper[4779]: I0320 15:32:03.654856 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567012-hh498" Mar 20 15:32:03 crc kubenswrapper[4779]: I0320 15:32:03.816646 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rlcr\" (UniqueName: \"kubernetes.io/projected/6fb42fe5-65a0-4287-a631-c15e961ce57c-kube-api-access-7rlcr\") pod \"6fb42fe5-65a0-4287-a631-c15e961ce57c\" (UID: \"6fb42fe5-65a0-4287-a631-c15e961ce57c\") " Mar 20 15:32:03 crc kubenswrapper[4779]: I0320 15:32:03.823378 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fb42fe5-65a0-4287-a631-c15e961ce57c-kube-api-access-7rlcr" (OuterVolumeSpecName: "kube-api-access-7rlcr") pod "6fb42fe5-65a0-4287-a631-c15e961ce57c" (UID: "6fb42fe5-65a0-4287-a631-c15e961ce57c"). InnerVolumeSpecName "kube-api-access-7rlcr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:32:03 crc kubenswrapper[4779]: I0320 15:32:03.917772 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rlcr\" (UniqueName: \"kubernetes.io/projected/6fb42fe5-65a0-4287-a631-c15e961ce57c-kube-api-access-7rlcr\") on node \"crc\" DevicePath \"\"" Mar 20 15:32:04 crc kubenswrapper[4779]: I0320 15:32:04.482634 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567012-hh498" event={"ID":"6fb42fe5-65a0-4287-a631-c15e961ce57c","Type":"ContainerDied","Data":"9ac4f84cd98c4c9f5e5a9c5698ec1308c711f9b989eeb7ca3bd04784811c1e38"} Mar 20 15:32:04 crc kubenswrapper[4779]: I0320 15:32:04.482943 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ac4f84cd98c4c9f5e5a9c5698ec1308c711f9b989eeb7ca3bd04784811c1e38" Mar 20 15:32:04 crc kubenswrapper[4779]: I0320 15:32:04.482703 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567012-hh498" Mar 20 15:32:04 crc kubenswrapper[4779]: I0320 15:32:04.705803 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567006-jlkgl"] Mar 20 15:32:04 crc kubenswrapper[4779]: I0320 15:32:04.708810 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567006-jlkgl"] Mar 20 15:32:05 crc kubenswrapper[4779]: I0320 15:32:05.814877 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3cf60a6-49cd-43e9-982b-4673db42fde6" path="/var/lib/kubelet/pods/c3cf60a6-49cd-43e9-982b-4673db42fde6/volumes" Mar 20 15:33:18 crc kubenswrapper[4779]: I0320 15:33:18.873145 4779 scope.go:117] "RemoveContainer" containerID="c227e950d444ebb9d9effe23ef9b32bb9dd082e5eeff994497f6ec54ff852ffd" Mar 20 15:33:25 crc kubenswrapper[4779]: I0320 15:33:25.150397 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:33:25 crc kubenswrapper[4779]: I0320 15:33:25.150866 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:33:55 crc kubenswrapper[4779]: I0320 15:33:55.149948 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:33:55 crc kubenswrapper[4779]: I0320 15:33:55.150531 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:34:00 crc kubenswrapper[4779]: I0320 15:34:00.129353 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567014-8w9sr"] Mar 20 15:34:00 crc kubenswrapper[4779]: E0320 15:34:00.129894 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fb42fe5-65a0-4287-a631-c15e961ce57c" containerName="oc" Mar 20 15:34:00 crc kubenswrapper[4779]: I0320 15:34:00.129915 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fb42fe5-65a0-4287-a631-c15e961ce57c" containerName="oc" Mar 20 15:34:00 crc kubenswrapper[4779]: I0320 15:34:00.130011 4779 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="6fb42fe5-65a0-4287-a631-c15e961ce57c" containerName="oc" Mar 20 15:34:00 crc kubenswrapper[4779]: I0320 15:34:00.130428 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567014-8w9sr" Mar 20 15:34:00 crc kubenswrapper[4779]: I0320 15:34:00.133870 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:34:00 crc kubenswrapper[4779]: I0320 15:34:00.134039 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-92t9d" Mar 20 15:34:00 crc kubenswrapper[4779]: I0320 15:34:00.138081 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:34:00 crc kubenswrapper[4779]: I0320 15:34:00.140653 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567014-8w9sr"] Mar 20 15:34:00 crc kubenswrapper[4779]: I0320 15:34:00.186666 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4zlh\" (UniqueName: \"kubernetes.io/projected/96fd8882-ac45-4e57-b6fb-c683861c992d-kube-api-access-j4zlh\") pod \"auto-csr-approver-29567014-8w9sr\" (UID: \"96fd8882-ac45-4e57-b6fb-c683861c992d\") " pod="openshift-infra/auto-csr-approver-29567014-8w9sr" Mar 20 15:34:00 crc kubenswrapper[4779]: I0320 15:34:00.287851 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4zlh\" (UniqueName: \"kubernetes.io/projected/96fd8882-ac45-4e57-b6fb-c683861c992d-kube-api-access-j4zlh\") pod \"auto-csr-approver-29567014-8w9sr\" (UID: \"96fd8882-ac45-4e57-b6fb-c683861c992d\") " pod="openshift-infra/auto-csr-approver-29567014-8w9sr" Mar 20 15:34:00 crc kubenswrapper[4779]: I0320 15:34:00.306639 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-j4zlh\" (UniqueName: \"kubernetes.io/projected/96fd8882-ac45-4e57-b6fb-c683861c992d-kube-api-access-j4zlh\") pod \"auto-csr-approver-29567014-8w9sr\" (UID: \"96fd8882-ac45-4e57-b6fb-c683861c992d\") " pod="openshift-infra/auto-csr-approver-29567014-8w9sr" Mar 20 15:34:00 crc kubenswrapper[4779]: I0320 15:34:00.448219 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567014-8w9sr" Mar 20 15:34:00 crc kubenswrapper[4779]: I0320 15:34:00.612207 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567014-8w9sr"] Mar 20 15:34:00 crc kubenswrapper[4779]: W0320 15:34:00.620024 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96fd8882_ac45_4e57_b6fb_c683861c992d.slice/crio-540c85dae28e092d109d885a9272fdd06bdc815bbdf98955b29f99ed7e63d4b5 WatchSource:0}: Error finding container 540c85dae28e092d109d885a9272fdd06bdc815bbdf98955b29f99ed7e63d4b5: Status 404 returned error can't find the container with id 540c85dae28e092d109d885a9272fdd06bdc815bbdf98955b29f99ed7e63d4b5 Mar 20 15:34:01 crc kubenswrapper[4779]: I0320 15:34:01.065133 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567014-8w9sr" event={"ID":"96fd8882-ac45-4e57-b6fb-c683861c992d","Type":"ContainerStarted","Data":"540c85dae28e092d109d885a9272fdd06bdc815bbdf98955b29f99ed7e63d4b5"} Mar 20 15:34:03 crc kubenswrapper[4779]: I0320 15:34:03.076836 4779 generic.go:334] "Generic (PLEG): container finished" podID="96fd8882-ac45-4e57-b6fb-c683861c992d" containerID="0ffb20c433de882c373cf689e4f93f0b2efad7f41df5a0d180156d892c1db3be" exitCode=0 Mar 20 15:34:03 crc kubenswrapper[4779]: I0320 15:34:03.077139 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567014-8w9sr" 
event={"ID":"96fd8882-ac45-4e57-b6fb-c683861c992d","Type":"ContainerDied","Data":"0ffb20c433de882c373cf689e4f93f0b2efad7f41df5a0d180156d892c1db3be"} Mar 20 15:34:04 crc kubenswrapper[4779]: I0320 15:34:04.314326 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567014-8w9sr" Mar 20 15:34:04 crc kubenswrapper[4779]: I0320 15:34:04.436905 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4zlh\" (UniqueName: \"kubernetes.io/projected/96fd8882-ac45-4e57-b6fb-c683861c992d-kube-api-access-j4zlh\") pod \"96fd8882-ac45-4e57-b6fb-c683861c992d\" (UID: \"96fd8882-ac45-4e57-b6fb-c683861c992d\") " Mar 20 15:34:04 crc kubenswrapper[4779]: I0320 15:34:04.441927 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96fd8882-ac45-4e57-b6fb-c683861c992d-kube-api-access-j4zlh" (OuterVolumeSpecName: "kube-api-access-j4zlh") pod "96fd8882-ac45-4e57-b6fb-c683861c992d" (UID: "96fd8882-ac45-4e57-b6fb-c683861c992d"). InnerVolumeSpecName "kube-api-access-j4zlh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:34:04 crc kubenswrapper[4779]: I0320 15:34:04.538202 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4zlh\" (UniqueName: \"kubernetes.io/projected/96fd8882-ac45-4e57-b6fb-c683861c992d-kube-api-access-j4zlh\") on node \"crc\" DevicePath \"\"" Mar 20 15:34:05 crc kubenswrapper[4779]: I0320 15:34:05.087380 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567014-8w9sr" event={"ID":"96fd8882-ac45-4e57-b6fb-c683861c992d","Type":"ContainerDied","Data":"540c85dae28e092d109d885a9272fdd06bdc815bbdf98955b29f99ed7e63d4b5"} Mar 20 15:34:05 crc kubenswrapper[4779]: I0320 15:34:05.087637 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="540c85dae28e092d109d885a9272fdd06bdc815bbdf98955b29f99ed7e63d4b5" Mar 20 15:34:05 crc kubenswrapper[4779]: I0320 15:34:05.087688 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567014-8w9sr" Mar 20 15:34:05 crc kubenswrapper[4779]: I0320 15:34:05.362338 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567008-wtptq"] Mar 20 15:34:05 crc kubenswrapper[4779]: I0320 15:34:05.367617 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567008-wtptq"] Mar 20 15:34:05 crc kubenswrapper[4779]: I0320 15:34:05.816581 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d55c02de-f3fa-44bf-94c7-8879acc79040" path="/var/lib/kubelet/pods/d55c02de-f3fa-44bf-94c7-8879acc79040/volumes" Mar 20 15:34:18 crc kubenswrapper[4779]: I0320 15:34:18.916570 4779 scope.go:117] "RemoveContainer" containerID="195e0938b182a51a2c488b5b59059335c49c7135aeef7732aa7b01c4742f7f5e" Mar 20 15:34:25 crc kubenswrapper[4779]: I0320 15:34:25.150357 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:34:25 crc kubenswrapper[4779]: I0320 15:34:25.151669 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:34:25 crc kubenswrapper[4779]: I0320 15:34:25.151765 4779 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" Mar 20 15:34:25 crc kubenswrapper[4779]: I0320 15:34:25.152274 4779 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4ac21e339673c7860531784a8ee71a61f14bdd346b0fb848a3ba1fb383a92288"} pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 15:34:25 crc kubenswrapper[4779]: I0320 15:34:25.152403 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" containerID="cri-o://4ac21e339673c7860531784a8ee71a61f14bdd346b0fb848a3ba1fb383a92288" gracePeriod=600 Mar 20 15:34:26 crc kubenswrapper[4779]: I0320 15:34:26.193320 4779 generic.go:334] "Generic (PLEG): container finished" podID="451fc579-db57-4b36-a775-6d2986de3efc" containerID="4ac21e339673c7860531784a8ee71a61f14bdd346b0fb848a3ba1fb383a92288" exitCode=0 Mar 20 15:34:26 crc kubenswrapper[4779]: I0320 15:34:26.193381 4779 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" event={"ID":"451fc579-db57-4b36-a775-6d2986de3efc","Type":"ContainerDied","Data":"4ac21e339673c7860531784a8ee71a61f14bdd346b0fb848a3ba1fb383a92288"} Mar 20 15:34:26 crc kubenswrapper[4779]: I0320 15:34:26.193784 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" event={"ID":"451fc579-db57-4b36-a775-6d2986de3efc","Type":"ContainerStarted","Data":"6987ea5d5bc8dacf20b1f5a9f2e4c3b448070d7ac94c29c0b4e57a5ce5ecc6b5"} Mar 20 15:34:26 crc kubenswrapper[4779]: I0320 15:34:26.193806 4779 scope.go:117] "RemoveContainer" containerID="36c8a2a60794cd5ada94d0dde0b6b6073b2ecf6d648323c7aec9dfc0a0b52837" Mar 20 15:36:00 crc kubenswrapper[4779]: I0320 15:36:00.132762 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567016-9qvlz"] Mar 20 15:36:00 crc kubenswrapper[4779]: E0320 15:36:00.134920 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96fd8882-ac45-4e57-b6fb-c683861c992d" containerName="oc" Mar 20 15:36:00 crc kubenswrapper[4779]: I0320 15:36:00.135022 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="96fd8882-ac45-4e57-b6fb-c683861c992d" containerName="oc" Mar 20 15:36:00 crc kubenswrapper[4779]: I0320 15:36:00.135252 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="96fd8882-ac45-4e57-b6fb-c683861c992d" containerName="oc" Mar 20 15:36:00 crc kubenswrapper[4779]: I0320 15:36:00.135701 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567016-9qvlz" Mar 20 15:36:00 crc kubenswrapper[4779]: I0320 15:36:00.137667 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567016-9qvlz"] Mar 20 15:36:00 crc kubenswrapper[4779]: I0320 15:36:00.138690 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:36:00 crc kubenswrapper[4779]: I0320 15:36:00.138864 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:36:00 crc kubenswrapper[4779]: I0320 15:36:00.138764 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-92t9d" Mar 20 15:36:00 crc kubenswrapper[4779]: I0320 15:36:00.196042 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr5n2\" (UniqueName: \"kubernetes.io/projected/20e7966a-75f6-48c2-9003-6fea92b1edb1-kube-api-access-fr5n2\") pod \"auto-csr-approver-29567016-9qvlz\" (UID: \"20e7966a-75f6-48c2-9003-6fea92b1edb1\") " pod="openshift-infra/auto-csr-approver-29567016-9qvlz" Mar 20 15:36:00 crc kubenswrapper[4779]: I0320 15:36:00.297148 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr5n2\" (UniqueName: \"kubernetes.io/projected/20e7966a-75f6-48c2-9003-6fea92b1edb1-kube-api-access-fr5n2\") pod \"auto-csr-approver-29567016-9qvlz\" (UID: \"20e7966a-75f6-48c2-9003-6fea92b1edb1\") " pod="openshift-infra/auto-csr-approver-29567016-9qvlz" Mar 20 15:36:00 crc kubenswrapper[4779]: I0320 15:36:00.316819 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr5n2\" (UniqueName: \"kubernetes.io/projected/20e7966a-75f6-48c2-9003-6fea92b1edb1-kube-api-access-fr5n2\") pod \"auto-csr-approver-29567016-9qvlz\" (UID: \"20e7966a-75f6-48c2-9003-6fea92b1edb1\") " 
pod="openshift-infra/auto-csr-approver-29567016-9qvlz" Mar 20 15:36:00 crc kubenswrapper[4779]: I0320 15:36:00.455002 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567016-9qvlz" Mar 20 15:36:00 crc kubenswrapper[4779]: I0320 15:36:00.615054 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567016-9qvlz"] Mar 20 15:36:00 crc kubenswrapper[4779]: I0320 15:36:00.669660 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567016-9qvlz" event={"ID":"20e7966a-75f6-48c2-9003-6fea92b1edb1","Type":"ContainerStarted","Data":"592baae70f06ebca2605524e32d758528ca5ca9d1d3d48e6789cfc8f6fbe42ef"} Mar 20 15:36:02 crc kubenswrapper[4779]: I0320 15:36:02.683500 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567016-9qvlz" event={"ID":"20e7966a-75f6-48c2-9003-6fea92b1edb1","Type":"ContainerStarted","Data":"9c8ffde1bab2148cfd49f0273d0c2bf07338fc8c4f43ff0fa376ad594fadd4fd"} Mar 20 15:36:02 crc kubenswrapper[4779]: I0320 15:36:02.697051 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567016-9qvlz" podStartSLOduration=1.001895011 podStartE2EDuration="2.697032916s" podCreationTimestamp="2026-03-20 15:36:00 +0000 UTC" firstStartedPulling="2026-03-20 15:36:00.621323643 +0000 UTC m=+777.583839443" lastFinishedPulling="2026-03-20 15:36:02.316461548 +0000 UTC m=+779.278977348" observedRunningTime="2026-03-20 15:36:02.694450103 +0000 UTC m=+779.656965903" watchObservedRunningTime="2026-03-20 15:36:02.697032916 +0000 UTC m=+779.659548716" Mar 20 15:36:03 crc kubenswrapper[4779]: I0320 15:36:03.690508 4779 generic.go:334] "Generic (PLEG): container finished" podID="20e7966a-75f6-48c2-9003-6fea92b1edb1" containerID="9c8ffde1bab2148cfd49f0273d0c2bf07338fc8c4f43ff0fa376ad594fadd4fd" exitCode=0 Mar 20 15:36:03 crc 
kubenswrapper[4779]: I0320 15:36:03.690630 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567016-9qvlz" event={"ID":"20e7966a-75f6-48c2-9003-6fea92b1edb1","Type":"ContainerDied","Data":"9c8ffde1bab2148cfd49f0273d0c2bf07338fc8c4f43ff0fa376ad594fadd4fd"} Mar 20 15:36:04 crc kubenswrapper[4779]: I0320 15:36:04.905433 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567016-9qvlz" Mar 20 15:36:05 crc kubenswrapper[4779]: I0320 15:36:05.054429 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fr5n2\" (UniqueName: \"kubernetes.io/projected/20e7966a-75f6-48c2-9003-6fea92b1edb1-kube-api-access-fr5n2\") pod \"20e7966a-75f6-48c2-9003-6fea92b1edb1\" (UID: \"20e7966a-75f6-48c2-9003-6fea92b1edb1\") " Mar 20 15:36:05 crc kubenswrapper[4779]: I0320 15:36:05.059351 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20e7966a-75f6-48c2-9003-6fea92b1edb1-kube-api-access-fr5n2" (OuterVolumeSpecName: "kube-api-access-fr5n2") pod "20e7966a-75f6-48c2-9003-6fea92b1edb1" (UID: "20e7966a-75f6-48c2-9003-6fea92b1edb1"). InnerVolumeSpecName "kube-api-access-fr5n2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:36:05 crc kubenswrapper[4779]: I0320 15:36:05.156263 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fr5n2\" (UniqueName: \"kubernetes.io/projected/20e7966a-75f6-48c2-9003-6fea92b1edb1-kube-api-access-fr5n2\") on node \"crc\" DevicePath \"\"" Mar 20 15:36:05 crc kubenswrapper[4779]: I0320 15:36:05.703608 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567016-9qvlz" event={"ID":"20e7966a-75f6-48c2-9003-6fea92b1edb1","Type":"ContainerDied","Data":"592baae70f06ebca2605524e32d758528ca5ca9d1d3d48e6789cfc8f6fbe42ef"} Mar 20 15:36:05 crc kubenswrapper[4779]: I0320 15:36:05.703648 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="592baae70f06ebca2605524e32d758528ca5ca9d1d3d48e6789cfc8f6fbe42ef" Mar 20 15:36:05 crc kubenswrapper[4779]: I0320 15:36:05.703662 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567016-9qvlz" Mar 20 15:36:05 crc kubenswrapper[4779]: I0320 15:36:05.743508 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567010-hqddk"] Mar 20 15:36:05 crc kubenswrapper[4779]: I0320 15:36:05.746763 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567010-hqddk"] Mar 20 15:36:05 crc kubenswrapper[4779]: I0320 15:36:05.816616 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="414c34b6-2c97-4039-8863-a2b245fa0316" path="/var/lib/kubelet/pods/414c34b6-2c97-4039-8863-a2b245fa0316/volumes" Mar 20 15:36:18 crc kubenswrapper[4779]: I0320 15:36:18.974713 4779 scope.go:117] "RemoveContainer" containerID="2f6a65df967b5b24cc5fcb2de3253103cb68509b3f98adc4f0bfdad6e99d53f3" Mar 20 15:36:19 crc kubenswrapper[4779]: I0320 15:36:19.490939 4779 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["cert-manager/cert-manager-cainjector-cf98fcc89-n9m75"] Mar 20 15:36:19 crc kubenswrapper[4779]: E0320 15:36:19.491452 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20e7966a-75f6-48c2-9003-6fea92b1edb1" containerName="oc" Mar 20 15:36:19 crc kubenswrapper[4779]: I0320 15:36:19.491464 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="20e7966a-75f6-48c2-9003-6fea92b1edb1" containerName="oc" Mar 20 15:36:19 crc kubenswrapper[4779]: I0320 15:36:19.491554 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="20e7966a-75f6-48c2-9003-6fea92b1edb1" containerName="oc" Mar 20 15:36:19 crc kubenswrapper[4779]: I0320 15:36:19.491917 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-n9m75" Mar 20 15:36:19 crc kubenswrapper[4779]: I0320 15:36:19.493609 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 20 15:36:19 crc kubenswrapper[4779]: I0320 15:36:19.494221 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 20 15:36:19 crc kubenswrapper[4779]: I0320 15:36:19.494799 4779 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-rmk5g" Mar 20 15:36:19 crc kubenswrapper[4779]: I0320 15:36:19.504742 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-47n9f"] Mar 20 15:36:19 crc kubenswrapper[4779]: I0320 15:36:19.505516 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-47n9f" Mar 20 15:36:19 crc kubenswrapper[4779]: I0320 15:36:19.508145 4779 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-94knf" Mar 20 15:36:19 crc kubenswrapper[4779]: I0320 15:36:19.509239 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-n9m75"] Mar 20 15:36:19 crc kubenswrapper[4779]: I0320 15:36:19.520949 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-47n9f"] Mar 20 15:36:19 crc kubenswrapper[4779]: I0320 15:36:19.532344 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-lzpdk"] Mar 20 15:36:19 crc kubenswrapper[4779]: I0320 15:36:19.533315 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-lzpdk" Mar 20 15:36:19 crc kubenswrapper[4779]: I0320 15:36:19.535252 4779 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-qmhtc" Mar 20 15:36:19 crc kubenswrapper[4779]: I0320 15:36:19.544040 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-lzpdk"] Mar 20 15:36:19 crc kubenswrapper[4779]: I0320 15:36:19.630728 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqpzn\" (UniqueName: \"kubernetes.io/projected/1597fd09-4ed2-4ba6-95a8-25acd4b4bd4a-kube-api-access-zqpzn\") pod \"cert-manager-cainjector-cf98fcc89-n9m75\" (UID: \"1597fd09-4ed2-4ba6-95a8-25acd4b4bd4a\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-n9m75" Mar 20 15:36:19 crc kubenswrapper[4779]: I0320 15:36:19.631091 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfr25\" (UniqueName: 
\"kubernetes.io/projected/7df1c661-b257-49ff-8a2c-faa47c38a23e-kube-api-access-kfr25\") pod \"cert-manager-858654f9db-47n9f\" (UID: \"7df1c661-b257-49ff-8a2c-faa47c38a23e\") " pod="cert-manager/cert-manager-858654f9db-47n9f" Mar 20 15:36:19 crc kubenswrapper[4779]: I0320 15:36:19.731847 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqpzn\" (UniqueName: \"kubernetes.io/projected/1597fd09-4ed2-4ba6-95a8-25acd4b4bd4a-kube-api-access-zqpzn\") pod \"cert-manager-cainjector-cf98fcc89-n9m75\" (UID: \"1597fd09-4ed2-4ba6-95a8-25acd4b4bd4a\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-n9m75" Mar 20 15:36:19 crc kubenswrapper[4779]: I0320 15:36:19.731946 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnl4h\" (UniqueName: \"kubernetes.io/projected/c3ecb063-b3b2-4585-935f-dbe6fd4147ac-kube-api-access-cnl4h\") pod \"cert-manager-webhook-687f57d79b-lzpdk\" (UID: \"c3ecb063-b3b2-4585-935f-dbe6fd4147ac\") " pod="cert-manager/cert-manager-webhook-687f57d79b-lzpdk" Mar 20 15:36:19 crc kubenswrapper[4779]: I0320 15:36:19.731982 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfr25\" (UniqueName: \"kubernetes.io/projected/7df1c661-b257-49ff-8a2c-faa47c38a23e-kube-api-access-kfr25\") pod \"cert-manager-858654f9db-47n9f\" (UID: \"7df1c661-b257-49ff-8a2c-faa47c38a23e\") " pod="cert-manager/cert-manager-858654f9db-47n9f" Mar 20 15:36:19 crc kubenswrapper[4779]: I0320 15:36:19.750845 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqpzn\" (UniqueName: \"kubernetes.io/projected/1597fd09-4ed2-4ba6-95a8-25acd4b4bd4a-kube-api-access-zqpzn\") pod \"cert-manager-cainjector-cf98fcc89-n9m75\" (UID: \"1597fd09-4ed2-4ba6-95a8-25acd4b4bd4a\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-n9m75" Mar 20 15:36:19 crc kubenswrapper[4779]: I0320 15:36:19.752392 4779 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfr25\" (UniqueName: \"kubernetes.io/projected/7df1c661-b257-49ff-8a2c-faa47c38a23e-kube-api-access-kfr25\") pod \"cert-manager-858654f9db-47n9f\" (UID: \"7df1c661-b257-49ff-8a2c-faa47c38a23e\") " pod="cert-manager/cert-manager-858654f9db-47n9f" Mar 20 15:36:19 crc kubenswrapper[4779]: I0320 15:36:19.806225 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-n9m75" Mar 20 15:36:19 crc kubenswrapper[4779]: I0320 15:36:19.823619 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-47n9f" Mar 20 15:36:19 crc kubenswrapper[4779]: I0320 15:36:19.832705 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnl4h\" (UniqueName: \"kubernetes.io/projected/c3ecb063-b3b2-4585-935f-dbe6fd4147ac-kube-api-access-cnl4h\") pod \"cert-manager-webhook-687f57d79b-lzpdk\" (UID: \"c3ecb063-b3b2-4585-935f-dbe6fd4147ac\") " pod="cert-manager/cert-manager-webhook-687f57d79b-lzpdk" Mar 20 15:36:19 crc kubenswrapper[4779]: I0320 15:36:19.849713 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnl4h\" (UniqueName: \"kubernetes.io/projected/c3ecb063-b3b2-4585-935f-dbe6fd4147ac-kube-api-access-cnl4h\") pod \"cert-manager-webhook-687f57d79b-lzpdk\" (UID: \"c3ecb063-b3b2-4585-935f-dbe6fd4147ac\") " pod="cert-manager/cert-manager-webhook-687f57d79b-lzpdk" Mar 20 15:36:20 crc kubenswrapper[4779]: I0320 15:36:20.149868 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-lzpdk" Mar 20 15:36:20 crc kubenswrapper[4779]: I0320 15:36:20.200545 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-n9m75"] Mar 20 15:36:20 crc kubenswrapper[4779]: I0320 15:36:20.272883 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-47n9f"] Mar 20 15:36:20 crc kubenswrapper[4779]: I0320 15:36:20.328917 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-lzpdk"] Mar 20 15:36:20 crc kubenswrapper[4779]: I0320 15:36:20.775532 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-lzpdk" event={"ID":"c3ecb063-b3b2-4585-935f-dbe6fd4147ac","Type":"ContainerStarted","Data":"92cc2705117e314a5af052594fc8240a1526dbbf460fa208fba619f8869b8cf8"} Mar 20 15:36:20 crc kubenswrapper[4779]: I0320 15:36:20.776790 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-47n9f" event={"ID":"7df1c661-b257-49ff-8a2c-faa47c38a23e","Type":"ContainerStarted","Data":"722925d670312c1696173aa00b39f1e59f1f18c92cf03ce92e00bde614b1f7bf"} Mar 20 15:36:20 crc kubenswrapper[4779]: I0320 15:36:20.777556 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-n9m75" event={"ID":"1597fd09-4ed2-4ba6-95a8-25acd4b4bd4a","Type":"ContainerStarted","Data":"dfc929d0faa56968860e84a361a9a551981521252b0bcae42bbbe7bd47167d9a"} Mar 20 15:36:22 crc kubenswrapper[4779]: I0320 15:36:22.788309 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-n9m75" event={"ID":"1597fd09-4ed2-4ba6-95a8-25acd4b4bd4a","Type":"ContainerStarted","Data":"a8111bc13c4c4084ee9801738b8f5374842a33017a3bf5c790d17027c0fe5e50"} Mar 20 15:36:22 crc kubenswrapper[4779]: I0320 15:36:22.804370 4779 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-n9m75" podStartSLOduration=1.912591086 podStartE2EDuration="3.804353267s" podCreationTimestamp="2026-03-20 15:36:19 +0000 UTC" firstStartedPulling="2026-03-20 15:36:20.212750523 +0000 UTC m=+797.175266323" lastFinishedPulling="2026-03-20 15:36:22.104512704 +0000 UTC m=+799.067028504" observedRunningTime="2026-03-20 15:36:22.803320972 +0000 UTC m=+799.765836782" watchObservedRunningTime="2026-03-20 15:36:22.804353267 +0000 UTC m=+799.766869067" Mar 20 15:36:23 crc kubenswrapper[4779]: I0320 15:36:23.795809 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-lzpdk" event={"ID":"c3ecb063-b3b2-4585-935f-dbe6fd4147ac","Type":"ContainerStarted","Data":"d1114f75f98dba696334a89c0e69dcbc4787375f621befaa7039c8b06b4f95df"} Mar 20 15:36:23 crc kubenswrapper[4779]: I0320 15:36:23.796186 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-lzpdk" Mar 20 15:36:23 crc kubenswrapper[4779]: I0320 15:36:23.798123 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-47n9f" event={"ID":"7df1c661-b257-49ff-8a2c-faa47c38a23e","Type":"ContainerStarted","Data":"1707d237faffa5656b960473dbe4fb507a6c03540f8d0e79971197283415fbb2"} Mar 20 15:36:23 crc kubenswrapper[4779]: I0320 15:36:23.814168 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-lzpdk" podStartSLOduration=1.503872337 podStartE2EDuration="4.814154088s" podCreationTimestamp="2026-03-20 15:36:19 +0000 UTC" firstStartedPulling="2026-03-20 15:36:20.33700466 +0000 UTC m=+797.299520460" lastFinishedPulling="2026-03-20 15:36:23.647286411 +0000 UTC m=+800.609802211" observedRunningTime="2026-03-20 15:36:23.811887984 +0000 UTC m=+800.774403784" watchObservedRunningTime="2026-03-20 
15:36:23.814154088 +0000 UTC m=+800.776669888" Mar 20 15:36:23 crc kubenswrapper[4779]: I0320 15:36:23.834181 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-47n9f" podStartSLOduration=1.5175355289999999 podStartE2EDuration="4.834163013s" podCreationTimestamp="2026-03-20 15:36:19 +0000 UTC" firstStartedPulling="2026-03-20 15:36:20.294619365 +0000 UTC m=+797.257135155" lastFinishedPulling="2026-03-20 15:36:23.611246839 +0000 UTC m=+800.573762639" observedRunningTime="2026-03-20 15:36:23.826029036 +0000 UTC m=+800.788544836" watchObservedRunningTime="2026-03-20 15:36:23.834163013 +0000 UTC m=+800.796678803" Mar 20 15:36:25 crc kubenswrapper[4779]: I0320 15:36:25.149621 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:36:25 crc kubenswrapper[4779]: I0320 15:36:25.149679 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:36:29 crc kubenswrapper[4779]: I0320 15:36:29.706881 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bl2w2"] Mar 20 15:36:29 crc kubenswrapper[4779]: I0320 15:36:29.708932 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="ovn-controller" containerID="cri-o://aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681" gracePeriod=30 Mar 20 15:36:29 crc 
kubenswrapper[4779]: I0320 15:36:29.709218 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="ovn-acl-logging" containerID="cri-o://76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81" gracePeriod=30 Mar 20 15:36:29 crc kubenswrapper[4779]: I0320 15:36:29.709056 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="nbdb" containerID="cri-o://df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047" gracePeriod=30 Mar 20 15:36:29 crc kubenswrapper[4779]: I0320 15:36:29.709089 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="northd" containerID="cri-o://e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a" gracePeriod=30 Mar 20 15:36:29 crc kubenswrapper[4779]: I0320 15:36:29.709176 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="kube-rbac-proxy-node" containerID="cri-o://17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82" gracePeriod=30 Mar 20 15:36:29 crc kubenswrapper[4779]: I0320 15:36:29.708939 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="sbdb" containerID="cri-o://65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02" gracePeriod=30 Mar 20 15:36:29 crc kubenswrapper[4779]: I0320 15:36:29.709374 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" 
podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5" gracePeriod=30 Mar 20 15:36:29 crc kubenswrapper[4779]: I0320 15:36:29.931898 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="ovnkube-controller" containerID="cri-o://6c27b8f8180da0ad71249cbfa8966eefae97a7942f033b993572f2c66eaa6294" gracePeriod=30 Mar 20 15:36:29 crc kubenswrapper[4779]: I0320 15:36:29.935546 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2w2_f27b5011-2d73-40e1-b508-a10e9c6f19a8/ovnkube-controller/3.log" Mar 20 15:36:29 crc kubenswrapper[4779]: I0320 15:36:29.940600 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2w2_f27b5011-2d73-40e1-b508-a10e9c6f19a8/ovn-acl-logging/0.log" Mar 20 15:36:29 crc kubenswrapper[4779]: I0320 15:36:29.942209 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2w2_f27b5011-2d73-40e1-b508-a10e9c6f19a8/ovn-controller/0.log" Mar 20 15:36:29 crc kubenswrapper[4779]: I0320 15:36:29.948480 4779 generic.go:334] "Generic (PLEG): container finished" podID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerID="76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81" exitCode=143 Mar 20 15:36:29 crc kubenswrapper[4779]: I0320 15:36:29.948528 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" event={"ID":"f27b5011-2d73-40e1-b508-a10e9c6f19a8","Type":"ContainerDied","Data":"76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.152124 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="cert-manager/cert-manager-webhook-687f57d79b-lzpdk" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.183281 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2w2_f27b5011-2d73-40e1-b508-a10e9c6f19a8/ovnkube-controller/3.log" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.186181 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2w2_f27b5011-2d73-40e1-b508-a10e9c6f19a8/ovn-acl-logging/0.log" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.186635 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2w2_f27b5011-2d73-40e1-b508-a10e9c6f19a8/ovn-controller/0.log" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.187045 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.243184 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nxblr"] Mar 20 15:36:30 crc kubenswrapper[4779]: E0320 15:36:30.243399 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="ovnkube-controller" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.243413 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="ovnkube-controller" Mar 20 15:36:30 crc kubenswrapper[4779]: E0320 15:36:30.243424 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.243431 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 15:36:30 crc kubenswrapper[4779]: E0320 15:36:30.243442 
4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="ovn-acl-logging" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.243450 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="ovn-acl-logging" Mar 20 15:36:30 crc kubenswrapper[4779]: E0320 15:36:30.243464 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="ovnkube-controller" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.243471 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="ovnkube-controller" Mar 20 15:36:30 crc kubenswrapper[4779]: E0320 15:36:30.243479 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="northd" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.243486 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="northd" Mar 20 15:36:30 crc kubenswrapper[4779]: E0320 15:36:30.243500 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="sbdb" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.243507 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="sbdb" Mar 20 15:36:30 crc kubenswrapper[4779]: E0320 15:36:30.243517 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="nbdb" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.243523 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="nbdb" Mar 20 15:36:30 crc kubenswrapper[4779]: E0320 15:36:30.243530 4779 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="ovnkube-controller" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.243538 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="ovnkube-controller" Mar 20 15:36:30 crc kubenswrapper[4779]: E0320 15:36:30.243545 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="ovn-controller" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.243551 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="ovn-controller" Mar 20 15:36:30 crc kubenswrapper[4779]: E0320 15:36:30.243560 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="kube-rbac-proxy-node" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.243567 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="kube-rbac-proxy-node" Mar 20 15:36:30 crc kubenswrapper[4779]: E0320 15:36:30.243579 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="kubecfg-setup" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.243585 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="kubecfg-setup" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.243693 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="ovnkube-controller" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.243703 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="ovn-acl-logging" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.243711 4779 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="ovn-controller" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.243724 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="ovnkube-controller" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.243730 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.243740 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="sbdb" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.243747 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="ovnkube-controller" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.243753 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="kube-rbac-proxy-node" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.243762 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="nbdb" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.243774 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="northd" Mar 20 15:36:30 crc kubenswrapper[4779]: E0320 15:36:30.243875 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="ovnkube-controller" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.243883 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="ovnkube-controller" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.244006 4779 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="ovnkube-controller" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.244017 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="ovnkube-controller" Mar 20 15:36:30 crc kubenswrapper[4779]: E0320 15:36:30.244126 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="ovnkube-controller" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.244134 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerName="ovnkube-controller" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.245509 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.321673 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-etc-openvswitch\") pod \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.321732 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcxgv\" (UniqueName: \"kubernetes.io/projected/f27b5011-2d73-40e1-b508-a10e9c6f19a8-kube-api-access-kcxgv\") pod \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.321808 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f27b5011-2d73-40e1-b508-a10e9c6f19a8-ovn-node-metrics-cert\") pod \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") 
" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.322052 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-run-netns\") pod \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.322050 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "f27b5011-2d73-40e1-b508-a10e9c6f19a8" (UID: "f27b5011-2d73-40e1-b508-a10e9c6f19a8"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.322090 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f27b5011-2d73-40e1-b508-a10e9c6f19a8-ovnkube-config\") pod \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.322099 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "f27b5011-2d73-40e1-b508-a10e9c6f19a8" (UID: "f27b5011-2d73-40e1-b508-a10e9c6f19a8"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.322130 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-var-lib-openvswitch\") pod \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.322161 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-run-ovn\") pod \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.322193 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-node-log\") pod \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.322218 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.322237 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-slash\") pod \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.322271 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-run-openvswitch\") pod \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.322271 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "f27b5011-2d73-40e1-b508-a10e9c6f19a8" (UID: "f27b5011-2d73-40e1-b508-a10e9c6f19a8"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.322286 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-node-log" (OuterVolumeSpecName: "node-log") pod "f27b5011-2d73-40e1-b508-a10e9c6f19a8" (UID: "f27b5011-2d73-40e1-b508-a10e9c6f19a8"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.322294 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-log-socket\") pod \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.322321 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-log-socket" (OuterVolumeSpecName: "log-socket") pod "f27b5011-2d73-40e1-b508-a10e9c6f19a8" (UID: "f27b5011-2d73-40e1-b508-a10e9c6f19a8"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.322318 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-slash" (OuterVolumeSpecName: "host-slash") pod "f27b5011-2d73-40e1-b508-a10e9c6f19a8" (UID: "f27b5011-2d73-40e1-b508-a10e9c6f19a8"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.322341 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "f27b5011-2d73-40e1-b508-a10e9c6f19a8" (UID: "f27b5011-2d73-40e1-b508-a10e9c6f19a8"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.322342 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f27b5011-2d73-40e1-b508-a10e9c6f19a8-ovnkube-script-lib\") pod \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.322380 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-systemd-units\") pod \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.322421 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f27b5011-2d73-40e1-b508-a10e9c6f19a8-env-overrides\") pod 
\"f27b5011-2d73-40e1-b508-a10e9c6f19a8\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.322454 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-run-ovn-kubernetes\") pod \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.322475 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-kubelet\") pod \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.322517 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-cni-netd\") pod \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.322531 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f27b5011-2d73-40e1-b508-a10e9c6f19a8-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "f27b5011-2d73-40e1-b508-a10e9c6f19a8" (UID: "f27b5011-2d73-40e1-b508-a10e9c6f19a8"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.322535 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "f27b5011-2d73-40e1-b508-a10e9c6f19a8" (UID: "f27b5011-2d73-40e1-b508-a10e9c6f19a8"). 
InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.322552 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-run-systemd\") pod \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.322564 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "f27b5011-2d73-40e1-b508-a10e9c6f19a8" (UID: "f27b5011-2d73-40e1-b508-a10e9c6f19a8"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.322543 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "f27b5011-2d73-40e1-b508-a10e9c6f19a8" (UID: "f27b5011-2d73-40e1-b508-a10e9c6f19a8"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.322603 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "f27b5011-2d73-40e1-b508-a10e9c6f19a8" (UID: "f27b5011-2d73-40e1-b508-a10e9c6f19a8"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.322672 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-cni-bin\") pod \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\" (UID: \"f27b5011-2d73-40e1-b508-a10e9c6f19a8\") " Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.322703 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "f27b5011-2d73-40e1-b508-a10e9c6f19a8" (UID: "f27b5011-2d73-40e1-b508-a10e9c6f19a8"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.322671 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "f27b5011-2d73-40e1-b508-a10e9c6f19a8" (UID: "f27b5011-2d73-40e1-b508-a10e9c6f19a8"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.322922 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f27b5011-2d73-40e1-b508-a10e9c6f19a8-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "f27b5011-2d73-40e1-b508-a10e9c6f19a8" (UID: "f27b5011-2d73-40e1-b508-a10e9c6f19a8"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.322922 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-run-ovn\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.322971 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "f27b5011-2d73-40e1-b508-a10e9c6f19a8" (UID: "f27b5011-2d73-40e1-b508-a10e9c6f19a8"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.324381 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f27b5011-2d73-40e1-b508-a10e9c6f19a8-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "f27b5011-2d73-40e1-b508-a10e9c6f19a8" (UID: "f27b5011-2d73-40e1-b508-a10e9c6f19a8"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.326679 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-log-socket\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.326763 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-host-slash\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.326831 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-node-log\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.326872 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-etc-openvswitch\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.326963 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-run-systemd\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.327012 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-systemd-units\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.327066 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-var-lib-openvswitch\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.327123 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-host-cni-bin\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.327168 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.327228 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-host-cni-netd\") pod 
\"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.327319 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/65827a3c-1c62-4725-a06f-d63d8e53adba-ovnkube-config\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.327387 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-host-kubelet\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.327424 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-run-openvswitch\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.327482 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-host-run-netns\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.327527 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-host-run-ovn-kubernetes\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.335845 4779 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.336103 4779 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-log-socket\") on node \"crc\" DevicePath \"\"" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.336124 4779 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f27b5011-2d73-40e1-b508-a10e9c6f19a8-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.336136 4779 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.336146 4779 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f27b5011-2d73-40e1-b508-a10e9c6f19a8-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.336159 4779 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.336170 4779 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.336182 4779 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.336195 4779 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.336208 4779 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.336219 4779 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.336229 4779 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f27b5011-2d73-40e1-b508-a10e9c6f19a8-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.336239 4779 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.336251 4779 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 
15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.336261 4779 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-node-log\") on node \"crc\" DevicePath \"\"" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.336274 4779 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.336285 4779 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-host-slash\") on node \"crc\" DevicePath \"\"" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.338429 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f27b5011-2d73-40e1-b508-a10e9c6f19a8-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "f27b5011-2d73-40e1-b508-a10e9c6f19a8" (UID: "f27b5011-2d73-40e1-b508-a10e9c6f19a8"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.339484 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f27b5011-2d73-40e1-b508-a10e9c6f19a8-kube-api-access-kcxgv" (OuterVolumeSpecName: "kube-api-access-kcxgv") pod "f27b5011-2d73-40e1-b508-a10e9c6f19a8" (UID: "f27b5011-2d73-40e1-b508-a10e9c6f19a8"). InnerVolumeSpecName "kube-api-access-kcxgv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.351655 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "f27b5011-2d73-40e1-b508-a10e9c6f19a8" (UID: "f27b5011-2d73-40e1-b508-a10e9c6f19a8"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.437861 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-run-ovn\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.437932 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-log-socket\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.437958 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-host-slash\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.437988 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/65827a3c-1c62-4725-a06f-d63d8e53adba-ovnkube-script-lib\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.438014 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-node-log\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.438011 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-run-ovn\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.438035 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-etc-openvswitch\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.438078 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-host-slash\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.438093 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-systemd-units\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.438135 4779 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-run-systemd\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.438141 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-log-socket\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.438162 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-var-lib-openvswitch\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.438177 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-etc-openvswitch\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.438188 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-host-cni-bin\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.438210 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-node-log\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.438215 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.438247 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-run-systemd\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.438247 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lxzh\" (UniqueName: \"kubernetes.io/projected/65827a3c-1c62-4725-a06f-d63d8e53adba-kube-api-access-9lxzh\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.438299 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-var-lib-openvswitch\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.438299 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/65827a3c-1c62-4725-a06f-d63d8e53adba-ovn-node-metrics-cert\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.438299 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-systemd-units\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.438324 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-host-cni-bin\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.438353 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.438401 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-host-cni-netd\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.438438 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/65827a3c-1c62-4725-a06f-d63d8e53adba-ovnkube-config\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.438441 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-host-cni-netd\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.438475 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-host-kubelet\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.438504 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-host-kubelet\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.438513 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-run-openvswitch\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.438619 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-host-run-netns\") pod 
\"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.438562 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-run-openvswitch\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.438687 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/65827a3c-1c62-4725-a06f-d63d8e53adba-env-overrides\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.438714 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-host-run-ovn-kubernetes\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.438756 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-host-run-netns\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.438809 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65827a3c-1c62-4725-a06f-d63d8e53adba-host-run-ovn-kubernetes\") pod \"ovnkube-node-nxblr\" (UID: 
\"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.438919 4779 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f27b5011-2d73-40e1-b508-a10e9c6f19a8-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.438942 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcxgv\" (UniqueName: \"kubernetes.io/projected/f27b5011-2d73-40e1-b508-a10e9c6f19a8-kube-api-access-kcxgv\") on node \"crc\" DevicePath \"\"" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.438956 4779 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f27b5011-2d73-40e1-b508-a10e9c6f19a8-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.439270 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/65827a3c-1c62-4725-a06f-d63d8e53adba-ovnkube-config\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.540268 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/65827a3c-1c62-4725-a06f-d63d8e53adba-env-overrides\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.540389 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/65827a3c-1c62-4725-a06f-d63d8e53adba-ovnkube-script-lib\") pod \"ovnkube-node-nxblr\" (UID: 
\"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.540444 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/65827a3c-1c62-4725-a06f-d63d8e53adba-ovn-node-metrics-cert\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.540465 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lxzh\" (UniqueName: \"kubernetes.io/projected/65827a3c-1c62-4725-a06f-d63d8e53adba-kube-api-access-9lxzh\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.541227 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/65827a3c-1c62-4725-a06f-d63d8e53adba-ovnkube-script-lib\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.541371 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/65827a3c-1c62-4725-a06f-d63d8e53adba-env-overrides\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.544782 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/65827a3c-1c62-4725-a06f-d63d8e53adba-ovn-node-metrics-cert\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.566500 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lxzh\" (UniqueName: \"kubernetes.io/projected/65827a3c-1c62-4725-a06f-d63d8e53adba-kube-api-access-9lxzh\") pod \"ovnkube-node-nxblr\" (UID: \"65827a3c-1c62-4725-a06f-d63d8e53adba\") " pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.861868 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.952649 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" event={"ID":"65827a3c-1c62-4725-a06f-d63d8e53adba","Type":"ContainerStarted","Data":"86c80806115bf5c46ac87dc9b7b43cc6137719aa89afcaeae25443ca56964c8a"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.954494 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lfj25_c30ee189-9db1-41af-8a55-29955cbf6712/kube-multus/2.log" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.954845 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lfj25_c30ee189-9db1-41af-8a55-29955cbf6712/kube-multus/1.log" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.954870 4779 generic.go:334] "Generic (PLEG): container finished" podID="c30ee189-9db1-41af-8a55-29955cbf6712" containerID="73ee7f75298b875e96d17e303cd91ed74454ae5443ef86c092b8d91ef1008c68" exitCode=2 Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.954900 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lfj25" event={"ID":"c30ee189-9db1-41af-8a55-29955cbf6712","Type":"ContainerDied","Data":"73ee7f75298b875e96d17e303cd91ed74454ae5443ef86c092b8d91ef1008c68"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.954920 
4779 scope.go:117] "RemoveContainer" containerID="d5df3256fc301fb10f7bb94808c1ec30f56d47576b0e31daaf3bcd641773cbcc" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.955305 4779 scope.go:117] "RemoveContainer" containerID="73ee7f75298b875e96d17e303cd91ed74454ae5443ef86c092b8d91ef1008c68" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.957907 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2w2_f27b5011-2d73-40e1-b508-a10e9c6f19a8/ovnkube-controller/3.log" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.960538 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2w2_f27b5011-2d73-40e1-b508-a10e9c6f19a8/ovn-acl-logging/0.log" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.961062 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2w2_f27b5011-2d73-40e1-b508-a10e9c6f19a8/ovn-controller/0.log" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.961518 4779 generic.go:334] "Generic (PLEG): container finished" podID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerID="6c27b8f8180da0ad71249cbfa8966eefae97a7942f033b993572f2c66eaa6294" exitCode=0 Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.961545 4779 generic.go:334] "Generic (PLEG): container finished" podID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerID="65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02" exitCode=0 Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.961553 4779 generic.go:334] "Generic (PLEG): container finished" podID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerID="df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047" exitCode=0 Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.961560 4779 generic.go:334] "Generic (PLEG): container finished" podID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" 
containerID="e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a" exitCode=0 Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.961567 4779 generic.go:334] "Generic (PLEG): container finished" podID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerID="e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5" exitCode=0 Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.961575 4779 generic.go:334] "Generic (PLEG): container finished" podID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerID="17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82" exitCode=0 Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.961582 4779 generic.go:334] "Generic (PLEG): container finished" podID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" containerID="aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681" exitCode=143 Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.961605 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" event={"ID":"f27b5011-2d73-40e1-b508-a10e9c6f19a8","Type":"ContainerDied","Data":"6c27b8f8180da0ad71249cbfa8966eefae97a7942f033b993572f2c66eaa6294"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.961735 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.961926 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" event={"ID":"f27b5011-2d73-40e1-b508-a10e9c6f19a8","Type":"ContainerDied","Data":"65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.961956 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" event={"ID":"f27b5011-2d73-40e1-b508-a10e9c6f19a8","Type":"ContainerDied","Data":"df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.961969 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" event={"ID":"f27b5011-2d73-40e1-b508-a10e9c6f19a8","Type":"ContainerDied","Data":"e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.961981 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" event={"ID":"f27b5011-2d73-40e1-b508-a10e9c6f19a8","Type":"ContainerDied","Data":"e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.961995 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" event={"ID":"f27b5011-2d73-40e1-b508-a10e9c6f19a8","Type":"ContainerDied","Data":"17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.962007 4779 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6c27b8f8180da0ad71249cbfa8966eefae97a7942f033b993572f2c66eaa6294"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.962018 4779 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.962025 4779 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.962032 4779 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.962038 4779 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.962044 4779 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.962050 4779 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.962056 4779 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.962063 4779 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.962069 4779 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.962078 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" event={"ID":"f27b5011-2d73-40e1-b508-a10e9c6f19a8","Type":"ContainerDied","Data":"aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.962089 4779 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6c27b8f8180da0ad71249cbfa8966eefae97a7942f033b993572f2c66eaa6294"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.962097 4779 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.962103 4779 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.962135 4779 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.962142 4779 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.962148 4779 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5"} Mar 20 
15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.962156 4779 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.962162 4779 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.962168 4779 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.962175 4779 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.962185 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2w2" event={"ID":"f27b5011-2d73-40e1-b508-a10e9c6f19a8","Type":"ContainerDied","Data":"9ecf818626b26da45dfa904b2853098e14aa4610f03a9bbe9b5b49a1f31fbedf"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.962196 4779 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6c27b8f8180da0ad71249cbfa8966eefae97a7942f033b993572f2c66eaa6294"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.962204 4779 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.962210 4779 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.962217 4779 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.962223 4779 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.962230 4779 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.962236 4779 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.962243 4779 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.962249 4779 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.962256 4779 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874"} Mar 20 15:36:30 crc kubenswrapper[4779]: I0320 15:36:30.987805 4779 scope.go:117] "RemoveContainer" 
containerID="6c27b8f8180da0ad71249cbfa8966eefae97a7942f033b993572f2c66eaa6294" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.000559 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bl2w2"] Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.004576 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bl2w2"] Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.018570 4779 scope.go:117] "RemoveContainer" containerID="fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.090363 4779 scope.go:117] "RemoveContainer" containerID="65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.125848 4779 scope.go:117] "RemoveContainer" containerID="df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.142482 4779 scope.go:117] "RemoveContainer" containerID="e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.155275 4779 scope.go:117] "RemoveContainer" containerID="e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.166913 4779 scope.go:117] "RemoveContainer" containerID="17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.177410 4779 scope.go:117] "RemoveContainer" containerID="76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.191544 4779 scope.go:117] "RemoveContainer" containerID="aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.204436 4779 scope.go:117] "RemoveContainer" 
containerID="ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.220211 4779 scope.go:117] "RemoveContainer" containerID="6c27b8f8180da0ad71249cbfa8966eefae97a7942f033b993572f2c66eaa6294" Mar 20 15:36:31 crc kubenswrapper[4779]: E0320 15:36:31.225293 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c27b8f8180da0ad71249cbfa8966eefae97a7942f033b993572f2c66eaa6294\": container with ID starting with 6c27b8f8180da0ad71249cbfa8966eefae97a7942f033b993572f2c66eaa6294 not found: ID does not exist" containerID="6c27b8f8180da0ad71249cbfa8966eefae97a7942f033b993572f2c66eaa6294" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.225347 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c27b8f8180da0ad71249cbfa8966eefae97a7942f033b993572f2c66eaa6294"} err="failed to get container status \"6c27b8f8180da0ad71249cbfa8966eefae97a7942f033b993572f2c66eaa6294\": rpc error: code = NotFound desc = could not find container \"6c27b8f8180da0ad71249cbfa8966eefae97a7942f033b993572f2c66eaa6294\": container with ID starting with 6c27b8f8180da0ad71249cbfa8966eefae97a7942f033b993572f2c66eaa6294 not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.225370 4779 scope.go:117] "RemoveContainer" containerID="fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad" Mar 20 15:36:31 crc kubenswrapper[4779]: E0320 15:36:31.226157 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad\": container with ID starting with fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad not found: ID does not exist" containerID="fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad" Mar 20 15:36:31 crc 
kubenswrapper[4779]: I0320 15:36:31.226182 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad"} err="failed to get container status \"fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad\": rpc error: code = NotFound desc = could not find container \"fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad\": container with ID starting with fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.226204 4779 scope.go:117] "RemoveContainer" containerID="65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02" Mar 20 15:36:31 crc kubenswrapper[4779]: E0320 15:36:31.226926 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02\": container with ID starting with 65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02 not found: ID does not exist" containerID="65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.226950 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02"} err="failed to get container status \"65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02\": rpc error: code = NotFound desc = could not find container \"65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02\": container with ID starting with 65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02 not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.226966 4779 scope.go:117] "RemoveContainer" containerID="df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047" Mar 20 
15:36:31 crc kubenswrapper[4779]: E0320 15:36:31.227213 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047\": container with ID starting with df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047 not found: ID does not exist" containerID="df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.227234 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047"} err="failed to get container status \"df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047\": rpc error: code = NotFound desc = could not find container \"df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047\": container with ID starting with df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047 not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.227249 4779 scope.go:117] "RemoveContainer" containerID="e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a" Mar 20 15:36:31 crc kubenswrapper[4779]: E0320 15:36:31.228045 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a\": container with ID starting with e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a not found: ID does not exist" containerID="e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.228070 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a"} err="failed to get container status 
\"e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a\": rpc error: code = NotFound desc = could not find container \"e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a\": container with ID starting with e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.228089 4779 scope.go:117] "RemoveContainer" containerID="e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5" Mar 20 15:36:31 crc kubenswrapper[4779]: E0320 15:36:31.228596 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5\": container with ID starting with e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5 not found: ID does not exist" containerID="e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.228618 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5"} err="failed to get container status \"e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5\": rpc error: code = NotFound desc = could not find container \"e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5\": container with ID starting with e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5 not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.228633 4779 scope.go:117] "RemoveContainer" containerID="17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82" Mar 20 15:36:31 crc kubenswrapper[4779]: E0320 15:36:31.228860 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82\": container with ID starting with 17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82 not found: ID does not exist" containerID="17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.228877 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82"} err="failed to get container status \"17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82\": rpc error: code = NotFound desc = could not find container \"17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82\": container with ID starting with 17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82 not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.228889 4779 scope.go:117] "RemoveContainer" containerID="76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81" Mar 20 15:36:31 crc kubenswrapper[4779]: E0320 15:36:31.229279 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81\": container with ID starting with 76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81 not found: ID does not exist" containerID="76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.229307 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81"} err="failed to get container status \"76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81\": rpc error: code = NotFound desc = could not find container \"76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81\": container with ID 
starting with 76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81 not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.229320 4779 scope.go:117] "RemoveContainer" containerID="aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681" Mar 20 15:36:31 crc kubenswrapper[4779]: E0320 15:36:31.230196 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681\": container with ID starting with aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681 not found: ID does not exist" containerID="aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.230216 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681"} err="failed to get container status \"aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681\": rpc error: code = NotFound desc = could not find container \"aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681\": container with ID starting with aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681 not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.230229 4779 scope.go:117] "RemoveContainer" containerID="ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874" Mar 20 15:36:31 crc kubenswrapper[4779]: E0320 15:36:31.230437 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\": container with ID starting with ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874 not found: ID does not exist" containerID="ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874" Mar 20 
15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.230452 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874"} err="failed to get container status \"ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\": rpc error: code = NotFound desc = could not find container \"ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\": container with ID starting with ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874 not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.230462 4779 scope.go:117] "RemoveContainer" containerID="6c27b8f8180da0ad71249cbfa8966eefae97a7942f033b993572f2c66eaa6294" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.230673 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c27b8f8180da0ad71249cbfa8966eefae97a7942f033b993572f2c66eaa6294"} err="failed to get container status \"6c27b8f8180da0ad71249cbfa8966eefae97a7942f033b993572f2c66eaa6294\": rpc error: code = NotFound desc = could not find container \"6c27b8f8180da0ad71249cbfa8966eefae97a7942f033b993572f2c66eaa6294\": container with ID starting with 6c27b8f8180da0ad71249cbfa8966eefae97a7942f033b993572f2c66eaa6294 not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.230684 4779 scope.go:117] "RemoveContainer" containerID="fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.230863 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad"} err="failed to get container status \"fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad\": rpc error: code = NotFound desc = could not find container 
\"fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad\": container with ID starting with fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.230875 4779 scope.go:117] "RemoveContainer" containerID="65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.231037 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02"} err="failed to get container status \"65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02\": rpc error: code = NotFound desc = could not find container \"65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02\": container with ID starting with 65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02 not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.231047 4779 scope.go:117] "RemoveContainer" containerID="df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.231305 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047"} err="failed to get container status \"df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047\": rpc error: code = NotFound desc = could not find container \"df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047\": container with ID starting with df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047 not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.231318 4779 scope.go:117] "RemoveContainer" containerID="e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.231496 4779 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a"} err="failed to get container status \"e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a\": rpc error: code = NotFound desc = could not find container \"e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a\": container with ID starting with e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.231510 4779 scope.go:117] "RemoveContainer" containerID="e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.231680 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5"} err="failed to get container status \"e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5\": rpc error: code = NotFound desc = could not find container \"e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5\": container with ID starting with e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5 not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.231694 4779 scope.go:117] "RemoveContainer" containerID="17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.231847 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82"} err="failed to get container status \"17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82\": rpc error: code = NotFound desc = could not find container \"17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82\": container with ID starting with 
17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82 not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.231862 4779 scope.go:117] "RemoveContainer" containerID="76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.232009 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81"} err="failed to get container status \"76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81\": rpc error: code = NotFound desc = could not find container \"76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81\": container with ID starting with 76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81 not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.232021 4779 scope.go:117] "RemoveContainer" containerID="aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.232276 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681"} err="failed to get container status \"aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681\": rpc error: code = NotFound desc = could not find container \"aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681\": container with ID starting with aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681 not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.232290 4779 scope.go:117] "RemoveContainer" containerID="ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.232523 4779 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874"} err="failed to get container status \"ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\": rpc error: code = NotFound desc = could not find container \"ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\": container with ID starting with ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874 not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.232540 4779 scope.go:117] "RemoveContainer" containerID="6c27b8f8180da0ad71249cbfa8966eefae97a7942f033b993572f2c66eaa6294" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.232741 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c27b8f8180da0ad71249cbfa8966eefae97a7942f033b993572f2c66eaa6294"} err="failed to get container status \"6c27b8f8180da0ad71249cbfa8966eefae97a7942f033b993572f2c66eaa6294\": rpc error: code = NotFound desc = could not find container \"6c27b8f8180da0ad71249cbfa8966eefae97a7942f033b993572f2c66eaa6294\": container with ID starting with 6c27b8f8180da0ad71249cbfa8966eefae97a7942f033b993572f2c66eaa6294 not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.232759 4779 scope.go:117] "RemoveContainer" containerID="fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.232991 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad"} err="failed to get container status \"fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad\": rpc error: code = NotFound desc = could not find container \"fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad\": container with ID starting with fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad not found: ID does not 
exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.233014 4779 scope.go:117] "RemoveContainer" containerID="65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.233226 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02"} err="failed to get container status \"65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02\": rpc error: code = NotFound desc = could not find container \"65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02\": container with ID starting with 65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02 not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.233238 4779 scope.go:117] "RemoveContainer" containerID="df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.233436 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047"} err="failed to get container status \"df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047\": rpc error: code = NotFound desc = could not find container \"df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047\": container with ID starting with df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047 not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.233452 4779 scope.go:117] "RemoveContainer" containerID="e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.233867 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a"} err="failed to get container status 
\"e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a\": rpc error: code = NotFound desc = could not find container \"e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a\": container with ID starting with e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.233886 4779 scope.go:117] "RemoveContainer" containerID="e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.234079 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5"} err="failed to get container status \"e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5\": rpc error: code = NotFound desc = could not find container \"e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5\": container with ID starting with e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5 not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.234096 4779 scope.go:117] "RemoveContainer" containerID="17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.234297 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82"} err="failed to get container status \"17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82\": rpc error: code = NotFound desc = could not find container \"17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82\": container with ID starting with 17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82 not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.234312 4779 scope.go:117] "RemoveContainer" 
containerID="76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.234501 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81"} err="failed to get container status \"76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81\": rpc error: code = NotFound desc = could not find container \"76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81\": container with ID starting with 76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81 not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.234516 4779 scope.go:117] "RemoveContainer" containerID="aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.234841 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681"} err="failed to get container status \"aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681\": rpc error: code = NotFound desc = could not find container \"aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681\": container with ID starting with aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681 not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.234857 4779 scope.go:117] "RemoveContainer" containerID="ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.235087 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874"} err="failed to get container status \"ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\": rpc error: code = NotFound desc = could 
not find container \"ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\": container with ID starting with ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874 not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.235116 4779 scope.go:117] "RemoveContainer" containerID="6c27b8f8180da0ad71249cbfa8966eefae97a7942f033b993572f2c66eaa6294" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.235324 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c27b8f8180da0ad71249cbfa8966eefae97a7942f033b993572f2c66eaa6294"} err="failed to get container status \"6c27b8f8180da0ad71249cbfa8966eefae97a7942f033b993572f2c66eaa6294\": rpc error: code = NotFound desc = could not find container \"6c27b8f8180da0ad71249cbfa8966eefae97a7942f033b993572f2c66eaa6294\": container with ID starting with 6c27b8f8180da0ad71249cbfa8966eefae97a7942f033b993572f2c66eaa6294 not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.235343 4779 scope.go:117] "RemoveContainer" containerID="fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.235577 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad"} err="failed to get container status \"fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad\": rpc error: code = NotFound desc = could not find container \"fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad\": container with ID starting with fe276e6bd8ab98aa47a3e5c7c1018bd7ea665e2c540ab1a8c18c8c73bdf634ad not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.235618 4779 scope.go:117] "RemoveContainer" containerID="65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 
15:36:31.235884 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02"} err="failed to get container status \"65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02\": rpc error: code = NotFound desc = could not find container \"65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02\": container with ID starting with 65ce103bec725128c028a3df9da4f242371e3069c280f19121fa02cef2ed4c02 not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.235899 4779 scope.go:117] "RemoveContainer" containerID="df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.236083 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047"} err="failed to get container status \"df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047\": rpc error: code = NotFound desc = could not find container \"df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047\": container with ID starting with df02263b94de76f91d86383c9f80998f5d56b7e327d72473dabec7e9fcefa047 not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.236097 4779 scope.go:117] "RemoveContainer" containerID="e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.236311 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a"} err="failed to get container status \"e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a\": rpc error: code = NotFound desc = could not find container \"e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a\": container with ID starting with 
e0f0013aacabb9da65deb9ac984a3dee61e91841b65162d3b72b4cf020d8aa1a not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.236333 4779 scope.go:117] "RemoveContainer" containerID="e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.236622 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5"} err="failed to get container status \"e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5\": rpc error: code = NotFound desc = could not find container \"e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5\": container with ID starting with e657e9f5f31c09176d1b2102550a79979ce5e7229fd9470ca3a46888eb7cebf5 not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.236639 4779 scope.go:117] "RemoveContainer" containerID="17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.236832 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82"} err="failed to get container status \"17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82\": rpc error: code = NotFound desc = could not find container \"17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82\": container with ID starting with 17f095381ab298691b08cc4d983217281eeeb74826449a28174a841a4dc8aa82 not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.236848 4779 scope.go:117] "RemoveContainer" containerID="76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.237209 4779 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81"} err="failed to get container status \"76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81\": rpc error: code = NotFound desc = could not find container \"76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81\": container with ID starting with 76f58fa48bcd13a5d08af93310b006935e8f1e244c8c6c5a8a18bf40c6166b81 not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.237226 4779 scope.go:117] "RemoveContainer" containerID="aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.237486 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681"} err="failed to get container status \"aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681\": rpc error: code = NotFound desc = could not find container \"aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681\": container with ID starting with aa2ff5716b2d0b364bbb76d34f0e08d780286d9843f5d1dd60cca7c20012e681 not found: ID does not exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.237499 4779 scope.go:117] "RemoveContainer" containerID="ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.237684 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874"} err="failed to get container status \"ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\": rpc error: code = NotFound desc = could not find container \"ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874\": container with ID starting with ba12c6fb5d3f94b41ffe109f3a80378245a6cac2b60aab89f88fdaeaeb6f3874 not found: ID does not 
exist" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.814612 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f27b5011-2d73-40e1-b508-a10e9c6f19a8" path="/var/lib/kubelet/pods/f27b5011-2d73-40e1-b508-a10e9c6f19a8/volumes" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.968195 4779 generic.go:334] "Generic (PLEG): container finished" podID="65827a3c-1c62-4725-a06f-d63d8e53adba" containerID="7efe137b8b8118c7512ba263367d835cc052ced38a80a8eaca4598b7f93663ee" exitCode=0 Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.968292 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" event={"ID":"65827a3c-1c62-4725-a06f-d63d8e53adba","Type":"ContainerDied","Data":"7efe137b8b8118c7512ba263367d835cc052ced38a80a8eaca4598b7f93663ee"} Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.971505 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lfj25_c30ee189-9db1-41af-8a55-29955cbf6712/kube-multus/2.log" Mar 20 15:36:31 crc kubenswrapper[4779]: I0320 15:36:31.971595 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lfj25" event={"ID":"c30ee189-9db1-41af-8a55-29955cbf6712","Type":"ContainerStarted","Data":"7d586112001a8752ebe2ddf5b40f7069398622f35af4e9f9d47d0cf0a96a11c7"} Mar 20 15:36:32 crc kubenswrapper[4779]: I0320 15:36:32.980846 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" event={"ID":"65827a3c-1c62-4725-a06f-d63d8e53adba","Type":"ContainerStarted","Data":"1c235c31944ddba256b28b26f9ce62e67cdf2dd3eee08376bedfae790612b8b8"} Mar 20 15:36:32 crc kubenswrapper[4779]: I0320 15:36:32.981164 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" event={"ID":"65827a3c-1c62-4725-a06f-d63d8e53adba","Type":"ContainerStarted","Data":"11216d1d3b410b4c953427910e51ed7ec40b08ad276e0e0135af745bd3a9e9ac"} Mar 20 
15:36:32 crc kubenswrapper[4779]: I0320 15:36:32.981178 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" event={"ID":"65827a3c-1c62-4725-a06f-d63d8e53adba","Type":"ContainerStarted","Data":"aa85b83dd841c3ff9e8b358f9a6a52c6d169bb2c1d60ff14997238cc6864052c"} Mar 20 15:36:32 crc kubenswrapper[4779]: I0320 15:36:32.981190 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" event={"ID":"65827a3c-1c62-4725-a06f-d63d8e53adba","Type":"ContainerStarted","Data":"32794b91972138ed265dbc868140eb11a605ac1de2452ce263535f9d29d3c559"} Mar 20 15:36:32 crc kubenswrapper[4779]: I0320 15:36:32.981200 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" event={"ID":"65827a3c-1c62-4725-a06f-d63d8e53adba","Type":"ContainerStarted","Data":"d9ec9ef73fb54902b543dae02d0698b6abc51cc9595e547a3edcfdee619998c3"} Mar 20 15:36:32 crc kubenswrapper[4779]: I0320 15:36:32.981210 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" event={"ID":"65827a3c-1c62-4725-a06f-d63d8e53adba","Type":"ContainerStarted","Data":"d10883b3c7cb12baa7f2a0d9dcc984bfde31b28e88280f2249efaf795ce69be5"} Mar 20 15:36:34 crc kubenswrapper[4779]: I0320 15:36:34.995462 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" event={"ID":"65827a3c-1c62-4725-a06f-d63d8e53adba","Type":"ContainerStarted","Data":"25fe3cf2b608d937c444c61f4d52edfe1cf71f96b565ec39d7e09a6abe3e741c"} Mar 20 15:36:37 crc kubenswrapper[4779]: I0320 15:36:37.007827 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" event={"ID":"65827a3c-1c62-4725-a06f-d63d8e53adba","Type":"ContainerStarted","Data":"a917fd6157e0f47b62e889ae9b34c8c75932c2558bc92af3662632bae9335a51"} Mar 20 15:36:37 crc kubenswrapper[4779]: I0320 15:36:37.008241 4779 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:37 crc kubenswrapper[4779]: I0320 15:36:37.008840 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:37 crc kubenswrapper[4779]: I0320 15:36:37.008856 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:37 crc kubenswrapper[4779]: I0320 15:36:37.032868 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" podStartSLOduration=7.032853362 podStartE2EDuration="7.032853362s" podCreationTimestamp="2026-03-20 15:36:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:36:37.032085524 +0000 UTC m=+813.994601364" watchObservedRunningTime="2026-03-20 15:36:37.032853362 +0000 UTC m=+813.995369152" Mar 20 15:36:37 crc kubenswrapper[4779]: I0320 15:36:37.036748 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:37 crc kubenswrapper[4779]: I0320 15:36:37.037966 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:36:53 crc kubenswrapper[4779]: I0320 15:36:53.699923 4779 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 15:36:55 crc kubenswrapper[4779]: I0320 15:36:55.150101 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:36:55 crc kubenswrapper[4779]: 
I0320 15:36:55.150523 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:36:56 crc kubenswrapper[4779]: I0320 15:36:56.150423 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726m2pc5"] Mar 20 15:36:56 crc kubenswrapper[4779]: I0320 15:36:56.151608 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726m2pc5" Mar 20 15:36:56 crc kubenswrapper[4779]: I0320 15:36:56.153616 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 15:36:56 crc kubenswrapper[4779]: I0320 15:36:56.160306 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726m2pc5"] Mar 20 15:36:56 crc kubenswrapper[4779]: I0320 15:36:56.256199 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/11a586de-ae83-4a62-92c7-51c09b869b36-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726m2pc5\" (UID: \"11a586de-ae83-4a62-92c7-51c09b869b36\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726m2pc5" Mar 20 15:36:56 crc kubenswrapper[4779]: I0320 15:36:56.256367 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd44l\" (UniqueName: \"kubernetes.io/projected/11a586de-ae83-4a62-92c7-51c09b869b36-kube-api-access-zd44l\") pod 
\"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726m2pc5\" (UID: \"11a586de-ae83-4a62-92c7-51c09b869b36\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726m2pc5" Mar 20 15:36:56 crc kubenswrapper[4779]: I0320 15:36:56.256393 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/11a586de-ae83-4a62-92c7-51c09b869b36-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726m2pc5\" (UID: \"11a586de-ae83-4a62-92c7-51c09b869b36\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726m2pc5" Mar 20 15:36:56 crc kubenswrapper[4779]: I0320 15:36:56.357883 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd44l\" (UniqueName: \"kubernetes.io/projected/11a586de-ae83-4a62-92c7-51c09b869b36-kube-api-access-zd44l\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726m2pc5\" (UID: \"11a586de-ae83-4a62-92c7-51c09b869b36\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726m2pc5" Mar 20 15:36:56 crc kubenswrapper[4779]: I0320 15:36:56.357932 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/11a586de-ae83-4a62-92c7-51c09b869b36-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726m2pc5\" (UID: \"11a586de-ae83-4a62-92c7-51c09b869b36\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726m2pc5" Mar 20 15:36:56 crc kubenswrapper[4779]: I0320 15:36:56.357973 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/11a586de-ae83-4a62-92c7-51c09b869b36-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726m2pc5\" (UID: \"11a586de-ae83-4a62-92c7-51c09b869b36\") " 
pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726m2pc5" Mar 20 15:36:56 crc kubenswrapper[4779]: I0320 15:36:56.358523 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/11a586de-ae83-4a62-92c7-51c09b869b36-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726m2pc5\" (UID: \"11a586de-ae83-4a62-92c7-51c09b869b36\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726m2pc5" Mar 20 15:36:56 crc kubenswrapper[4779]: I0320 15:36:56.358589 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/11a586de-ae83-4a62-92c7-51c09b869b36-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726m2pc5\" (UID: \"11a586de-ae83-4a62-92c7-51c09b869b36\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726m2pc5" Mar 20 15:36:56 crc kubenswrapper[4779]: I0320 15:36:56.377984 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd44l\" (UniqueName: \"kubernetes.io/projected/11a586de-ae83-4a62-92c7-51c09b869b36-kube-api-access-zd44l\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726m2pc5\" (UID: \"11a586de-ae83-4a62-92c7-51c09b869b36\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726m2pc5" Mar 20 15:36:56 crc kubenswrapper[4779]: I0320 15:36:56.479057 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726m2pc5" Mar 20 15:36:56 crc kubenswrapper[4779]: I0320 15:36:56.655745 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726m2pc5"] Mar 20 15:36:57 crc kubenswrapper[4779]: I0320 15:36:57.110795 4779 generic.go:334] "Generic (PLEG): container finished" podID="11a586de-ae83-4a62-92c7-51c09b869b36" containerID="6ae7c05edc5fb15ec6bd4b17e08a0944135854d1d4553cee78fdff69733a3c1c" exitCode=0 Mar 20 15:36:57 crc kubenswrapper[4779]: I0320 15:36:57.110844 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726m2pc5" event={"ID":"11a586de-ae83-4a62-92c7-51c09b869b36","Type":"ContainerDied","Data":"6ae7c05edc5fb15ec6bd4b17e08a0944135854d1d4553cee78fdff69733a3c1c"} Mar 20 15:36:57 crc kubenswrapper[4779]: I0320 15:36:57.110886 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726m2pc5" event={"ID":"11a586de-ae83-4a62-92c7-51c09b869b36","Type":"ContainerStarted","Data":"564d65ef37235814a6d54b76d616217191c967667e6b4a5aaea0d9a279cda2eb"} Mar 20 15:36:58 crc kubenswrapper[4779]: I0320 15:36:58.513294 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-scl8h"] Mar 20 15:36:58 crc kubenswrapper[4779]: I0320 15:36:58.518315 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-scl8h" Mar 20 15:36:58 crc kubenswrapper[4779]: I0320 15:36:58.522729 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-scl8h"] Mar 20 15:36:58 crc kubenswrapper[4779]: I0320 15:36:58.588945 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2nqg\" (UniqueName: \"kubernetes.io/projected/209da1d2-e7be-409b-9c46-62dbd5065706-kube-api-access-m2nqg\") pod \"redhat-operators-scl8h\" (UID: \"209da1d2-e7be-409b-9c46-62dbd5065706\") " pod="openshift-marketplace/redhat-operators-scl8h" Mar 20 15:36:58 crc kubenswrapper[4779]: I0320 15:36:58.589505 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/209da1d2-e7be-409b-9c46-62dbd5065706-utilities\") pod \"redhat-operators-scl8h\" (UID: \"209da1d2-e7be-409b-9c46-62dbd5065706\") " pod="openshift-marketplace/redhat-operators-scl8h" Mar 20 15:36:58 crc kubenswrapper[4779]: I0320 15:36:58.589697 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/209da1d2-e7be-409b-9c46-62dbd5065706-catalog-content\") pod \"redhat-operators-scl8h\" (UID: \"209da1d2-e7be-409b-9c46-62dbd5065706\") " pod="openshift-marketplace/redhat-operators-scl8h" Mar 20 15:36:58 crc kubenswrapper[4779]: I0320 15:36:58.691397 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2nqg\" (UniqueName: \"kubernetes.io/projected/209da1d2-e7be-409b-9c46-62dbd5065706-kube-api-access-m2nqg\") pod \"redhat-operators-scl8h\" (UID: \"209da1d2-e7be-409b-9c46-62dbd5065706\") " pod="openshift-marketplace/redhat-operators-scl8h" Mar 20 15:36:58 crc kubenswrapper[4779]: I0320 15:36:58.691487 4779 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/209da1d2-e7be-409b-9c46-62dbd5065706-utilities\") pod \"redhat-operators-scl8h\" (UID: \"209da1d2-e7be-409b-9c46-62dbd5065706\") " pod="openshift-marketplace/redhat-operators-scl8h" Mar 20 15:36:58 crc kubenswrapper[4779]: I0320 15:36:58.691932 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/209da1d2-e7be-409b-9c46-62dbd5065706-utilities\") pod \"redhat-operators-scl8h\" (UID: \"209da1d2-e7be-409b-9c46-62dbd5065706\") " pod="openshift-marketplace/redhat-operators-scl8h" Mar 20 15:36:58 crc kubenswrapper[4779]: I0320 15:36:58.691994 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/209da1d2-e7be-409b-9c46-62dbd5065706-catalog-content\") pod \"redhat-operators-scl8h\" (UID: \"209da1d2-e7be-409b-9c46-62dbd5065706\") " pod="openshift-marketplace/redhat-operators-scl8h" Mar 20 15:36:58 crc kubenswrapper[4779]: I0320 15:36:58.692274 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/209da1d2-e7be-409b-9c46-62dbd5065706-catalog-content\") pod \"redhat-operators-scl8h\" (UID: \"209da1d2-e7be-409b-9c46-62dbd5065706\") " pod="openshift-marketplace/redhat-operators-scl8h" Mar 20 15:36:58 crc kubenswrapper[4779]: I0320 15:36:58.712273 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2nqg\" (UniqueName: \"kubernetes.io/projected/209da1d2-e7be-409b-9c46-62dbd5065706-kube-api-access-m2nqg\") pod \"redhat-operators-scl8h\" (UID: \"209da1d2-e7be-409b-9c46-62dbd5065706\") " pod="openshift-marketplace/redhat-operators-scl8h" Mar 20 15:36:58 crc kubenswrapper[4779]: I0320 15:36:58.847144 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-scl8h" Mar 20 15:36:59 crc kubenswrapper[4779]: I0320 15:36:59.043500 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-scl8h"] Mar 20 15:36:59 crc kubenswrapper[4779]: I0320 15:36:59.122793 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-scl8h" event={"ID":"209da1d2-e7be-409b-9c46-62dbd5065706","Type":"ContainerStarted","Data":"cf86f66d6dafbf11474ef6d27111f32c753fcd77a4c930ceabece73782ff1cf2"} Mar 20 15:36:59 crc kubenswrapper[4779]: I0320 15:36:59.124803 4779 generic.go:334] "Generic (PLEG): container finished" podID="11a586de-ae83-4a62-92c7-51c09b869b36" containerID="767f6f0f75a5a24f8a139acdbca8d0fda4803112a79d02ce0d309f4d416ee9e6" exitCode=0 Mar 20 15:36:59 crc kubenswrapper[4779]: I0320 15:36:59.124840 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726m2pc5" event={"ID":"11a586de-ae83-4a62-92c7-51c09b869b36","Type":"ContainerDied","Data":"767f6f0f75a5a24f8a139acdbca8d0fda4803112a79d02ce0d309f4d416ee9e6"} Mar 20 15:37:00 crc kubenswrapper[4779]: I0320 15:37:00.154845 4779 generic.go:334] "Generic (PLEG): container finished" podID="11a586de-ae83-4a62-92c7-51c09b869b36" containerID="563fa3f0f08cd56d670bd3a8d3eb82f366fe7f52ed127f940ea8c6045d809bbb" exitCode=0 Mar 20 15:37:00 crc kubenswrapper[4779]: I0320 15:37:00.154900 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726m2pc5" event={"ID":"11a586de-ae83-4a62-92c7-51c09b869b36","Type":"ContainerDied","Data":"563fa3f0f08cd56d670bd3a8d3eb82f366fe7f52ed127f940ea8c6045d809bbb"} Mar 20 15:37:00 crc kubenswrapper[4779]: I0320 15:37:00.156205 4779 generic.go:334] "Generic (PLEG): container finished" podID="209da1d2-e7be-409b-9c46-62dbd5065706" 
containerID="8ad6dd353fbcb731cad117ec7d7ef30d9ed3d79021595e7168943a7b74e2c255" exitCode=0 Mar 20 15:37:00 crc kubenswrapper[4779]: I0320 15:37:00.156226 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-scl8h" event={"ID":"209da1d2-e7be-409b-9c46-62dbd5065706","Type":"ContainerDied","Data":"8ad6dd353fbcb731cad117ec7d7ef30d9ed3d79021595e7168943a7b74e2c255"} Mar 20 15:37:00 crc kubenswrapper[4779]: I0320 15:37:00.883004 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nxblr" Mar 20 15:37:01 crc kubenswrapper[4779]: I0320 15:37:01.411296 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726m2pc5" Mar 20 15:37:01 crc kubenswrapper[4779]: I0320 15:37:01.525052 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/11a586de-ae83-4a62-92c7-51c09b869b36-util\") pod \"11a586de-ae83-4a62-92c7-51c09b869b36\" (UID: \"11a586de-ae83-4a62-92c7-51c09b869b36\") " Mar 20 15:37:01 crc kubenswrapper[4779]: I0320 15:37:01.525172 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/11a586de-ae83-4a62-92c7-51c09b869b36-bundle\") pod \"11a586de-ae83-4a62-92c7-51c09b869b36\" (UID: \"11a586de-ae83-4a62-92c7-51c09b869b36\") " Mar 20 15:37:01 crc kubenswrapper[4779]: I0320 15:37:01.525207 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd44l\" (UniqueName: \"kubernetes.io/projected/11a586de-ae83-4a62-92c7-51c09b869b36-kube-api-access-zd44l\") pod \"11a586de-ae83-4a62-92c7-51c09b869b36\" (UID: \"11a586de-ae83-4a62-92c7-51c09b869b36\") " Mar 20 15:37:01 crc kubenswrapper[4779]: I0320 15:37:01.527625 4779 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/11a586de-ae83-4a62-92c7-51c09b869b36-bundle" (OuterVolumeSpecName: "bundle") pod "11a586de-ae83-4a62-92c7-51c09b869b36" (UID: "11a586de-ae83-4a62-92c7-51c09b869b36"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:37:01 crc kubenswrapper[4779]: I0320 15:37:01.530902 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11a586de-ae83-4a62-92c7-51c09b869b36-kube-api-access-zd44l" (OuterVolumeSpecName: "kube-api-access-zd44l") pod "11a586de-ae83-4a62-92c7-51c09b869b36" (UID: "11a586de-ae83-4a62-92c7-51c09b869b36"). InnerVolumeSpecName "kube-api-access-zd44l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:37:01 crc kubenswrapper[4779]: I0320 15:37:01.539974 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11a586de-ae83-4a62-92c7-51c09b869b36-util" (OuterVolumeSpecName: "util") pod "11a586de-ae83-4a62-92c7-51c09b869b36" (UID: "11a586de-ae83-4a62-92c7-51c09b869b36"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:37:01 crc kubenswrapper[4779]: I0320 15:37:01.626811 4779 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/11a586de-ae83-4a62-92c7-51c09b869b36-util\") on node \"crc\" DevicePath \"\"" Mar 20 15:37:01 crc kubenswrapper[4779]: I0320 15:37:01.626857 4779 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/11a586de-ae83-4a62-92c7-51c09b869b36-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:37:01 crc kubenswrapper[4779]: I0320 15:37:01.626866 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zd44l\" (UniqueName: \"kubernetes.io/projected/11a586de-ae83-4a62-92c7-51c09b869b36-kube-api-access-zd44l\") on node \"crc\" DevicePath \"\"" Mar 20 15:37:02 crc kubenswrapper[4779]: I0320 15:37:02.168049 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726m2pc5" event={"ID":"11a586de-ae83-4a62-92c7-51c09b869b36","Type":"ContainerDied","Data":"564d65ef37235814a6d54b76d616217191c967667e6b4a5aaea0d9a279cda2eb"} Mar 20 15:37:02 crc kubenswrapper[4779]: I0320 15:37:02.168070 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726m2pc5" Mar 20 15:37:02 crc kubenswrapper[4779]: I0320 15:37:02.168092 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="564d65ef37235814a6d54b76d616217191c967667e6b4a5aaea0d9a279cda2eb" Mar 20 15:37:02 crc kubenswrapper[4779]: I0320 15:37:02.169688 4779 generic.go:334] "Generic (PLEG): container finished" podID="209da1d2-e7be-409b-9c46-62dbd5065706" containerID="56aa48e42159aa4090d437e74f5d3d26ef6d47bb4f899ec1e9f80b1b4eba7a17" exitCode=0 Mar 20 15:37:02 crc kubenswrapper[4779]: I0320 15:37:02.169714 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-scl8h" event={"ID":"209da1d2-e7be-409b-9c46-62dbd5065706","Type":"ContainerDied","Data":"56aa48e42159aa4090d437e74f5d3d26ef6d47bb4f899ec1e9f80b1b4eba7a17"} Mar 20 15:37:02 crc kubenswrapper[4779]: I0320 15:37:02.171434 4779 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 15:37:03 crc kubenswrapper[4779]: I0320 15:37:03.176405 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-scl8h" event={"ID":"209da1d2-e7be-409b-9c46-62dbd5065706","Type":"ContainerStarted","Data":"48861b4b053481525d017339f8712221ec42c6470a6e0c26c9f30fd29a8a4bdd"} Mar 20 15:37:03 crc kubenswrapper[4779]: I0320 15:37:03.194277 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-scl8h" podStartSLOduration=2.691118759 podStartE2EDuration="5.194257222s" podCreationTimestamp="2026-03-20 15:36:58 +0000 UTC" firstStartedPulling="2026-03-20 15:37:00.157540549 +0000 UTC m=+837.120056349" lastFinishedPulling="2026-03-20 15:37:02.660679012 +0000 UTC m=+839.623194812" observedRunningTime="2026-03-20 15:37:03.192552981 +0000 UTC m=+840.155068781" watchObservedRunningTime="2026-03-20 15:37:03.194257222 +0000 
UTC m=+840.156773022" Mar 20 15:37:08 crc kubenswrapper[4779]: I0320 15:37:08.847726 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-scl8h" Mar 20 15:37:08 crc kubenswrapper[4779]: I0320 15:37:08.848903 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-scl8h" Mar 20 15:37:09 crc kubenswrapper[4779]: I0320 15:37:09.914919 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-scl8h" podUID="209da1d2-e7be-409b-9c46-62dbd5065706" containerName="registry-server" probeResult="failure" output=< Mar 20 15:37:09 crc kubenswrapper[4779]: timeout: failed to connect service ":50051" within 1s Mar 20 15:37:09 crc kubenswrapper[4779]: > Mar 20 15:37:14 crc kubenswrapper[4779]: I0320 15:37:14.688369 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-sqzkm"] Mar 20 15:37:14 crc kubenswrapper[4779]: E0320 15:37:14.688941 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11a586de-ae83-4a62-92c7-51c09b869b36" containerName="extract" Mar 20 15:37:14 crc kubenswrapper[4779]: I0320 15:37:14.688957 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a586de-ae83-4a62-92c7-51c09b869b36" containerName="extract" Mar 20 15:37:14 crc kubenswrapper[4779]: E0320 15:37:14.688976 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11a586de-ae83-4a62-92c7-51c09b869b36" containerName="pull" Mar 20 15:37:14 crc kubenswrapper[4779]: I0320 15:37:14.688984 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a586de-ae83-4a62-92c7-51c09b869b36" containerName="pull" Mar 20 15:37:14 crc kubenswrapper[4779]: E0320 15:37:14.688998 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11a586de-ae83-4a62-92c7-51c09b869b36" containerName="util" Mar 20 15:37:14 crc kubenswrapper[4779]: I0320 
15:37:14.689007 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a586de-ae83-4a62-92c7-51c09b869b36" containerName="util" Mar 20 15:37:14 crc kubenswrapper[4779]: I0320 15:37:14.689165 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="11a586de-ae83-4a62-92c7-51c09b869b36" containerName="extract" Mar 20 15:37:14 crc kubenswrapper[4779]: I0320 15:37:14.689655 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-sqzkm" Mar 20 15:37:14 crc kubenswrapper[4779]: I0320 15:37:14.700009 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 20 15:37:14 crc kubenswrapper[4779]: I0320 15:37:14.700271 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 20 15:37:14 crc kubenswrapper[4779]: I0320 15:37:14.700477 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-89qcn" Mar 20 15:37:14 crc kubenswrapper[4779]: I0320 15:37:14.711561 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-sqzkm"] Mar 20 15:37:14 crc kubenswrapper[4779]: I0320 15:37:14.779546 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2bv4\" (UniqueName: \"kubernetes.io/projected/569c99c8-036a-41d3-9068-e893c1e067fb-kube-api-access-l2bv4\") pod \"obo-prometheus-operator-8ff7d675-sqzkm\" (UID: \"569c99c8-036a-41d3-9068-e893c1e067fb\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-sqzkm" Mar 20 15:37:14 crc kubenswrapper[4779]: I0320 15:37:14.880689 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2bv4\" (UniqueName: \"kubernetes.io/projected/569c99c8-036a-41d3-9068-e893c1e067fb-kube-api-access-l2bv4\") pod 
\"obo-prometheus-operator-8ff7d675-sqzkm\" (UID: \"569c99c8-036a-41d3-9068-e893c1e067fb\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-sqzkm" Mar 20 15:37:14 crc kubenswrapper[4779]: I0320 15:37:14.923950 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2bv4\" (UniqueName: \"kubernetes.io/projected/569c99c8-036a-41d3-9068-e893c1e067fb-kube-api-access-l2bv4\") pod \"obo-prometheus-operator-8ff7d675-sqzkm\" (UID: \"569c99c8-036a-41d3-9068-e893c1e067fb\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-sqzkm" Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.017328 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-sqzkm" Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.072939 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-546b9f4886-f8l8t"] Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.073745 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-546b9f4886-f8l8t" Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.077182 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.077192 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-6jklg" Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.077797 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-546b9f4886-nxpvj"] Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.078663 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-546b9f4886-nxpvj" Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.083558 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/aa1321a8-47aa-4e23-88a9-9fbc2eabe628-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-546b9f4886-f8l8t\" (UID: \"aa1321a8-47aa-4e23-88a9-9fbc2eabe628\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-546b9f4886-f8l8t" Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.083632 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aa1321a8-47aa-4e23-88a9-9fbc2eabe628-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-546b9f4886-f8l8t\" (UID: \"aa1321a8-47aa-4e23-88a9-9fbc2eabe628\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-546b9f4886-f8l8t" Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.099325 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-546b9f4886-nxpvj"] Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.111380 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-546b9f4886-f8l8t"] Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.197407 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aa1321a8-47aa-4e23-88a9-9fbc2eabe628-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-546b9f4886-f8l8t\" (UID: \"aa1321a8-47aa-4e23-88a9-9fbc2eabe628\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-546b9f4886-f8l8t" Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.197453 4779 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/30bff49f-d894-4d30-84f6-b845f0f3dbe2-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-546b9f4886-nxpvj\" (UID: \"30bff49f-d894-4d30-84f6-b845f0f3dbe2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-546b9f4886-nxpvj" Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.197483 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/30bff49f-d894-4d30-84f6-b845f0f3dbe2-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-546b9f4886-nxpvj\" (UID: \"30bff49f-d894-4d30-84f6-b845f0f3dbe2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-546b9f4886-nxpvj" Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.197540 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/aa1321a8-47aa-4e23-88a9-9fbc2eabe628-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-546b9f4886-f8l8t\" (UID: \"aa1321a8-47aa-4e23-88a9-9fbc2eabe628\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-546b9f4886-f8l8t" Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.202769 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aa1321a8-47aa-4e23-88a9-9fbc2eabe628-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-546b9f4886-f8l8t\" (UID: \"aa1321a8-47aa-4e23-88a9-9fbc2eabe628\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-546b9f4886-f8l8t" Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.216574 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/aa1321a8-47aa-4e23-88a9-9fbc2eabe628-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-546b9f4886-f8l8t\" (UID: \"aa1321a8-47aa-4e23-88a9-9fbc2eabe628\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-546b9f4886-f8l8t" Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.298824 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/30bff49f-d894-4d30-84f6-b845f0f3dbe2-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-546b9f4886-nxpvj\" (UID: \"30bff49f-d894-4d30-84f6-b845f0f3dbe2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-546b9f4886-nxpvj" Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.298894 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/30bff49f-d894-4d30-84f6-b845f0f3dbe2-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-546b9f4886-nxpvj\" (UID: \"30bff49f-d894-4d30-84f6-b845f0f3dbe2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-546b9f4886-nxpvj" Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.302704 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/30bff49f-d894-4d30-84f6-b845f0f3dbe2-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-546b9f4886-nxpvj\" (UID: \"30bff49f-d894-4d30-84f6-b845f0f3dbe2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-546b9f4886-nxpvj" Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.303202 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/30bff49f-d894-4d30-84f6-b845f0f3dbe2-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-546b9f4886-nxpvj\" (UID: \"30bff49f-d894-4d30-84f6-b845f0f3dbe2\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-546b9f4886-nxpvj" Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.391037 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-546b9f4886-f8l8t" Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.399054 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-546b9f4886-nxpvj" Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.402147 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-rv4tl"] Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.403027 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-rv4tl" Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.405178 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-rdqdx" Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.405534 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.471197 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-rv4tl"] Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.500856 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f1f80bc-9361-4642-84e6-99e781ac2fad-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-rv4tl\" (UID: \"5f1f80bc-9361-4642-84e6-99e781ac2fad\") " pod="openshift-operators/observability-operator-6dd7dd855f-rv4tl" Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 
15:37:15.500902 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nsd2\" (UniqueName: \"kubernetes.io/projected/5f1f80bc-9361-4642-84e6-99e781ac2fad-kube-api-access-8nsd2\") pod \"observability-operator-6dd7dd855f-rv4tl\" (UID: \"5f1f80bc-9361-4642-84e6-99e781ac2fad\") " pod="openshift-operators/observability-operator-6dd7dd855f-rv4tl" Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.539947 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-sqzkm"] Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.602506 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f1f80bc-9361-4642-84e6-99e781ac2fad-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-rv4tl\" (UID: \"5f1f80bc-9361-4642-84e6-99e781ac2fad\") " pod="openshift-operators/observability-operator-6dd7dd855f-rv4tl" Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.602556 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nsd2\" (UniqueName: \"kubernetes.io/projected/5f1f80bc-9361-4642-84e6-99e781ac2fad-kube-api-access-8nsd2\") pod \"observability-operator-6dd7dd855f-rv4tl\" (UID: \"5f1f80bc-9361-4642-84e6-99e781ac2fad\") " pod="openshift-operators/observability-operator-6dd7dd855f-rv4tl" Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.608162 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f1f80bc-9361-4642-84e6-99e781ac2fad-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-rv4tl\" (UID: \"5f1f80bc-9361-4642-84e6-99e781ac2fad\") " pod="openshift-operators/observability-operator-6dd7dd855f-rv4tl" Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.635713 4779 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8nsd2\" (UniqueName: \"kubernetes.io/projected/5f1f80bc-9361-4642-84e6-99e781ac2fad-kube-api-access-8nsd2\") pod \"observability-operator-6dd7dd855f-rv4tl\" (UID: \"5f1f80bc-9361-4642-84e6-99e781ac2fad\") " pod="openshift-operators/observability-operator-6dd7dd855f-rv4tl" Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.706786 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-546b9f4886-f8l8t"] Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.723558 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-rv4tl" Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.791174 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-546b9f4886-nxpvj"] Mar 20 15:37:15 crc kubenswrapper[4779]: W0320 15:37:15.804430 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30bff49f_d894_4d30_84f6_b845f0f3dbe2.slice/crio-49ba324508653fb075503d508a226921d73b307158798aac8fde8bff881b4f95 WatchSource:0}: Error finding container 49ba324508653fb075503d508a226921d73b307158798aac8fde8bff881b4f95: Status 404 returned error can't find the container with id 49ba324508653fb075503d508a226921d73b307158798aac8fde8bff881b4f95 Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.853682 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-b59bd5fff-kt6dg"] Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.854409 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-b59bd5fff-kt6dg" Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.856467 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-service-cert" Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.858724 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-44prz" Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.873407 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-b59bd5fff-kt6dg"] Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.908893 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8cf9bae4-474b-4dcd-9716-840649b33f8b-apiservice-cert\") pod \"perses-operator-b59bd5fff-kt6dg\" (UID: \"8cf9bae4-474b-4dcd-9716-840649b33f8b\") " pod="openshift-operators/perses-operator-b59bd5fff-kt6dg" Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.908959 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8cf9bae4-474b-4dcd-9716-840649b33f8b-webhook-cert\") pod \"perses-operator-b59bd5fff-kt6dg\" (UID: \"8cf9bae4-474b-4dcd-9716-840649b33f8b\") " pod="openshift-operators/perses-operator-b59bd5fff-kt6dg" Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.909226 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/8cf9bae4-474b-4dcd-9716-840649b33f8b-openshift-service-ca\") pod \"perses-operator-b59bd5fff-kt6dg\" (UID: \"8cf9bae4-474b-4dcd-9716-840649b33f8b\") " pod="openshift-operators/perses-operator-b59bd5fff-kt6dg" Mar 20 15:37:15 crc kubenswrapper[4779]: I0320 15:37:15.909317 4779 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82gmz\" (UniqueName: \"kubernetes.io/projected/8cf9bae4-474b-4dcd-9716-840649b33f8b-kube-api-access-82gmz\") pod \"perses-operator-b59bd5fff-kt6dg\" (UID: \"8cf9bae4-474b-4dcd-9716-840649b33f8b\") " pod="openshift-operators/perses-operator-b59bd5fff-kt6dg" Mar 20 15:37:16 crc kubenswrapper[4779]: I0320 15:37:16.010350 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/8cf9bae4-474b-4dcd-9716-840649b33f8b-openshift-service-ca\") pod \"perses-operator-b59bd5fff-kt6dg\" (UID: \"8cf9bae4-474b-4dcd-9716-840649b33f8b\") " pod="openshift-operators/perses-operator-b59bd5fff-kt6dg" Mar 20 15:37:16 crc kubenswrapper[4779]: I0320 15:37:16.010752 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82gmz\" (UniqueName: \"kubernetes.io/projected/8cf9bae4-474b-4dcd-9716-840649b33f8b-kube-api-access-82gmz\") pod \"perses-operator-b59bd5fff-kt6dg\" (UID: \"8cf9bae4-474b-4dcd-9716-840649b33f8b\") " pod="openshift-operators/perses-operator-b59bd5fff-kt6dg" Mar 20 15:37:16 crc kubenswrapper[4779]: I0320 15:37:16.010802 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8cf9bae4-474b-4dcd-9716-840649b33f8b-apiservice-cert\") pod \"perses-operator-b59bd5fff-kt6dg\" (UID: \"8cf9bae4-474b-4dcd-9716-840649b33f8b\") " pod="openshift-operators/perses-operator-b59bd5fff-kt6dg" Mar 20 15:37:16 crc kubenswrapper[4779]: I0320 15:37:16.010837 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8cf9bae4-474b-4dcd-9716-840649b33f8b-webhook-cert\") pod \"perses-operator-b59bd5fff-kt6dg\" (UID: \"8cf9bae4-474b-4dcd-9716-840649b33f8b\") " pod="openshift-operators/perses-operator-b59bd5fff-kt6dg" Mar 20 15:37:16 
crc kubenswrapper[4779]: I0320 15:37:16.011792 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/8cf9bae4-474b-4dcd-9716-840649b33f8b-openshift-service-ca\") pod \"perses-operator-b59bd5fff-kt6dg\" (UID: \"8cf9bae4-474b-4dcd-9716-840649b33f8b\") " pod="openshift-operators/perses-operator-b59bd5fff-kt6dg" Mar 20 15:37:16 crc kubenswrapper[4779]: I0320 15:37:16.015180 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8cf9bae4-474b-4dcd-9716-840649b33f8b-webhook-cert\") pod \"perses-operator-b59bd5fff-kt6dg\" (UID: \"8cf9bae4-474b-4dcd-9716-840649b33f8b\") " pod="openshift-operators/perses-operator-b59bd5fff-kt6dg" Mar 20 15:37:16 crc kubenswrapper[4779]: I0320 15:37:16.029588 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8cf9bae4-474b-4dcd-9716-840649b33f8b-apiservice-cert\") pod \"perses-operator-b59bd5fff-kt6dg\" (UID: \"8cf9bae4-474b-4dcd-9716-840649b33f8b\") " pod="openshift-operators/perses-operator-b59bd5fff-kt6dg" Mar 20 15:37:16 crc kubenswrapper[4779]: I0320 15:37:16.030901 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82gmz\" (UniqueName: \"kubernetes.io/projected/8cf9bae4-474b-4dcd-9716-840649b33f8b-kube-api-access-82gmz\") pod \"perses-operator-b59bd5fff-kt6dg\" (UID: \"8cf9bae4-474b-4dcd-9716-840649b33f8b\") " pod="openshift-operators/perses-operator-b59bd5fff-kt6dg" Mar 20 15:37:16 crc kubenswrapper[4779]: I0320 15:37:16.170366 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-b59bd5fff-kt6dg" Mar 20 15:37:16 crc kubenswrapper[4779]: I0320 15:37:16.258633 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-sqzkm" event={"ID":"569c99c8-036a-41d3-9068-e893c1e067fb","Type":"ContainerStarted","Data":"612f37f3ba56e1997b6aa0734e9bcdaba6101c5d83f06ac39fb21ccb5b74e405"} Mar 20 15:37:16 crc kubenswrapper[4779]: I0320 15:37:16.263055 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-rv4tl"] Mar 20 15:37:16 crc kubenswrapper[4779]: I0320 15:37:16.264504 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-546b9f4886-nxpvj" event={"ID":"30bff49f-d894-4d30-84f6-b845f0f3dbe2","Type":"ContainerStarted","Data":"49ba324508653fb075503d508a226921d73b307158798aac8fde8bff881b4f95"} Mar 20 15:37:16 crc kubenswrapper[4779]: I0320 15:37:16.272798 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-546b9f4886-f8l8t" event={"ID":"aa1321a8-47aa-4e23-88a9-9fbc2eabe628","Type":"ContainerStarted","Data":"52f564c338416c728075c95f21bdba59ea53af046b14b4b863d66cf2b2fd2bfc"} Mar 20 15:37:16 crc kubenswrapper[4779]: I0320 15:37:16.530608 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-b59bd5fff-kt6dg"] Mar 20 15:37:16 crc kubenswrapper[4779]: W0320 15:37:16.539243 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cf9bae4_474b_4dcd_9716_840649b33f8b.slice/crio-3ea128895353a18c9fd3f64f5dfc2d275cc9b15c50d7c313368497006960c6df WatchSource:0}: Error finding container 3ea128895353a18c9fd3f64f5dfc2d275cc9b15c50d7c313368497006960c6df: Status 404 returned error can't find the container with id 
3ea128895353a18c9fd3f64f5dfc2d275cc9b15c50d7c313368497006960c6df Mar 20 15:37:17 crc kubenswrapper[4779]: I0320 15:37:17.279984 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-rv4tl" event={"ID":"5f1f80bc-9361-4642-84e6-99e781ac2fad","Type":"ContainerStarted","Data":"dd54946db433cf35646bfd58c8bc5fefefc1321d2ac107adf2e099f38c18e6cb"} Mar 20 15:37:17 crc kubenswrapper[4779]: I0320 15:37:17.281067 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-b59bd5fff-kt6dg" event={"ID":"8cf9bae4-474b-4dcd-9716-840649b33f8b","Type":"ContainerStarted","Data":"3ea128895353a18c9fd3f64f5dfc2d275cc9b15c50d7c313368497006960c6df"} Mar 20 15:37:18 crc kubenswrapper[4779]: I0320 15:37:18.903552 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-scl8h" Mar 20 15:37:18 crc kubenswrapper[4779]: I0320 15:37:18.946376 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-scl8h" Mar 20 15:37:21 crc kubenswrapper[4779]: I0320 15:37:21.501654 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-scl8h"] Mar 20 15:37:21 crc kubenswrapper[4779]: I0320 15:37:21.501984 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-scl8h" podUID="209da1d2-e7be-409b-9c46-62dbd5065706" containerName="registry-server" containerID="cri-o://48861b4b053481525d017339f8712221ec42c6470a6e0c26c9f30fd29a8a4bdd" gracePeriod=2 Mar 20 15:37:22 crc kubenswrapper[4779]: E0320 15:37:22.057558 4779 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod209da1d2_e7be_409b_9c46_62dbd5065706.slice/crio-conmon-48861b4b053481525d017339f8712221ec42c6470a6e0c26c9f30fd29a8a4bdd.scope\": RecentStats: unable to find data in memory cache]" Mar 20 15:37:22 crc kubenswrapper[4779]: I0320 15:37:22.312481 4779 generic.go:334] "Generic (PLEG): container finished" podID="209da1d2-e7be-409b-9c46-62dbd5065706" containerID="48861b4b053481525d017339f8712221ec42c6470a6e0c26c9f30fd29a8a4bdd" exitCode=0 Mar 20 15:37:22 crc kubenswrapper[4779]: I0320 15:37:22.312529 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-scl8h" event={"ID":"209da1d2-e7be-409b-9c46-62dbd5065706","Type":"ContainerDied","Data":"48861b4b053481525d017339f8712221ec42c6470a6e0c26c9f30fd29a8a4bdd"} Mar 20 15:37:24 crc kubenswrapper[4779]: I0320 15:37:24.691199 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-scl8h" Mar 20 15:37:24 crc kubenswrapper[4779]: I0320 15:37:24.737884 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/209da1d2-e7be-409b-9c46-62dbd5065706-utilities\") pod \"209da1d2-e7be-409b-9c46-62dbd5065706\" (UID: \"209da1d2-e7be-409b-9c46-62dbd5065706\") " Mar 20 15:37:24 crc kubenswrapper[4779]: I0320 15:37:24.737954 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2nqg\" (UniqueName: \"kubernetes.io/projected/209da1d2-e7be-409b-9c46-62dbd5065706-kube-api-access-m2nqg\") pod \"209da1d2-e7be-409b-9c46-62dbd5065706\" (UID: \"209da1d2-e7be-409b-9c46-62dbd5065706\") " Mar 20 15:37:24 crc kubenswrapper[4779]: I0320 15:37:24.737988 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/209da1d2-e7be-409b-9c46-62dbd5065706-catalog-content\") pod 
\"209da1d2-e7be-409b-9c46-62dbd5065706\" (UID: \"209da1d2-e7be-409b-9c46-62dbd5065706\") " Mar 20 15:37:24 crc kubenswrapper[4779]: I0320 15:37:24.738748 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/209da1d2-e7be-409b-9c46-62dbd5065706-utilities" (OuterVolumeSpecName: "utilities") pod "209da1d2-e7be-409b-9c46-62dbd5065706" (UID: "209da1d2-e7be-409b-9c46-62dbd5065706"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:37:24 crc kubenswrapper[4779]: I0320 15:37:24.747632 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/209da1d2-e7be-409b-9c46-62dbd5065706-kube-api-access-m2nqg" (OuterVolumeSpecName: "kube-api-access-m2nqg") pod "209da1d2-e7be-409b-9c46-62dbd5065706" (UID: "209da1d2-e7be-409b-9c46-62dbd5065706"). InnerVolumeSpecName "kube-api-access-m2nqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:37:24 crc kubenswrapper[4779]: I0320 15:37:24.839883 4779 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/209da1d2-e7be-409b-9c46-62dbd5065706-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:37:24 crc kubenswrapper[4779]: I0320 15:37:24.840237 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2nqg\" (UniqueName: \"kubernetes.io/projected/209da1d2-e7be-409b-9c46-62dbd5065706-kube-api-access-m2nqg\") on node \"crc\" DevicePath \"\"" Mar 20 15:37:24 crc kubenswrapper[4779]: I0320 15:37:24.879806 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/209da1d2-e7be-409b-9c46-62dbd5065706-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "209da1d2-e7be-409b-9c46-62dbd5065706" (UID: "209da1d2-e7be-409b-9c46-62dbd5065706"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:37:24 crc kubenswrapper[4779]: I0320 15:37:24.942297 4779 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/209da1d2-e7be-409b-9c46-62dbd5065706-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 15:37:25 crc kubenswrapper[4779]: I0320 15:37:25.149539 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:37:25 crc kubenswrapper[4779]: I0320 15:37:25.149589 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:37:25 crc kubenswrapper[4779]: I0320 15:37:25.149628 4779 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" Mar 20 15:37:25 crc kubenswrapper[4779]: I0320 15:37:25.150066 4779 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6987ea5d5bc8dacf20b1f5a9f2e4c3b448070d7ac94c29c0b4e57a5ce5ecc6b5"} pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 15:37:25 crc kubenswrapper[4779]: I0320 15:37:25.150138 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" 
containerName="machine-config-daemon" containerID="cri-o://6987ea5d5bc8dacf20b1f5a9f2e4c3b448070d7ac94c29c0b4e57a5ce5ecc6b5" gracePeriod=600 Mar 20 15:37:25 crc kubenswrapper[4779]: I0320 15:37:25.331630 4779 generic.go:334] "Generic (PLEG): container finished" podID="451fc579-db57-4b36-a775-6d2986de3efc" containerID="6987ea5d5bc8dacf20b1f5a9f2e4c3b448070d7ac94c29c0b4e57a5ce5ecc6b5" exitCode=0 Mar 20 15:37:25 crc kubenswrapper[4779]: I0320 15:37:25.331671 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" event={"ID":"451fc579-db57-4b36-a775-6d2986de3efc","Type":"ContainerDied","Data":"6987ea5d5bc8dacf20b1f5a9f2e4c3b448070d7ac94c29c0b4e57a5ce5ecc6b5"} Mar 20 15:37:25 crc kubenswrapper[4779]: I0320 15:37:25.331710 4779 scope.go:117] "RemoveContainer" containerID="4ac21e339673c7860531784a8ee71a61f14bdd346b0fb848a3ba1fb383a92288" Mar 20 15:37:25 crc kubenswrapper[4779]: I0320 15:37:25.333868 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-scl8h" event={"ID":"209da1d2-e7be-409b-9c46-62dbd5065706","Type":"ContainerDied","Data":"cf86f66d6dafbf11474ef6d27111f32c753fcd77a4c930ceabece73782ff1cf2"} Mar 20 15:37:25 crc kubenswrapper[4779]: I0320 15:37:25.333925 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-scl8h" Mar 20 15:37:25 crc kubenswrapper[4779]: I0320 15:37:25.367639 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-scl8h"] Mar 20 15:37:25 crc kubenswrapper[4779]: I0320 15:37:25.371592 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-scl8h"] Mar 20 15:37:25 crc kubenswrapper[4779]: I0320 15:37:25.815197 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="209da1d2-e7be-409b-9c46-62dbd5065706" path="/var/lib/kubelet/pods/209da1d2-e7be-409b-9c46-62dbd5065706/volumes" Mar 20 15:37:26 crc kubenswrapper[4779]: I0320 15:37:26.742879 4779 scope.go:117] "RemoveContainer" containerID="48861b4b053481525d017339f8712221ec42c6470a6e0c26c9f30fd29a8a4bdd" Mar 20 15:37:26 crc kubenswrapper[4779]: I0320 15:37:26.773292 4779 scope.go:117] "RemoveContainer" containerID="56aa48e42159aa4090d437e74f5d3d26ef6d47bb4f899ec1e9f80b1b4eba7a17" Mar 20 15:37:26 crc kubenswrapper[4779]: I0320 15:37:26.843314 4779 scope.go:117] "RemoveContainer" containerID="8ad6dd353fbcb731cad117ec7d7ef30d9ed3d79021595e7168943a7b74e2c255" Mar 20 15:37:27 crc kubenswrapper[4779]: I0320 15:37:27.344928 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-546b9f4886-nxpvj" event={"ID":"30bff49f-d894-4d30-84f6-b845f0f3dbe2","Type":"ContainerStarted","Data":"bdbbbd0a24f7859e9728c7062c109739e86f8283d07a966fdf9ddef130f78fe7"} Mar 20 15:37:27 crc kubenswrapper[4779]: I0320 15:37:27.347544 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-546b9f4886-f8l8t" event={"ID":"aa1321a8-47aa-4e23-88a9-9fbc2eabe628","Type":"ContainerStarted","Data":"61fa7f8ed7f54b0daa6799bbe758b0b6ad3fbb69cd578c730cd2f759f5ddf2ad"} Mar 20 15:37:27 crc kubenswrapper[4779]: I0320 15:37:27.349064 4779 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-rv4tl" event={"ID":"5f1f80bc-9361-4642-84e6-99e781ac2fad","Type":"ContainerStarted","Data":"db391d2cb65585e79185eec8aa55c547d303bee595fdd86b251ecba650e006e8"} Mar 20 15:37:27 crc kubenswrapper[4779]: I0320 15:37:27.349347 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-6dd7dd855f-rv4tl" Mar 20 15:37:27 crc kubenswrapper[4779]: I0320 15:37:27.351795 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-b59bd5fff-kt6dg" event={"ID":"8cf9bae4-474b-4dcd-9716-840649b33f8b","Type":"ContainerStarted","Data":"1f9e3490ce2a8d3a22bcd7824b873cf46e00affd39a539f4a32a4df19782c6ec"} Mar 20 15:37:27 crc kubenswrapper[4779]: I0320 15:37:27.351893 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-b59bd5fff-kt6dg" Mar 20 15:37:27 crc kubenswrapper[4779]: I0320 15:37:27.353820 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" event={"ID":"451fc579-db57-4b36-a775-6d2986de3efc","Type":"ContainerStarted","Data":"d329625e139412c780d56ee2166e7046f52ce6cce2fdaf917fadbaf979acbe6a"} Mar 20 15:37:27 crc kubenswrapper[4779]: I0320 15:37:27.355394 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-sqzkm" event={"ID":"569c99c8-036a-41d3-9068-e893c1e067fb","Type":"ContainerStarted","Data":"2637df7415a9595e51c9f90b97c53ee9bb6aa5994383a0bd5f50da1b491e9609"} Mar 20 15:37:27 crc kubenswrapper[4779]: I0320 15:37:27.355933 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-6dd7dd855f-rv4tl" Mar 20 15:37:27 crc kubenswrapper[4779]: I0320 15:37:27.376471 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-546b9f4886-nxpvj" podStartSLOduration=1.45402562 podStartE2EDuration="12.37645645s" podCreationTimestamp="2026-03-20 15:37:15 +0000 UTC" firstStartedPulling="2026-03-20 15:37:15.827732477 +0000 UTC m=+852.790248277" lastFinishedPulling="2026-03-20 15:37:26.750163297 +0000 UTC m=+863.712679107" observedRunningTime="2026-03-20 15:37:27.372165541 +0000 UTC m=+864.334681341" watchObservedRunningTime="2026-03-20 15:37:27.37645645 +0000 UTC m=+864.338972250" Mar 20 15:37:27 crc kubenswrapper[4779]: I0320 15:37:27.414359 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-6dd7dd855f-rv4tl" podStartSLOduration=1.92038606 podStartE2EDuration="12.414340285s" podCreationTimestamp="2026-03-20 15:37:15 +0000 UTC" firstStartedPulling="2026-03-20 15:37:16.300654924 +0000 UTC m=+853.263170724" lastFinishedPulling="2026-03-20 15:37:26.794609139 +0000 UTC m=+863.757124949" observedRunningTime="2026-03-20 15:37:27.413098774 +0000 UTC m=+864.375614564" watchObservedRunningTime="2026-03-20 15:37:27.414340285 +0000 UTC m=+864.376856085" Mar 20 15:37:27 crc kubenswrapper[4779]: I0320 15:37:27.487237 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-8ff7d675-sqzkm" podStartSLOduration=2.302548191 podStartE2EDuration="13.487222502s" podCreationTimestamp="2026-03-20 15:37:14 +0000 UTC" firstStartedPulling="2026-03-20 15:37:15.572077914 +0000 UTC m=+852.534593714" lastFinishedPulling="2026-03-20 15:37:26.756752225 +0000 UTC m=+863.719268025" observedRunningTime="2026-03-20 15:37:27.457087275 +0000 UTC m=+864.419603075" watchObservedRunningTime="2026-03-20 15:37:27.487222502 +0000 UTC m=+864.449738302" Mar 20 15:37:27 crc kubenswrapper[4779]: I0320 15:37:27.506740 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-546b9f4886-f8l8t" podStartSLOduration=1.511549486 podStartE2EDuration="12.506726609s" podCreationTimestamp="2026-03-20 15:37:15 +0000 UTC" firstStartedPulling="2026-03-20 15:37:15.76150941 +0000 UTC m=+852.724025210" lastFinishedPulling="2026-03-20 15:37:26.756686543 +0000 UTC m=+863.719202333" observedRunningTime="2026-03-20 15:37:27.504538114 +0000 UTC m=+864.467053914" watchObservedRunningTime="2026-03-20 15:37:27.506726609 +0000 UTC m=+864.469242409" Mar 20 15:37:27 crc kubenswrapper[4779]: I0320 15:37:27.526916 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-b59bd5fff-kt6dg" podStartSLOduration=2.318733648 podStartE2EDuration="12.526901473s" podCreationTimestamp="2026-03-20 15:37:15 +0000 UTC" firstStartedPulling="2026-03-20 15:37:16.541976831 +0000 UTC m=+853.504492631" lastFinishedPulling="2026-03-20 15:37:26.750144656 +0000 UTC m=+863.712660456" observedRunningTime="2026-03-20 15:37:27.525423086 +0000 UTC m=+864.487938886" watchObservedRunningTime="2026-03-20 15:37:27.526901473 +0000 UTC m=+864.489417273" Mar 20 15:37:36 crc kubenswrapper[4779]: I0320 15:37:36.173154 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-b59bd5fff-kt6dg" Mar 20 15:37:53 crc kubenswrapper[4779]: I0320 15:37:53.170269 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n"] Mar 20 15:37:53 crc kubenswrapper[4779]: E0320 15:37:53.171531 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="209da1d2-e7be-409b-9c46-62dbd5065706" containerName="registry-server" Mar 20 15:37:53 crc kubenswrapper[4779]: I0320 15:37:53.171554 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="209da1d2-e7be-409b-9c46-62dbd5065706" containerName="registry-server" Mar 20 15:37:53 crc 
kubenswrapper[4779]: E0320 15:37:53.171568 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="209da1d2-e7be-409b-9c46-62dbd5065706" containerName="extract-utilities" Mar 20 15:37:53 crc kubenswrapper[4779]: I0320 15:37:53.171578 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="209da1d2-e7be-409b-9c46-62dbd5065706" containerName="extract-utilities" Mar 20 15:37:53 crc kubenswrapper[4779]: E0320 15:37:53.171601 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="209da1d2-e7be-409b-9c46-62dbd5065706" containerName="extract-content" Mar 20 15:37:53 crc kubenswrapper[4779]: I0320 15:37:53.171612 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="209da1d2-e7be-409b-9c46-62dbd5065706" containerName="extract-content" Mar 20 15:37:53 crc kubenswrapper[4779]: I0320 15:37:53.171817 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="209da1d2-e7be-409b-9c46-62dbd5065706" containerName="registry-server" Mar 20 15:37:53 crc kubenswrapper[4779]: I0320 15:37:53.173276 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n" Mar 20 15:37:53 crc kubenswrapper[4779]: I0320 15:37:53.180680 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n"] Mar 20 15:37:53 crc kubenswrapper[4779]: I0320 15:37:53.181209 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 15:37:53 crc kubenswrapper[4779]: I0320 15:37:53.286375 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1bffdf90-d3df-4cfb-8484-a70e0c418f5c-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n\" (UID: \"1bffdf90-d3df-4cfb-8484-a70e0c418f5c\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n" Mar 20 15:37:53 crc kubenswrapper[4779]: I0320 15:37:53.286465 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqtg5\" (UniqueName: \"kubernetes.io/projected/1bffdf90-d3df-4cfb-8484-a70e0c418f5c-kube-api-access-xqtg5\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n\" (UID: \"1bffdf90-d3df-4cfb-8484-a70e0c418f5c\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n" Mar 20 15:37:53 crc kubenswrapper[4779]: I0320 15:37:53.286502 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1bffdf90-d3df-4cfb-8484-a70e0c418f5c-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n\" (UID: \"1bffdf90-d3df-4cfb-8484-a70e0c418f5c\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n" Mar 20 15:37:53 crc kubenswrapper[4779]: 
I0320 15:37:53.387931 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1bffdf90-d3df-4cfb-8484-a70e0c418f5c-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n\" (UID: \"1bffdf90-d3df-4cfb-8484-a70e0c418f5c\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n" Mar 20 15:37:53 crc kubenswrapper[4779]: I0320 15:37:53.387990 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1bffdf90-d3df-4cfb-8484-a70e0c418f5c-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n\" (UID: \"1bffdf90-d3df-4cfb-8484-a70e0c418f5c\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n" Mar 20 15:37:53 crc kubenswrapper[4779]: I0320 15:37:53.388051 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqtg5\" (UniqueName: \"kubernetes.io/projected/1bffdf90-d3df-4cfb-8484-a70e0c418f5c-kube-api-access-xqtg5\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n\" (UID: \"1bffdf90-d3df-4cfb-8484-a70e0c418f5c\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n" Mar 20 15:37:53 crc kubenswrapper[4779]: I0320 15:37:53.388570 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1bffdf90-d3df-4cfb-8484-a70e0c418f5c-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n\" (UID: \"1bffdf90-d3df-4cfb-8484-a70e0c418f5c\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n" Mar 20 15:37:53 crc kubenswrapper[4779]: I0320 15:37:53.388624 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/1bffdf90-d3df-4cfb-8484-a70e0c418f5c-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n\" (UID: \"1bffdf90-d3df-4cfb-8484-a70e0c418f5c\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n" Mar 20 15:37:53 crc kubenswrapper[4779]: I0320 15:37:53.411306 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqtg5\" (UniqueName: \"kubernetes.io/projected/1bffdf90-d3df-4cfb-8484-a70e0c418f5c-kube-api-access-xqtg5\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n\" (UID: \"1bffdf90-d3df-4cfb-8484-a70e0c418f5c\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n" Mar 20 15:37:53 crc kubenswrapper[4779]: I0320 15:37:53.492323 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n" Mar 20 15:37:53 crc kubenswrapper[4779]: I0320 15:37:53.699036 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n"] Mar 20 15:37:54 crc kubenswrapper[4779]: I0320 15:37:54.490570 4779 generic.go:334] "Generic (PLEG): container finished" podID="1bffdf90-d3df-4cfb-8484-a70e0c418f5c" containerID="92e9e06aeb61c75a89d14f208d582d8abdd77e62bf9c5f690c2f157c3353332c" exitCode=0 Mar 20 15:37:54 crc kubenswrapper[4779]: I0320 15:37:54.490619 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n" event={"ID":"1bffdf90-d3df-4cfb-8484-a70e0c418f5c","Type":"ContainerDied","Data":"92e9e06aeb61c75a89d14f208d582d8abdd77e62bf9c5f690c2f157c3353332c"} Mar 20 15:37:54 crc kubenswrapper[4779]: I0320 15:37:54.490892 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n" event={"ID":"1bffdf90-d3df-4cfb-8484-a70e0c418f5c","Type":"ContainerStarted","Data":"5553d0c016cf93de24071708da3444bd0483ad75ce6e826a851e2a09e8009056"} Mar 20 15:37:56 crc kubenswrapper[4779]: I0320 15:37:56.501813 4779 generic.go:334] "Generic (PLEG): container finished" podID="1bffdf90-d3df-4cfb-8484-a70e0c418f5c" containerID="b576eb68c9f4be7d1934e8585c1c4e5f97f64d61ddbeed4395fc1b599246c503" exitCode=0 Mar 20 15:37:56 crc kubenswrapper[4779]: I0320 15:37:56.501896 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n" event={"ID":"1bffdf90-d3df-4cfb-8484-a70e0c418f5c","Type":"ContainerDied","Data":"b576eb68c9f4be7d1934e8585c1c4e5f97f64d61ddbeed4395fc1b599246c503"} Mar 20 15:37:59 crc kubenswrapper[4779]: I0320 15:37:59.527470 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n" event={"ID":"1bffdf90-d3df-4cfb-8484-a70e0c418f5c","Type":"ContainerStarted","Data":"30ab815841cafa1997771f649e355f9e2ea32159454febaf2b4801a90a3adc2a"} Mar 20 15:37:59 crc kubenswrapper[4779]: I0320 15:37:59.545470 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n" podStartSLOduration=4.892022991 podStartE2EDuration="6.545451718s" podCreationTimestamp="2026-03-20 15:37:53 +0000 UTC" firstStartedPulling="2026-03-20 15:37:54.49195424 +0000 UTC m=+891.454470040" lastFinishedPulling="2026-03-20 15:37:56.145382967 +0000 UTC m=+893.107898767" observedRunningTime="2026-03-20 15:37:59.541060946 +0000 UTC m=+896.503576766" watchObservedRunningTime="2026-03-20 15:37:59.545451718 +0000 UTC m=+896.507967518" Mar 20 15:38:00 crc kubenswrapper[4779]: I0320 15:38:00.131269 4779 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-infra/auto-csr-approver-29567018-k964v"] Mar 20 15:38:00 crc kubenswrapper[4779]: I0320 15:38:00.131892 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567018-k964v" Mar 20 15:38:00 crc kubenswrapper[4779]: I0320 15:38:00.133867 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-92t9d" Mar 20 15:38:00 crc kubenswrapper[4779]: I0320 15:38:00.134554 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:38:00 crc kubenswrapper[4779]: I0320 15:38:00.134846 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:38:00 crc kubenswrapper[4779]: I0320 15:38:00.141428 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567018-k964v"] Mar 20 15:38:00 crc kubenswrapper[4779]: I0320 15:38:00.185681 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k25fn\" (UniqueName: \"kubernetes.io/projected/a49b764a-39a5-4961-aa89-4f99c8b155a4-kube-api-access-k25fn\") pod \"auto-csr-approver-29567018-k964v\" (UID: \"a49b764a-39a5-4961-aa89-4f99c8b155a4\") " pod="openshift-infra/auto-csr-approver-29567018-k964v" Mar 20 15:38:00 crc kubenswrapper[4779]: I0320 15:38:00.286635 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k25fn\" (UniqueName: \"kubernetes.io/projected/a49b764a-39a5-4961-aa89-4f99c8b155a4-kube-api-access-k25fn\") pod \"auto-csr-approver-29567018-k964v\" (UID: \"a49b764a-39a5-4961-aa89-4f99c8b155a4\") " pod="openshift-infra/auto-csr-approver-29567018-k964v" Mar 20 15:38:00 crc kubenswrapper[4779]: I0320 15:38:00.303345 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k25fn\" (UniqueName: 
\"kubernetes.io/projected/a49b764a-39a5-4961-aa89-4f99c8b155a4-kube-api-access-k25fn\") pod \"auto-csr-approver-29567018-k964v\" (UID: \"a49b764a-39a5-4961-aa89-4f99c8b155a4\") " pod="openshift-infra/auto-csr-approver-29567018-k964v" Mar 20 15:38:00 crc kubenswrapper[4779]: I0320 15:38:00.487592 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567018-k964v" Mar 20 15:38:00 crc kubenswrapper[4779]: I0320 15:38:00.535564 4779 generic.go:334] "Generic (PLEG): container finished" podID="1bffdf90-d3df-4cfb-8484-a70e0c418f5c" containerID="30ab815841cafa1997771f649e355f9e2ea32159454febaf2b4801a90a3adc2a" exitCode=0 Mar 20 15:38:00 crc kubenswrapper[4779]: I0320 15:38:00.536647 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n" event={"ID":"1bffdf90-d3df-4cfb-8484-a70e0c418f5c","Type":"ContainerDied","Data":"30ab815841cafa1997771f649e355f9e2ea32159454febaf2b4801a90a3adc2a"} Mar 20 15:38:00 crc kubenswrapper[4779]: I0320 15:38:00.866547 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567018-k964v"] Mar 20 15:38:01 crc kubenswrapper[4779]: W0320 15:38:01.041627 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda49b764a_39a5_4961_aa89_4f99c8b155a4.slice/crio-960477a05985d7d74a5ae6d2ffc10a681ee3ac474ea8415e82b8a3eafccadf64 WatchSource:0}: Error finding container 960477a05985d7d74a5ae6d2ffc10a681ee3ac474ea8415e82b8a3eafccadf64: Status 404 returned error can't find the container with id 960477a05985d7d74a5ae6d2ffc10a681ee3ac474ea8415e82b8a3eafccadf64 Mar 20 15:38:01 crc kubenswrapper[4779]: I0320 15:38:01.542991 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567018-k964v" 
event={"ID":"a49b764a-39a5-4961-aa89-4f99c8b155a4","Type":"ContainerStarted","Data":"960477a05985d7d74a5ae6d2ffc10a681ee3ac474ea8415e82b8a3eafccadf64"} Mar 20 15:38:01 crc kubenswrapper[4779]: I0320 15:38:01.743935 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n" Mar 20 15:38:01 crc kubenswrapper[4779]: I0320 15:38:01.809126 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1bffdf90-d3df-4cfb-8484-a70e0c418f5c-util\") pod \"1bffdf90-d3df-4cfb-8484-a70e0c418f5c\" (UID: \"1bffdf90-d3df-4cfb-8484-a70e0c418f5c\") " Mar 20 15:38:01 crc kubenswrapper[4779]: I0320 15:38:01.809534 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1bffdf90-d3df-4cfb-8484-a70e0c418f5c-bundle\") pod \"1bffdf90-d3df-4cfb-8484-a70e0c418f5c\" (UID: \"1bffdf90-d3df-4cfb-8484-a70e0c418f5c\") " Mar 20 15:38:01 crc kubenswrapper[4779]: I0320 15:38:01.809567 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqtg5\" (UniqueName: \"kubernetes.io/projected/1bffdf90-d3df-4cfb-8484-a70e0c418f5c-kube-api-access-xqtg5\") pod \"1bffdf90-d3df-4cfb-8484-a70e0c418f5c\" (UID: \"1bffdf90-d3df-4cfb-8484-a70e0c418f5c\") " Mar 20 15:38:01 crc kubenswrapper[4779]: I0320 15:38:01.810434 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bffdf90-d3df-4cfb-8484-a70e0c418f5c-bundle" (OuterVolumeSpecName: "bundle") pod "1bffdf90-d3df-4cfb-8484-a70e0c418f5c" (UID: "1bffdf90-d3df-4cfb-8484-a70e0c418f5c"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:38:01 crc kubenswrapper[4779]: I0320 15:38:01.814910 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bffdf90-d3df-4cfb-8484-a70e0c418f5c-kube-api-access-xqtg5" (OuterVolumeSpecName: "kube-api-access-xqtg5") pod "1bffdf90-d3df-4cfb-8484-a70e0c418f5c" (UID: "1bffdf90-d3df-4cfb-8484-a70e0c418f5c"). InnerVolumeSpecName "kube-api-access-xqtg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:38:01 crc kubenswrapper[4779]: I0320 15:38:01.819873 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bffdf90-d3df-4cfb-8484-a70e0c418f5c-util" (OuterVolumeSpecName: "util") pod "1bffdf90-d3df-4cfb-8484-a70e0c418f5c" (UID: "1bffdf90-d3df-4cfb-8484-a70e0c418f5c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:38:01 crc kubenswrapper[4779]: I0320 15:38:01.911330 4779 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1bffdf90-d3df-4cfb-8484-a70e0c418f5c-util\") on node \"crc\" DevicePath \"\"" Mar 20 15:38:01 crc kubenswrapper[4779]: I0320 15:38:01.911362 4779 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1bffdf90-d3df-4cfb-8484-a70e0c418f5c-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:38:01 crc kubenswrapper[4779]: I0320 15:38:01.911371 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqtg5\" (UniqueName: \"kubernetes.io/projected/1bffdf90-d3df-4cfb-8484-a70e0c418f5c-kube-api-access-xqtg5\") on node \"crc\" DevicePath \"\"" Mar 20 15:38:02 crc kubenswrapper[4779]: I0320 15:38:02.549971 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n" 
event={"ID":"1bffdf90-d3df-4cfb-8484-a70e0c418f5c","Type":"ContainerDied","Data":"5553d0c016cf93de24071708da3444bd0483ad75ce6e826a851e2a09e8009056"} Mar 20 15:38:02 crc kubenswrapper[4779]: I0320 15:38:02.550010 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5553d0c016cf93de24071708da3444bd0483ad75ce6e826a851e2a09e8009056" Mar 20 15:38:02 crc kubenswrapper[4779]: I0320 15:38:02.550065 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n" Mar 20 15:38:02 crc kubenswrapper[4779]: I0320 15:38:02.551902 4779 generic.go:334] "Generic (PLEG): container finished" podID="a49b764a-39a5-4961-aa89-4f99c8b155a4" containerID="1ce0cc7d8c7f15a8b28a4cfc94a8459144a84fabf49f45997b3386ddd8e21045" exitCode=0 Mar 20 15:38:02 crc kubenswrapper[4779]: I0320 15:38:02.551926 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567018-k964v" event={"ID":"a49b764a-39a5-4961-aa89-4f99c8b155a4","Type":"ContainerDied","Data":"1ce0cc7d8c7f15a8b28a4cfc94a8459144a84fabf49f45997b3386ddd8e21045"} Mar 20 15:38:03 crc kubenswrapper[4779]: I0320 15:38:03.779637 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567018-k964v" Mar 20 15:38:03 crc kubenswrapper[4779]: I0320 15:38:03.843206 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k25fn\" (UniqueName: \"kubernetes.io/projected/a49b764a-39a5-4961-aa89-4f99c8b155a4-kube-api-access-k25fn\") pod \"a49b764a-39a5-4961-aa89-4f99c8b155a4\" (UID: \"a49b764a-39a5-4961-aa89-4f99c8b155a4\") " Mar 20 15:38:03 crc kubenswrapper[4779]: I0320 15:38:03.860404 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a49b764a-39a5-4961-aa89-4f99c8b155a4-kube-api-access-k25fn" (OuterVolumeSpecName: "kube-api-access-k25fn") pod "a49b764a-39a5-4961-aa89-4f99c8b155a4" (UID: "a49b764a-39a5-4961-aa89-4f99c8b155a4"). InnerVolumeSpecName "kube-api-access-k25fn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:38:03 crc kubenswrapper[4779]: I0320 15:38:03.945354 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k25fn\" (UniqueName: \"kubernetes.io/projected/a49b764a-39a5-4961-aa89-4f99c8b155a4-kube-api-access-k25fn\") on node \"crc\" DevicePath \"\"" Mar 20 15:38:04 crc kubenswrapper[4779]: I0320 15:38:04.308173 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-s5ctz"] Mar 20 15:38:04 crc kubenswrapper[4779]: E0320 15:38:04.308433 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a49b764a-39a5-4961-aa89-4f99c8b155a4" containerName="oc" Mar 20 15:38:04 crc kubenswrapper[4779]: I0320 15:38:04.308448 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="a49b764a-39a5-4961-aa89-4f99c8b155a4" containerName="oc" Mar 20 15:38:04 crc kubenswrapper[4779]: E0320 15:38:04.308468 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bffdf90-d3df-4cfb-8484-a70e0c418f5c" containerName="extract" Mar 20 15:38:04 crc kubenswrapper[4779]: I0320 15:38:04.308477 
4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bffdf90-d3df-4cfb-8484-a70e0c418f5c" containerName="extract" Mar 20 15:38:04 crc kubenswrapper[4779]: E0320 15:38:04.308500 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bffdf90-d3df-4cfb-8484-a70e0c418f5c" containerName="util" Mar 20 15:38:04 crc kubenswrapper[4779]: I0320 15:38:04.308509 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bffdf90-d3df-4cfb-8484-a70e0c418f5c" containerName="util" Mar 20 15:38:04 crc kubenswrapper[4779]: E0320 15:38:04.308519 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bffdf90-d3df-4cfb-8484-a70e0c418f5c" containerName="pull" Mar 20 15:38:04 crc kubenswrapper[4779]: I0320 15:38:04.308526 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bffdf90-d3df-4cfb-8484-a70e0c418f5c" containerName="pull" Mar 20 15:38:04 crc kubenswrapper[4779]: I0320 15:38:04.308638 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bffdf90-d3df-4cfb-8484-a70e0c418f5c" containerName="extract" Mar 20 15:38:04 crc kubenswrapper[4779]: I0320 15:38:04.308652 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="a49b764a-39a5-4961-aa89-4f99c8b155a4" containerName="oc" Mar 20 15:38:04 crc kubenswrapper[4779]: I0320 15:38:04.309101 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-s5ctz" Mar 20 15:38:04 crc kubenswrapper[4779]: I0320 15:38:04.311169 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-z4vtt" Mar 20 15:38:04 crc kubenswrapper[4779]: I0320 15:38:04.311185 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 20 15:38:04 crc kubenswrapper[4779]: I0320 15:38:04.311542 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 20 15:38:04 crc kubenswrapper[4779]: I0320 15:38:04.318472 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-s5ctz"] Mar 20 15:38:04 crc kubenswrapper[4779]: I0320 15:38:04.451220 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzwmn\" (UniqueName: \"kubernetes.io/projected/504ba887-92f4-4a4f-b44c-c416944075fe-kube-api-access-pzwmn\") pod \"nmstate-operator-796d4cfff4-s5ctz\" (UID: \"504ba887-92f4-4a4f-b44c-c416944075fe\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-s5ctz" Mar 20 15:38:04 crc kubenswrapper[4779]: I0320 15:38:04.552910 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzwmn\" (UniqueName: \"kubernetes.io/projected/504ba887-92f4-4a4f-b44c-c416944075fe-kube-api-access-pzwmn\") pod \"nmstate-operator-796d4cfff4-s5ctz\" (UID: \"504ba887-92f4-4a4f-b44c-c416944075fe\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-s5ctz" Mar 20 15:38:04 crc kubenswrapper[4779]: I0320 15:38:04.566753 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567018-k964v" event={"ID":"a49b764a-39a5-4961-aa89-4f99c8b155a4","Type":"ContainerDied","Data":"960477a05985d7d74a5ae6d2ffc10a681ee3ac474ea8415e82b8a3eafccadf64"} Mar 20 15:38:04 crc 
kubenswrapper[4779]: I0320 15:38:04.566794 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="960477a05985d7d74a5ae6d2ffc10a681ee3ac474ea8415e82b8a3eafccadf64" Mar 20 15:38:04 crc kubenswrapper[4779]: I0320 15:38:04.566805 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567018-k964v" Mar 20 15:38:04 crc kubenswrapper[4779]: I0320 15:38:04.568922 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzwmn\" (UniqueName: \"kubernetes.io/projected/504ba887-92f4-4a4f-b44c-c416944075fe-kube-api-access-pzwmn\") pod \"nmstate-operator-796d4cfff4-s5ctz\" (UID: \"504ba887-92f4-4a4f-b44c-c416944075fe\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-s5ctz" Mar 20 15:38:04 crc kubenswrapper[4779]: I0320 15:38:04.625702 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-s5ctz" Mar 20 15:38:04 crc kubenswrapper[4779]: I0320 15:38:04.840856 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567012-hh498"] Mar 20 15:38:04 crc kubenswrapper[4779]: I0320 15:38:04.843460 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567012-hh498"] Mar 20 15:38:04 crc kubenswrapper[4779]: I0320 15:38:04.869745 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-s5ctz"] Mar 20 15:38:05 crc kubenswrapper[4779]: I0320 15:38:05.573777 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-s5ctz" event={"ID":"504ba887-92f4-4a4f-b44c-c416944075fe","Type":"ContainerStarted","Data":"88420b6a4fa760260b08c15f376643b5ff4eb2ee058469691e04bb64008c52c6"} Mar 20 15:38:05 crc kubenswrapper[4779]: I0320 15:38:05.818127 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6fb42fe5-65a0-4287-a631-c15e961ce57c" path="/var/lib/kubelet/pods/6fb42fe5-65a0-4287-a631-c15e961ce57c/volumes" Mar 20 15:38:08 crc kubenswrapper[4779]: I0320 15:38:08.596242 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-s5ctz" event={"ID":"504ba887-92f4-4a4f-b44c-c416944075fe","Type":"ContainerStarted","Data":"ce016b79e8b60e4fe9bc739d197869dc03561b6b60d2c4e5dd7e4f19c5de4479"} Mar 20 15:38:08 crc kubenswrapper[4779]: I0320 15:38:08.613242 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-s5ctz" podStartSLOduration=1.8622954250000001 podStartE2EDuration="4.613221371s" podCreationTimestamp="2026-03-20 15:38:04 +0000 UTC" firstStartedPulling="2026-03-20 15:38:04.87583908 +0000 UTC m=+901.838354880" lastFinishedPulling="2026-03-20 15:38:07.626765026 +0000 UTC m=+904.589280826" observedRunningTime="2026-03-20 15:38:08.610190484 +0000 UTC m=+905.572706284" watchObservedRunningTime="2026-03-20 15:38:08.613221371 +0000 UTC m=+905.575737171" Mar 20 15:38:19 crc kubenswrapper[4779]: I0320 15:38:19.051926 4779 scope.go:117] "RemoveContainer" containerID="668be50bc3945e74c969f7c7f93df91c9773291d395ea440647525a2bbc791b2" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.430693 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-x8nlg"] Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.436363 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-x8nlg" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.439629 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-vkkm4" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.453563 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-nq7c2"] Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.454721 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-nq7c2" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.458433 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.485169 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-bg6f8"] Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.486028 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-bg6f8" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.488719 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-x8nlg"] Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.493671 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-nq7c2"] Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.559996 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-zjps6"] Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.560756 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zjps6" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.563386 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.563472 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.563549 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-b9bqs" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.571888 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-zjps6"] Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.629685 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5l47\" (UniqueName: \"kubernetes.io/projected/f630bb2c-91d5-434f-9a06-b933086a65fc-kube-api-access-z5l47\") pod \"nmstate-webhook-5f558f5558-nq7c2\" (UID: \"f630bb2c-91d5-434f-9a06-b933086a65fc\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-nq7c2" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.629752 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxxcv\" (UniqueName: \"kubernetes.io/projected/35df524e-f5d4-4360-93e1-d76dde302069-kube-api-access-hxxcv\") pod \"nmstate-metrics-9b8c8685d-x8nlg\" (UID: \"35df524e-f5d4-4360-93e1-d76dde302069\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-x8nlg" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.629870 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f630bb2c-91d5-434f-9a06-b933086a65fc-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-nq7c2\" (UID: 
\"f630bb2c-91d5-434f-9a06-b933086a65fc\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-nq7c2" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.629898 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/436fb992-2d18-438e-b2e9-489bb530c9ba-dbus-socket\") pod \"nmstate-handler-bg6f8\" (UID: \"436fb992-2d18-438e-b2e9-489bb530c9ba\") " pod="openshift-nmstate/nmstate-handler-bg6f8" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.629941 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/436fb992-2d18-438e-b2e9-489bb530c9ba-ovs-socket\") pod \"nmstate-handler-bg6f8\" (UID: \"436fb992-2d18-438e-b2e9-489bb530c9ba\") " pod="openshift-nmstate/nmstate-handler-bg6f8" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.629961 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/436fb992-2d18-438e-b2e9-489bb530c9ba-nmstate-lock\") pod \"nmstate-handler-bg6f8\" (UID: \"436fb992-2d18-438e-b2e9-489bb530c9ba\") " pod="openshift-nmstate/nmstate-handler-bg6f8" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.629982 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt67j\" (UniqueName: \"kubernetes.io/projected/436fb992-2d18-438e-b2e9-489bb530c9ba-kube-api-access-rt67j\") pod \"nmstate-handler-bg6f8\" (UID: \"436fb992-2d18-438e-b2e9-489bb530c9ba\") " pod="openshift-nmstate/nmstate-handler-bg6f8" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.731547 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/03f98c5f-95c6-4714-a883-76c1eb8ff7f1-nginx-conf\") pod 
\"nmstate-console-plugin-86f58fcf4-zjps6\" (UID: \"03f98c5f-95c6-4714-a883-76c1eb8ff7f1\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zjps6" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.731598 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnp2n\" (UniqueName: \"kubernetes.io/projected/03f98c5f-95c6-4714-a883-76c1eb8ff7f1-kube-api-access-pnp2n\") pod \"nmstate-console-plugin-86f58fcf4-zjps6\" (UID: \"03f98c5f-95c6-4714-a883-76c1eb8ff7f1\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zjps6" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.731679 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxxcv\" (UniqueName: \"kubernetes.io/projected/35df524e-f5d4-4360-93e1-d76dde302069-kube-api-access-hxxcv\") pod \"nmstate-metrics-9b8c8685d-x8nlg\" (UID: \"35df524e-f5d4-4360-93e1-d76dde302069\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-x8nlg" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.731740 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f630bb2c-91d5-434f-9a06-b933086a65fc-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-nq7c2\" (UID: \"f630bb2c-91d5-434f-9a06-b933086a65fc\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-nq7c2" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.731765 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/436fb992-2d18-438e-b2e9-489bb530c9ba-dbus-socket\") pod \"nmstate-handler-bg6f8\" (UID: \"436fb992-2d18-438e-b2e9-489bb530c9ba\") " pod="openshift-nmstate/nmstate-handler-bg6f8" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.731797 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/03f98c5f-95c6-4714-a883-76c1eb8ff7f1-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-zjps6\" (UID: \"03f98c5f-95c6-4714-a883-76c1eb8ff7f1\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zjps6" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.731827 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/436fb992-2d18-438e-b2e9-489bb530c9ba-ovs-socket\") pod \"nmstate-handler-bg6f8\" (UID: \"436fb992-2d18-438e-b2e9-489bb530c9ba\") " pod="openshift-nmstate/nmstate-handler-bg6f8" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.731852 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/436fb992-2d18-438e-b2e9-489bb530c9ba-nmstate-lock\") pod \"nmstate-handler-bg6f8\" (UID: \"436fb992-2d18-438e-b2e9-489bb530c9ba\") " pod="openshift-nmstate/nmstate-handler-bg6f8" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.731879 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt67j\" (UniqueName: \"kubernetes.io/projected/436fb992-2d18-438e-b2e9-489bb530c9ba-kube-api-access-rt67j\") pod \"nmstate-handler-bg6f8\" (UID: \"436fb992-2d18-438e-b2e9-489bb530c9ba\") " pod="openshift-nmstate/nmstate-handler-bg6f8" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.731903 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5l47\" (UniqueName: \"kubernetes.io/projected/f630bb2c-91d5-434f-9a06-b933086a65fc-kube-api-access-z5l47\") pod \"nmstate-webhook-5f558f5558-nq7c2\" (UID: \"f630bb2c-91d5-434f-9a06-b933086a65fc\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-nq7c2" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.731924 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: 
\"kubernetes.io/host-path/436fb992-2d18-438e-b2e9-489bb530c9ba-ovs-socket\") pod \"nmstate-handler-bg6f8\" (UID: \"436fb992-2d18-438e-b2e9-489bb530c9ba\") " pod="openshift-nmstate/nmstate-handler-bg6f8" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.731966 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/436fb992-2d18-438e-b2e9-489bb530c9ba-nmstate-lock\") pod \"nmstate-handler-bg6f8\" (UID: \"436fb992-2d18-438e-b2e9-489bb530c9ba\") " pod="openshift-nmstate/nmstate-handler-bg6f8" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.733327 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/436fb992-2d18-438e-b2e9-489bb530c9ba-dbus-socket\") pod \"nmstate-handler-bg6f8\" (UID: \"436fb992-2d18-438e-b2e9-489bb530c9ba\") " pod="openshift-nmstate/nmstate-handler-bg6f8" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.741327 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f630bb2c-91d5-434f-9a06-b933086a65fc-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-nq7c2\" (UID: \"f630bb2c-91d5-434f-9a06-b933086a65fc\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-nq7c2" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.764799 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt67j\" (UniqueName: \"kubernetes.io/projected/436fb992-2d18-438e-b2e9-489bb530c9ba-kube-api-access-rt67j\") pod \"nmstate-handler-bg6f8\" (UID: \"436fb992-2d18-438e-b2e9-489bb530c9ba\") " pod="openshift-nmstate/nmstate-handler-bg6f8" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.769009 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxxcv\" (UniqueName: \"kubernetes.io/projected/35df524e-f5d4-4360-93e1-d76dde302069-kube-api-access-hxxcv\") pod 
\"nmstate-metrics-9b8c8685d-x8nlg\" (UID: \"35df524e-f5d4-4360-93e1-d76dde302069\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-x8nlg" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.779960 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5l47\" (UniqueName: \"kubernetes.io/projected/f630bb2c-91d5-434f-9a06-b933086a65fc-kube-api-access-z5l47\") pod \"nmstate-webhook-5f558f5558-nq7c2\" (UID: \"f630bb2c-91d5-434f-9a06-b933086a65fc\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-nq7c2" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.803755 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-84b9796764-58x7v"] Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.810840 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-bg6f8" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.812759 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84b9796764-58x7v"] Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.813139 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-84b9796764-58x7v" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.833661 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/03f98c5f-95c6-4714-a883-76c1eb8ff7f1-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-zjps6\" (UID: \"03f98c5f-95c6-4714-a883-76c1eb8ff7f1\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zjps6" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.833730 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/03f98c5f-95c6-4714-a883-76c1eb8ff7f1-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-zjps6\" (UID: \"03f98c5f-95c6-4714-a883-76c1eb8ff7f1\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zjps6" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.833750 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnp2n\" (UniqueName: \"kubernetes.io/projected/03f98c5f-95c6-4714-a883-76c1eb8ff7f1-kube-api-access-pnp2n\") pod \"nmstate-console-plugin-86f58fcf4-zjps6\" (UID: \"03f98c5f-95c6-4714-a883-76c1eb8ff7f1\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zjps6" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.835225 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/03f98c5f-95c6-4714-a883-76c1eb8ff7f1-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-zjps6\" (UID: \"03f98c5f-95c6-4714-a883-76c1eb8ff7f1\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zjps6" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.839949 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/03f98c5f-95c6-4714-a883-76c1eb8ff7f1-plugin-serving-cert\") pod 
\"nmstate-console-plugin-86f58fcf4-zjps6\" (UID: \"03f98c5f-95c6-4714-a883-76c1eb8ff7f1\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zjps6" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.868529 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnp2n\" (UniqueName: \"kubernetes.io/projected/03f98c5f-95c6-4714-a883-76c1eb8ff7f1-kube-api-access-pnp2n\") pod \"nmstate-console-plugin-86f58fcf4-zjps6\" (UID: \"03f98c5f-95c6-4714-a883-76c1eb8ff7f1\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zjps6" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.878556 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zjps6" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.934845 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/47c22c29-5982-4622-b24a-2d8a8ff5bec4-console-oauth-config\") pod \"console-84b9796764-58x7v\" (UID: \"47c22c29-5982-4622-b24a-2d8a8ff5bec4\") " pod="openshift-console/console-84b9796764-58x7v" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.934876 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/47c22c29-5982-4622-b24a-2d8a8ff5bec4-service-ca\") pod \"console-84b9796764-58x7v\" (UID: \"47c22c29-5982-4622-b24a-2d8a8ff5bec4\") " pod="openshift-console/console-84b9796764-58x7v" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.935741 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8v8d\" (UniqueName: \"kubernetes.io/projected/47c22c29-5982-4622-b24a-2d8a8ff5bec4-kube-api-access-v8v8d\") pod \"console-84b9796764-58x7v\" (UID: \"47c22c29-5982-4622-b24a-2d8a8ff5bec4\") " 
pod="openshift-console/console-84b9796764-58x7v" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.936210 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/47c22c29-5982-4622-b24a-2d8a8ff5bec4-oauth-serving-cert\") pod \"console-84b9796764-58x7v\" (UID: \"47c22c29-5982-4622-b24a-2d8a8ff5bec4\") " pod="openshift-console/console-84b9796764-58x7v" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.936616 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/47c22c29-5982-4622-b24a-2d8a8ff5bec4-console-config\") pod \"console-84b9796764-58x7v\" (UID: \"47c22c29-5982-4622-b24a-2d8a8ff5bec4\") " pod="openshift-console/console-84b9796764-58x7v" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.936646 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47c22c29-5982-4622-b24a-2d8a8ff5bec4-trusted-ca-bundle\") pod \"console-84b9796764-58x7v\" (UID: \"47c22c29-5982-4622-b24a-2d8a8ff5bec4\") " pod="openshift-console/console-84b9796764-58x7v" Mar 20 15:38:35 crc kubenswrapper[4779]: I0320 15:38:35.936842 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/47c22c29-5982-4622-b24a-2d8a8ff5bec4-console-serving-cert\") pod \"console-84b9796764-58x7v\" (UID: \"47c22c29-5982-4622-b24a-2d8a8ff5bec4\") " pod="openshift-console/console-84b9796764-58x7v" Mar 20 15:38:36 crc kubenswrapper[4779]: I0320 15:38:36.037489 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/47c22c29-5982-4622-b24a-2d8a8ff5bec4-console-oauth-config\") pod 
\"console-84b9796764-58x7v\" (UID: \"47c22c29-5982-4622-b24a-2d8a8ff5bec4\") " pod="openshift-console/console-84b9796764-58x7v" Mar 20 15:38:36 crc kubenswrapper[4779]: I0320 15:38:36.037538 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/47c22c29-5982-4622-b24a-2d8a8ff5bec4-service-ca\") pod \"console-84b9796764-58x7v\" (UID: \"47c22c29-5982-4622-b24a-2d8a8ff5bec4\") " pod="openshift-console/console-84b9796764-58x7v" Mar 20 15:38:36 crc kubenswrapper[4779]: I0320 15:38:36.037577 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8v8d\" (UniqueName: \"kubernetes.io/projected/47c22c29-5982-4622-b24a-2d8a8ff5bec4-kube-api-access-v8v8d\") pod \"console-84b9796764-58x7v\" (UID: \"47c22c29-5982-4622-b24a-2d8a8ff5bec4\") " pod="openshift-console/console-84b9796764-58x7v" Mar 20 15:38:36 crc kubenswrapper[4779]: I0320 15:38:36.037606 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/47c22c29-5982-4622-b24a-2d8a8ff5bec4-oauth-serving-cert\") pod \"console-84b9796764-58x7v\" (UID: \"47c22c29-5982-4622-b24a-2d8a8ff5bec4\") " pod="openshift-console/console-84b9796764-58x7v" Mar 20 15:38:36 crc kubenswrapper[4779]: I0320 15:38:36.037631 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/47c22c29-5982-4622-b24a-2d8a8ff5bec4-console-config\") pod \"console-84b9796764-58x7v\" (UID: \"47c22c29-5982-4622-b24a-2d8a8ff5bec4\") " pod="openshift-console/console-84b9796764-58x7v" Mar 20 15:38:36 crc kubenswrapper[4779]: I0320 15:38:36.037665 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47c22c29-5982-4622-b24a-2d8a8ff5bec4-trusted-ca-bundle\") pod \"console-84b9796764-58x7v\" (UID: 
\"47c22c29-5982-4622-b24a-2d8a8ff5bec4\") " pod="openshift-console/console-84b9796764-58x7v" Mar 20 15:38:36 crc kubenswrapper[4779]: I0320 15:38:36.037693 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/47c22c29-5982-4622-b24a-2d8a8ff5bec4-console-serving-cert\") pod \"console-84b9796764-58x7v\" (UID: \"47c22c29-5982-4622-b24a-2d8a8ff5bec4\") " pod="openshift-console/console-84b9796764-58x7v" Mar 20 15:38:36 crc kubenswrapper[4779]: I0320 15:38:36.039145 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/47c22c29-5982-4622-b24a-2d8a8ff5bec4-oauth-serving-cert\") pod \"console-84b9796764-58x7v\" (UID: \"47c22c29-5982-4622-b24a-2d8a8ff5bec4\") " pod="openshift-console/console-84b9796764-58x7v" Mar 20 15:38:36 crc kubenswrapper[4779]: I0320 15:38:36.039660 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/47c22c29-5982-4622-b24a-2d8a8ff5bec4-service-ca\") pod \"console-84b9796764-58x7v\" (UID: \"47c22c29-5982-4622-b24a-2d8a8ff5bec4\") " pod="openshift-console/console-84b9796764-58x7v" Mar 20 15:38:36 crc kubenswrapper[4779]: I0320 15:38:36.039879 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/47c22c29-5982-4622-b24a-2d8a8ff5bec4-console-config\") pod \"console-84b9796764-58x7v\" (UID: \"47c22c29-5982-4622-b24a-2d8a8ff5bec4\") " pod="openshift-console/console-84b9796764-58x7v" Mar 20 15:38:36 crc kubenswrapper[4779]: I0320 15:38:36.040810 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47c22c29-5982-4622-b24a-2d8a8ff5bec4-trusted-ca-bundle\") pod \"console-84b9796764-58x7v\" (UID: \"47c22c29-5982-4622-b24a-2d8a8ff5bec4\") " 
pod="openshift-console/console-84b9796764-58x7v" Mar 20 15:38:36 crc kubenswrapper[4779]: I0320 15:38:36.041707 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/47c22c29-5982-4622-b24a-2d8a8ff5bec4-console-oauth-config\") pod \"console-84b9796764-58x7v\" (UID: \"47c22c29-5982-4622-b24a-2d8a8ff5bec4\") " pod="openshift-console/console-84b9796764-58x7v" Mar 20 15:38:36 crc kubenswrapper[4779]: I0320 15:38:36.042075 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/47c22c29-5982-4622-b24a-2d8a8ff5bec4-console-serving-cert\") pod \"console-84b9796764-58x7v\" (UID: \"47c22c29-5982-4622-b24a-2d8a8ff5bec4\") " pod="openshift-console/console-84b9796764-58x7v" Mar 20 15:38:36 crc kubenswrapper[4779]: I0320 15:38:36.054242 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8v8d\" (UniqueName: \"kubernetes.io/projected/47c22c29-5982-4622-b24a-2d8a8ff5bec4-kube-api-access-v8v8d\") pod \"console-84b9796764-58x7v\" (UID: \"47c22c29-5982-4622-b24a-2d8a8ff5bec4\") " pod="openshift-console/console-84b9796764-58x7v" Mar 20 15:38:36 crc kubenswrapper[4779]: I0320 15:38:36.057957 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-x8nlg" Mar 20 15:38:36 crc kubenswrapper[4779]: I0320 15:38:36.081600 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-nq7c2" Mar 20 15:38:36 crc kubenswrapper[4779]: I0320 15:38:36.107551 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-zjps6"] Mar 20 15:38:36 crc kubenswrapper[4779]: W0320 15:38:36.110014 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03f98c5f_95c6_4714_a883_76c1eb8ff7f1.slice/crio-5af1562a45c199fea90a1648685f502fd37c0ce773ef2a7722878231f413cbac WatchSource:0}: Error finding container 5af1562a45c199fea90a1648685f502fd37c0ce773ef2a7722878231f413cbac: Status 404 returned error can't find the container with id 5af1562a45c199fea90a1648685f502fd37c0ce773ef2a7722878231f413cbac Mar 20 15:38:36 crc kubenswrapper[4779]: I0320 15:38:36.205609 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84b9796764-58x7v" Mar 20 15:38:36 crc kubenswrapper[4779]: I0320 15:38:36.278502 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-x8nlg"] Mar 20 15:38:36 crc kubenswrapper[4779]: I0320 15:38:36.355279 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-nq7c2"] Mar 20 15:38:36 crc kubenswrapper[4779]: I0320 15:38:36.625926 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84b9796764-58x7v"] Mar 20 15:38:36 crc kubenswrapper[4779]: W0320 15:38:36.633584 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47c22c29_5982_4622_b24a_2d8a8ff5bec4.slice/crio-333e329794069a40aebb2213c0e6db02e6ebc880321abf08eaa7c174d5f652ca WatchSource:0}: Error finding container 333e329794069a40aebb2213c0e6db02e6ebc880321abf08eaa7c174d5f652ca: Status 404 returned error can't find the container with id 
333e329794069a40aebb2213c0e6db02e6ebc880321abf08eaa7c174d5f652ca Mar 20 15:38:36 crc kubenswrapper[4779]: I0320 15:38:36.763729 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-x8nlg" event={"ID":"35df524e-f5d4-4360-93e1-d76dde302069","Type":"ContainerStarted","Data":"a98a8121db7b2a34105f4cad93a4161b8d790a829c7188f75581ccbd58ff076f"} Mar 20 15:38:36 crc kubenswrapper[4779]: I0320 15:38:36.765221 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84b9796764-58x7v" event={"ID":"47c22c29-5982-4622-b24a-2d8a8ff5bec4","Type":"ContainerStarted","Data":"333e329794069a40aebb2213c0e6db02e6ebc880321abf08eaa7c174d5f652ca"} Mar 20 15:38:36 crc kubenswrapper[4779]: I0320 15:38:36.767243 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zjps6" event={"ID":"03f98c5f-95c6-4714-a883-76c1eb8ff7f1","Type":"ContainerStarted","Data":"5af1562a45c199fea90a1648685f502fd37c0ce773ef2a7722878231f413cbac"} Mar 20 15:38:36 crc kubenswrapper[4779]: I0320 15:38:36.769044 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-bg6f8" event={"ID":"436fb992-2d18-438e-b2e9-489bb530c9ba","Type":"ContainerStarted","Data":"9d2401ee8cc52764a11e80808b3b00cbc5ce999e82b80af87e1bdba20755a01a"} Mar 20 15:38:36 crc kubenswrapper[4779]: I0320 15:38:36.772733 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-nq7c2" event={"ID":"f630bb2c-91d5-434f-9a06-b933086a65fc","Type":"ContainerStarted","Data":"91d9d8fb6ddf7ae0989cd074c206309433b35da46409af123e1abb0d42a15b2a"} Mar 20 15:38:37 crc kubenswrapper[4779]: I0320 15:38:37.780837 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84b9796764-58x7v" 
event={"ID":"47c22c29-5982-4622-b24a-2d8a8ff5bec4","Type":"ContainerStarted","Data":"7a9b9e2df08930cdeb2752d8b265097b04c1bf9c2f8702f71511b0e7e0292b9a"} Mar 20 15:38:37 crc kubenswrapper[4779]: I0320 15:38:37.798248 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-84b9796764-58x7v" podStartSLOduration=2.798224703 podStartE2EDuration="2.798224703s" podCreationTimestamp="2026-03-20 15:38:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:38:37.797642158 +0000 UTC m=+934.760157958" watchObservedRunningTime="2026-03-20 15:38:37.798224703 +0000 UTC m=+934.760740513" Mar 20 15:38:39 crc kubenswrapper[4779]: I0320 15:38:39.792455 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-nq7c2" event={"ID":"f630bb2c-91d5-434f-9a06-b933086a65fc","Type":"ContainerStarted","Data":"db6f5132514b1560b3fdba5f55bb2055906ddf94c04d2cd0d3a1e84b48516466"} Mar 20 15:38:39 crc kubenswrapper[4779]: I0320 15:38:39.792884 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-nq7c2" Mar 20 15:38:39 crc kubenswrapper[4779]: I0320 15:38:39.793551 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-x8nlg" event={"ID":"35df524e-f5d4-4360-93e1-d76dde302069","Type":"ContainerStarted","Data":"43d9ca28e80159cbfd5918446c3c544b8bf439a66d45b62242d22ebb8fd2db93"} Mar 20 15:38:39 crc kubenswrapper[4779]: I0320 15:38:39.797347 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zjps6" event={"ID":"03f98c5f-95c6-4714-a883-76c1eb8ff7f1","Type":"ContainerStarted","Data":"815f868db15fa1c6528f6675d36896be1d6da22dd8e991d7ebbd5e500edd3755"} Mar 20 15:38:39 crc kubenswrapper[4779]: I0320 15:38:39.799142 4779 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-bg6f8" event={"ID":"436fb992-2d18-438e-b2e9-489bb530c9ba","Type":"ContainerStarted","Data":"e87ff8948a2ae4f800725cf6623953d7a44058926c0037f6c5271b9e7b994b31"} Mar 20 15:38:39 crc kubenswrapper[4779]: I0320 15:38:39.799222 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-bg6f8" Mar 20 15:38:39 crc kubenswrapper[4779]: I0320 15:38:39.812620 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-nq7c2" podStartSLOduration=2.34439573 podStartE2EDuration="4.812602619s" podCreationTimestamp="2026-03-20 15:38:35 +0000 UTC" firstStartedPulling="2026-03-20 15:38:36.356633839 +0000 UTC m=+933.319149639" lastFinishedPulling="2026-03-20 15:38:38.824840728 +0000 UTC m=+935.787356528" observedRunningTime="2026-03-20 15:38:39.8122324 +0000 UTC m=+936.774748200" watchObservedRunningTime="2026-03-20 15:38:39.812602619 +0000 UTC m=+936.775118419" Mar 20 15:38:39 crc kubenswrapper[4779]: I0320 15:38:39.847941 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-bg6f8" podStartSLOduration=1.893634824 podStartE2EDuration="4.847920244s" podCreationTimestamp="2026-03-20 15:38:35 +0000 UTC" firstStartedPulling="2026-03-20 15:38:35.857079983 +0000 UTC m=+932.819595783" lastFinishedPulling="2026-03-20 15:38:38.811365403 +0000 UTC m=+935.773881203" observedRunningTime="2026-03-20 15:38:39.828330939 +0000 UTC m=+936.790846739" watchObservedRunningTime="2026-03-20 15:38:39.847920244 +0000 UTC m=+936.810436044" Mar 20 15:38:39 crc kubenswrapper[4779]: I0320 15:38:39.851673 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zjps6" podStartSLOduration=2.154078332 podStartE2EDuration="4.851605646s" podCreationTimestamp="2026-03-20 15:38:35 +0000 UTC" 
firstStartedPulling="2026-03-20 15:38:36.111886981 +0000 UTC m=+933.074402781" lastFinishedPulling="2026-03-20 15:38:38.809414295 +0000 UTC m=+935.771930095" observedRunningTime="2026-03-20 15:38:39.847240298 +0000 UTC m=+936.809756098" watchObservedRunningTime="2026-03-20 15:38:39.851605646 +0000 UTC m=+936.814121446" Mar 20 15:38:41 crc kubenswrapper[4779]: I0320 15:38:41.815695 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-x8nlg" event={"ID":"35df524e-f5d4-4360-93e1-d76dde302069","Type":"ContainerStarted","Data":"b0e2460771052aeb59963cee9445d85fd33d531620172a1340fc3a9b3fb49645"} Mar 20 15:38:41 crc kubenswrapper[4779]: I0320 15:38:41.831830 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-x8nlg" podStartSLOduration=1.852020692 podStartE2EDuration="6.831807864s" podCreationTimestamp="2026-03-20 15:38:35 +0000 UTC" firstStartedPulling="2026-03-20 15:38:36.313011208 +0000 UTC m=+933.275527008" lastFinishedPulling="2026-03-20 15:38:41.29279838 +0000 UTC m=+938.255314180" observedRunningTime="2026-03-20 15:38:41.827876548 +0000 UTC m=+938.790392348" watchObservedRunningTime="2026-03-20 15:38:41.831807864 +0000 UTC m=+938.794323684" Mar 20 15:38:45 crc kubenswrapper[4779]: I0320 15:38:45.832024 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-bg6f8" Mar 20 15:38:46 crc kubenswrapper[4779]: I0320 15:38:46.206427 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-84b9796764-58x7v" Mar 20 15:38:46 crc kubenswrapper[4779]: I0320 15:38:46.206497 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-84b9796764-58x7v" Mar 20 15:38:46 crc kubenswrapper[4779]: I0320 15:38:46.215240 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-console/console-84b9796764-58x7v" Mar 20 15:38:46 crc kubenswrapper[4779]: I0320 15:38:46.840086 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-84b9796764-58x7v" Mar 20 15:38:46 crc kubenswrapper[4779]: I0320 15:38:46.885878 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-sl2tf"] Mar 20 15:38:56 crc kubenswrapper[4779]: I0320 15:38:56.087992 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-nq7c2" Mar 20 15:39:08 crc kubenswrapper[4779]: I0320 15:39:08.038122 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk"] Mar 20 15:39:08 crc kubenswrapper[4779]: I0320 15:39:08.039679 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk" Mar 20 15:39:08 crc kubenswrapper[4779]: I0320 15:39:08.041467 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 15:39:08 crc kubenswrapper[4779]: I0320 15:39:08.048076 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk"] Mar 20 15:39:08 crc kubenswrapper[4779]: I0320 15:39:08.165782 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fae6f408-d45b-402b-9956-d903a90a49ac-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk\" (UID: \"fae6f408-d45b-402b-9956-d903a90a49ac\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk" Mar 20 15:39:08 crc kubenswrapper[4779]: I0320 15:39:08.165875 4779 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzthk\" (UniqueName: \"kubernetes.io/projected/fae6f408-d45b-402b-9956-d903a90a49ac-kube-api-access-bzthk\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk\" (UID: \"fae6f408-d45b-402b-9956-d903a90a49ac\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk" Mar 20 15:39:08 crc kubenswrapper[4779]: I0320 15:39:08.166094 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fae6f408-d45b-402b-9956-d903a90a49ac-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk\" (UID: \"fae6f408-d45b-402b-9956-d903a90a49ac\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk" Mar 20 15:39:08 crc kubenswrapper[4779]: I0320 15:39:08.267433 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fae6f408-d45b-402b-9956-d903a90a49ac-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk\" (UID: \"fae6f408-d45b-402b-9956-d903a90a49ac\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk" Mar 20 15:39:08 crc kubenswrapper[4779]: I0320 15:39:08.267491 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fae6f408-d45b-402b-9956-d903a90a49ac-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk\" (UID: \"fae6f408-d45b-402b-9956-d903a90a49ac\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk" Mar 20 15:39:08 crc kubenswrapper[4779]: I0320 15:39:08.267573 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzthk\" (UniqueName: 
\"kubernetes.io/projected/fae6f408-d45b-402b-9956-d903a90a49ac-kube-api-access-bzthk\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk\" (UID: \"fae6f408-d45b-402b-9956-d903a90a49ac\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk" Mar 20 15:39:08 crc kubenswrapper[4779]: I0320 15:39:08.267892 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fae6f408-d45b-402b-9956-d903a90a49ac-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk\" (UID: \"fae6f408-d45b-402b-9956-d903a90a49ac\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk" Mar 20 15:39:08 crc kubenswrapper[4779]: I0320 15:39:08.268298 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fae6f408-d45b-402b-9956-d903a90a49ac-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk\" (UID: \"fae6f408-d45b-402b-9956-d903a90a49ac\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk" Mar 20 15:39:08 crc kubenswrapper[4779]: I0320 15:39:08.287235 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzthk\" (UniqueName: \"kubernetes.io/projected/fae6f408-d45b-402b-9956-d903a90a49ac-kube-api-access-bzthk\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk\" (UID: \"fae6f408-d45b-402b-9956-d903a90a49ac\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk" Mar 20 15:39:08 crc kubenswrapper[4779]: I0320 15:39:08.354277 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk" Mar 20 15:39:08 crc kubenswrapper[4779]: I0320 15:39:08.750888 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk"] Mar 20 15:39:08 crc kubenswrapper[4779]: W0320 15:39:08.759712 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfae6f408_d45b_402b_9956_d903a90a49ac.slice/crio-5c574863a32d185bdfcaacd2cd9d9fbdd50cea9356892bae4eac42e78a79f545 WatchSource:0}: Error finding container 5c574863a32d185bdfcaacd2cd9d9fbdd50cea9356892bae4eac42e78a79f545: Status 404 returned error can't find the container with id 5c574863a32d185bdfcaacd2cd9d9fbdd50cea9356892bae4eac42e78a79f545 Mar 20 15:39:08 crc kubenswrapper[4779]: I0320 15:39:08.974469 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk" event={"ID":"fae6f408-d45b-402b-9956-d903a90a49ac","Type":"ContainerStarted","Data":"f2da0644acf2119fa225fdd87ed1721bacf7b02824da70bf9e511cd4105b7278"} Mar 20 15:39:08 crc kubenswrapper[4779]: I0320 15:39:08.974838 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk" event={"ID":"fae6f408-d45b-402b-9956-d903a90a49ac","Type":"ContainerStarted","Data":"5c574863a32d185bdfcaacd2cd9d9fbdd50cea9356892bae4eac42e78a79f545"} Mar 20 15:39:09 crc kubenswrapper[4779]: I0320 15:39:09.980801 4779 generic.go:334] "Generic (PLEG): container finished" podID="fae6f408-d45b-402b-9956-d903a90a49ac" containerID="f2da0644acf2119fa225fdd87ed1721bacf7b02824da70bf9e511cd4105b7278" exitCode=0 Mar 20 15:39:09 crc kubenswrapper[4779]: I0320 15:39:09.980850 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk" event={"ID":"fae6f408-d45b-402b-9956-d903a90a49ac","Type":"ContainerDied","Data":"f2da0644acf2119fa225fdd87ed1721bacf7b02824da70bf9e511cd4105b7278"} Mar 20 15:39:11 crc kubenswrapper[4779]: I0320 15:39:11.925534 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-sl2tf" podUID="163fdaa3-a29a-44bb-9da0-97b18da1c2ba" containerName="console" containerID="cri-o://9d453b3b112f7d0d00ad6295298a2d86b52443d309d759dc573ae6dcbaf78dd2" gracePeriod=15 Mar 20 15:39:11 crc kubenswrapper[4779]: I0320 15:39:11.992856 4779 generic.go:334] "Generic (PLEG): container finished" podID="fae6f408-d45b-402b-9956-d903a90a49ac" containerID="1cdb0dcffbba8b80338d2eada14eeb7321d60993aa264a28f182c72220b70c1b" exitCode=0 Mar 20 15:39:11 crc kubenswrapper[4779]: I0320 15:39:11.992910 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk" event={"ID":"fae6f408-d45b-402b-9956-d903a90a49ac","Type":"ContainerDied","Data":"1cdb0dcffbba8b80338d2eada14eeb7321d60993aa264a28f182c72220b70c1b"} Mar 20 15:39:12 crc kubenswrapper[4779]: I0320 15:39:12.352942 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-sl2tf_163fdaa3-a29a-44bb-9da0-97b18da1c2ba/console/0.log" Mar 20 15:39:12 crc kubenswrapper[4779]: I0320 15:39:12.353013 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-sl2tf" Mar 20 15:39:12 crc kubenswrapper[4779]: I0320 15:39:12.517380 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pb6v4\" (UniqueName: \"kubernetes.io/projected/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-kube-api-access-pb6v4\") pod \"163fdaa3-a29a-44bb-9da0-97b18da1c2ba\" (UID: \"163fdaa3-a29a-44bb-9da0-97b18da1c2ba\") " Mar 20 15:39:12 crc kubenswrapper[4779]: I0320 15:39:12.517443 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-service-ca\") pod \"163fdaa3-a29a-44bb-9da0-97b18da1c2ba\" (UID: \"163fdaa3-a29a-44bb-9da0-97b18da1c2ba\") " Mar 20 15:39:12 crc kubenswrapper[4779]: I0320 15:39:12.517470 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-console-serving-cert\") pod \"163fdaa3-a29a-44bb-9da0-97b18da1c2ba\" (UID: \"163fdaa3-a29a-44bb-9da0-97b18da1c2ba\") " Mar 20 15:39:12 crc kubenswrapper[4779]: I0320 15:39:12.517498 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-trusted-ca-bundle\") pod \"163fdaa3-a29a-44bb-9da0-97b18da1c2ba\" (UID: \"163fdaa3-a29a-44bb-9da0-97b18da1c2ba\") " Mar 20 15:39:12 crc kubenswrapper[4779]: I0320 15:39:12.517545 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-oauth-serving-cert\") pod \"163fdaa3-a29a-44bb-9da0-97b18da1c2ba\" (UID: \"163fdaa3-a29a-44bb-9da0-97b18da1c2ba\") " Mar 20 15:39:12 crc kubenswrapper[4779]: I0320 15:39:12.517560 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-console-config\") pod \"163fdaa3-a29a-44bb-9da0-97b18da1c2ba\" (UID: \"163fdaa3-a29a-44bb-9da0-97b18da1c2ba\") " Mar 20 15:39:12 crc kubenswrapper[4779]: I0320 15:39:12.517588 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-console-oauth-config\") pod \"163fdaa3-a29a-44bb-9da0-97b18da1c2ba\" (UID: \"163fdaa3-a29a-44bb-9da0-97b18da1c2ba\") " Mar 20 15:39:12 crc kubenswrapper[4779]: I0320 15:39:12.518677 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-service-ca" (OuterVolumeSpecName: "service-ca") pod "163fdaa3-a29a-44bb-9da0-97b18da1c2ba" (UID: "163fdaa3-a29a-44bb-9da0-97b18da1c2ba"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:39:12 crc kubenswrapper[4779]: I0320 15:39:12.518686 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "163fdaa3-a29a-44bb-9da0-97b18da1c2ba" (UID: "163fdaa3-a29a-44bb-9da0-97b18da1c2ba"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:39:12 crc kubenswrapper[4779]: I0320 15:39:12.518903 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-console-config" (OuterVolumeSpecName: "console-config") pod "163fdaa3-a29a-44bb-9da0-97b18da1c2ba" (UID: "163fdaa3-a29a-44bb-9da0-97b18da1c2ba"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:39:12 crc kubenswrapper[4779]: I0320 15:39:12.518916 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "163fdaa3-a29a-44bb-9da0-97b18da1c2ba" (UID: "163fdaa3-a29a-44bb-9da0-97b18da1c2ba"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:39:12 crc kubenswrapper[4779]: I0320 15:39:12.523251 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-kube-api-access-pb6v4" (OuterVolumeSpecName: "kube-api-access-pb6v4") pod "163fdaa3-a29a-44bb-9da0-97b18da1c2ba" (UID: "163fdaa3-a29a-44bb-9da0-97b18da1c2ba"). InnerVolumeSpecName "kube-api-access-pb6v4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:39:12 crc kubenswrapper[4779]: I0320 15:39:12.523258 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "163fdaa3-a29a-44bb-9da0-97b18da1c2ba" (UID: "163fdaa3-a29a-44bb-9da0-97b18da1c2ba"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:39:12 crc kubenswrapper[4779]: I0320 15:39:12.523514 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "163fdaa3-a29a-44bb-9da0-97b18da1c2ba" (UID: "163fdaa3-a29a-44bb-9da0-97b18da1c2ba"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:39:12 crc kubenswrapper[4779]: I0320 15:39:12.619077 4779 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 15:39:12 crc kubenswrapper[4779]: I0320 15:39:12.619128 4779 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:39:12 crc kubenswrapper[4779]: I0320 15:39:12.619140 4779 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 15:39:12 crc kubenswrapper[4779]: I0320 15:39:12.619148 4779 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:39:12 crc kubenswrapper[4779]: I0320 15:39:12.619158 4779 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:39:12 crc kubenswrapper[4779]: I0320 15:39:12.619166 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pb6v4\" (UniqueName: \"kubernetes.io/projected/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-kube-api-access-pb6v4\") on node \"crc\" DevicePath \"\"" Mar 20 15:39:12 crc kubenswrapper[4779]: I0320 15:39:12.619175 4779 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/163fdaa3-a29a-44bb-9da0-97b18da1c2ba-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 15:39:13 crc 
kubenswrapper[4779]: I0320 15:39:13.000383 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-sl2tf_163fdaa3-a29a-44bb-9da0-97b18da1c2ba/console/0.log" Mar 20 15:39:13 crc kubenswrapper[4779]: I0320 15:39:13.000853 4779 generic.go:334] "Generic (PLEG): container finished" podID="163fdaa3-a29a-44bb-9da0-97b18da1c2ba" containerID="9d453b3b112f7d0d00ad6295298a2d86b52443d309d759dc573ae6dcbaf78dd2" exitCode=2 Mar 20 15:39:13 crc kubenswrapper[4779]: I0320 15:39:13.000956 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-sl2tf" Mar 20 15:39:13 crc kubenswrapper[4779]: I0320 15:39:13.000964 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-sl2tf" event={"ID":"163fdaa3-a29a-44bb-9da0-97b18da1c2ba","Type":"ContainerDied","Data":"9d453b3b112f7d0d00ad6295298a2d86b52443d309d759dc573ae6dcbaf78dd2"} Mar 20 15:39:13 crc kubenswrapper[4779]: I0320 15:39:13.001173 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-sl2tf" event={"ID":"163fdaa3-a29a-44bb-9da0-97b18da1c2ba","Type":"ContainerDied","Data":"30e532bbacc4e9ca0196218949fb082b6b37f767de7a8f801f255653b206668d"} Mar 20 15:39:13 crc kubenswrapper[4779]: I0320 15:39:13.001203 4779 scope.go:117] "RemoveContainer" containerID="9d453b3b112f7d0d00ad6295298a2d86b52443d309d759dc573ae6dcbaf78dd2" Mar 20 15:39:13 crc kubenswrapper[4779]: I0320 15:39:13.004051 4779 generic.go:334] "Generic (PLEG): container finished" podID="fae6f408-d45b-402b-9956-d903a90a49ac" containerID="282dd079f30eb3dc92307c1d2da5308c7262973bcfc71b1fa97bc33c6c55eef2" exitCode=0 Mar 20 15:39:13 crc kubenswrapper[4779]: I0320 15:39:13.004097 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk" 
event={"ID":"fae6f408-d45b-402b-9956-d903a90a49ac","Type":"ContainerDied","Data":"282dd079f30eb3dc92307c1d2da5308c7262973bcfc71b1fa97bc33c6c55eef2"} Mar 20 15:39:13 crc kubenswrapper[4779]: I0320 15:39:13.021087 4779 scope.go:117] "RemoveContainer" containerID="9d453b3b112f7d0d00ad6295298a2d86b52443d309d759dc573ae6dcbaf78dd2" Mar 20 15:39:13 crc kubenswrapper[4779]: E0320 15:39:13.021539 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d453b3b112f7d0d00ad6295298a2d86b52443d309d759dc573ae6dcbaf78dd2\": container with ID starting with 9d453b3b112f7d0d00ad6295298a2d86b52443d309d759dc573ae6dcbaf78dd2 not found: ID does not exist" containerID="9d453b3b112f7d0d00ad6295298a2d86b52443d309d759dc573ae6dcbaf78dd2" Mar 20 15:39:13 crc kubenswrapper[4779]: I0320 15:39:13.021579 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d453b3b112f7d0d00ad6295298a2d86b52443d309d759dc573ae6dcbaf78dd2"} err="failed to get container status \"9d453b3b112f7d0d00ad6295298a2d86b52443d309d759dc573ae6dcbaf78dd2\": rpc error: code = NotFound desc = could not find container \"9d453b3b112f7d0d00ad6295298a2d86b52443d309d759dc573ae6dcbaf78dd2\": container with ID starting with 9d453b3b112f7d0d00ad6295298a2d86b52443d309d759dc573ae6dcbaf78dd2 not found: ID does not exist" Mar 20 15:39:13 crc kubenswrapper[4779]: I0320 15:39:13.053343 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-sl2tf"] Mar 20 15:39:13 crc kubenswrapper[4779]: I0320 15:39:13.058608 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-sl2tf"] Mar 20 15:39:13 crc kubenswrapper[4779]: I0320 15:39:13.816840 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="163fdaa3-a29a-44bb-9da0-97b18da1c2ba" path="/var/lib/kubelet/pods/163fdaa3-a29a-44bb-9da0-97b18da1c2ba/volumes" Mar 20 15:39:14 crc 
kubenswrapper[4779]: I0320 15:39:14.221884 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk" Mar 20 15:39:14 crc kubenswrapper[4779]: I0320 15:39:14.354970 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fae6f408-d45b-402b-9956-d903a90a49ac-util\") pod \"fae6f408-d45b-402b-9956-d903a90a49ac\" (UID: \"fae6f408-d45b-402b-9956-d903a90a49ac\") " Mar 20 15:39:14 crc kubenswrapper[4779]: I0320 15:39:14.355073 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fae6f408-d45b-402b-9956-d903a90a49ac-bundle\") pod \"fae6f408-d45b-402b-9956-d903a90a49ac\" (UID: \"fae6f408-d45b-402b-9956-d903a90a49ac\") " Mar 20 15:39:14 crc kubenswrapper[4779]: I0320 15:39:14.355098 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzthk\" (UniqueName: \"kubernetes.io/projected/fae6f408-d45b-402b-9956-d903a90a49ac-kube-api-access-bzthk\") pod \"fae6f408-d45b-402b-9956-d903a90a49ac\" (UID: \"fae6f408-d45b-402b-9956-d903a90a49ac\") " Mar 20 15:39:14 crc kubenswrapper[4779]: I0320 15:39:14.356239 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fae6f408-d45b-402b-9956-d903a90a49ac-bundle" (OuterVolumeSpecName: "bundle") pod "fae6f408-d45b-402b-9956-d903a90a49ac" (UID: "fae6f408-d45b-402b-9956-d903a90a49ac"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:39:14 crc kubenswrapper[4779]: I0320 15:39:14.360148 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fae6f408-d45b-402b-9956-d903a90a49ac-kube-api-access-bzthk" (OuterVolumeSpecName: "kube-api-access-bzthk") pod "fae6f408-d45b-402b-9956-d903a90a49ac" (UID: "fae6f408-d45b-402b-9956-d903a90a49ac"). InnerVolumeSpecName "kube-api-access-bzthk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:39:14 crc kubenswrapper[4779]: I0320 15:39:14.397004 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fae6f408-d45b-402b-9956-d903a90a49ac-util" (OuterVolumeSpecName: "util") pod "fae6f408-d45b-402b-9956-d903a90a49ac" (UID: "fae6f408-d45b-402b-9956-d903a90a49ac"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:39:14 crc kubenswrapper[4779]: I0320 15:39:14.456024 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzthk\" (UniqueName: \"kubernetes.io/projected/fae6f408-d45b-402b-9956-d903a90a49ac-kube-api-access-bzthk\") on node \"crc\" DevicePath \"\"" Mar 20 15:39:14 crc kubenswrapper[4779]: I0320 15:39:14.456057 4779 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fae6f408-d45b-402b-9956-d903a90a49ac-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:39:14 crc kubenswrapper[4779]: I0320 15:39:14.456070 4779 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fae6f408-d45b-402b-9956-d903a90a49ac-util\") on node \"crc\" DevicePath \"\"" Mar 20 15:39:15 crc kubenswrapper[4779]: I0320 15:39:15.019246 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk" 
event={"ID":"fae6f408-d45b-402b-9956-d903a90a49ac","Type":"ContainerDied","Data":"5c574863a32d185bdfcaacd2cd9d9fbdd50cea9356892bae4eac42e78a79f545"} Mar 20 15:39:15 crc kubenswrapper[4779]: I0320 15:39:15.019615 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c574863a32d185bdfcaacd2cd9d9fbdd50cea9356892bae4eac42e78a79f545" Mar 20 15:39:15 crc kubenswrapper[4779]: I0320 15:39:15.019306 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.197655 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-bdf54d65-pvrsc"] Mar 20 15:39:23 crc kubenswrapper[4779]: E0320 15:39:23.198405 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae6f408-d45b-402b-9956-d903a90a49ac" containerName="util" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.198421 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae6f408-d45b-402b-9956-d903a90a49ac" containerName="util" Mar 20 15:39:23 crc kubenswrapper[4779]: E0320 15:39:23.198434 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae6f408-d45b-402b-9956-d903a90a49ac" containerName="pull" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.198442 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae6f408-d45b-402b-9956-d903a90a49ac" containerName="pull" Mar 20 15:39:23 crc kubenswrapper[4779]: E0320 15:39:23.198458 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="163fdaa3-a29a-44bb-9da0-97b18da1c2ba" containerName="console" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.198467 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="163fdaa3-a29a-44bb-9da0-97b18da1c2ba" containerName="console" Mar 20 15:39:23 crc kubenswrapper[4779]: E0320 15:39:23.198477 4779 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae6f408-d45b-402b-9956-d903a90a49ac" containerName="extract" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.198484 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae6f408-d45b-402b-9956-d903a90a49ac" containerName="extract" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.198606 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="163fdaa3-a29a-44bb-9da0-97b18da1c2ba" containerName="console" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.198618 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="fae6f408-d45b-402b-9956-d903a90a49ac" containerName="extract" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.199033 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-bdf54d65-pvrsc" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.205861 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.205914 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.206190 4779 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.206237 4779 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.206941 4779 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-gwkps" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.231499 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["metallb-system/metallb-operator-controller-manager-bdf54d65-pvrsc"] Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.262307 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzvzt\" (UniqueName: \"kubernetes.io/projected/174f21d6-ef49-44fa-934c-f9823ffcc511-kube-api-access-tzvzt\") pod \"metallb-operator-controller-manager-bdf54d65-pvrsc\" (UID: \"174f21d6-ef49-44fa-934c-f9823ffcc511\") " pod="metallb-system/metallb-operator-controller-manager-bdf54d65-pvrsc" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.262669 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/174f21d6-ef49-44fa-934c-f9823ffcc511-apiservice-cert\") pod \"metallb-operator-controller-manager-bdf54d65-pvrsc\" (UID: \"174f21d6-ef49-44fa-934c-f9823ffcc511\") " pod="metallb-system/metallb-operator-controller-manager-bdf54d65-pvrsc" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.262754 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/174f21d6-ef49-44fa-934c-f9823ffcc511-webhook-cert\") pod \"metallb-operator-controller-manager-bdf54d65-pvrsc\" (UID: \"174f21d6-ef49-44fa-934c-f9823ffcc511\") " pod="metallb-system/metallb-operator-controller-manager-bdf54d65-pvrsc" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.364047 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/174f21d6-ef49-44fa-934c-f9823ffcc511-webhook-cert\") pod \"metallb-operator-controller-manager-bdf54d65-pvrsc\" (UID: \"174f21d6-ef49-44fa-934c-f9823ffcc511\") " pod="metallb-system/metallb-operator-controller-manager-bdf54d65-pvrsc" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.364154 4779 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tzvzt\" (UniqueName: \"kubernetes.io/projected/174f21d6-ef49-44fa-934c-f9823ffcc511-kube-api-access-tzvzt\") pod \"metallb-operator-controller-manager-bdf54d65-pvrsc\" (UID: \"174f21d6-ef49-44fa-934c-f9823ffcc511\") " pod="metallb-system/metallb-operator-controller-manager-bdf54d65-pvrsc" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.364193 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/174f21d6-ef49-44fa-934c-f9823ffcc511-apiservice-cert\") pod \"metallb-operator-controller-manager-bdf54d65-pvrsc\" (UID: \"174f21d6-ef49-44fa-934c-f9823ffcc511\") " pod="metallb-system/metallb-operator-controller-manager-bdf54d65-pvrsc" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.370309 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/174f21d6-ef49-44fa-934c-f9823ffcc511-apiservice-cert\") pod \"metallb-operator-controller-manager-bdf54d65-pvrsc\" (UID: \"174f21d6-ef49-44fa-934c-f9823ffcc511\") " pod="metallb-system/metallb-operator-controller-manager-bdf54d65-pvrsc" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.370657 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/174f21d6-ef49-44fa-934c-f9823ffcc511-webhook-cert\") pod \"metallb-operator-controller-manager-bdf54d65-pvrsc\" (UID: \"174f21d6-ef49-44fa-934c-f9823ffcc511\") " pod="metallb-system/metallb-operator-controller-manager-bdf54d65-pvrsc" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.383936 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzvzt\" (UniqueName: \"kubernetes.io/projected/174f21d6-ef49-44fa-934c-f9823ffcc511-kube-api-access-tzvzt\") pod \"metallb-operator-controller-manager-bdf54d65-pvrsc\" (UID: 
\"174f21d6-ef49-44fa-934c-f9823ffcc511\") " pod="metallb-system/metallb-operator-controller-manager-bdf54d65-pvrsc" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.434329 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5d49844ddb-kx4nw"] Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.435282 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5d49844ddb-kx4nw" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.437080 4779 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-pd4c4" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.437279 4779 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.437446 4779 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.466635 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/70059cf3-c8a9-4446-a88d-070f8e7bdf8d-webhook-cert\") pod \"metallb-operator-webhook-server-5d49844ddb-kx4nw\" (UID: \"70059cf3-c8a9-4446-a88d-070f8e7bdf8d\") " pod="metallb-system/metallb-operator-webhook-server-5d49844ddb-kx4nw" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.466718 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/70059cf3-c8a9-4446-a88d-070f8e7bdf8d-apiservice-cert\") pod \"metallb-operator-webhook-server-5d49844ddb-kx4nw\" (UID: \"70059cf3-c8a9-4446-a88d-070f8e7bdf8d\") " pod="metallb-system/metallb-operator-webhook-server-5d49844ddb-kx4nw" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 
15:39:23.466757 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfxzn\" (UniqueName: \"kubernetes.io/projected/70059cf3-c8a9-4446-a88d-070f8e7bdf8d-kube-api-access-zfxzn\") pod \"metallb-operator-webhook-server-5d49844ddb-kx4nw\" (UID: \"70059cf3-c8a9-4446-a88d-070f8e7bdf8d\") " pod="metallb-system/metallb-operator-webhook-server-5d49844ddb-kx4nw" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.486559 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5d49844ddb-kx4nw"] Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.533209 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-bdf54d65-pvrsc" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.567561 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/70059cf3-c8a9-4446-a88d-070f8e7bdf8d-webhook-cert\") pod \"metallb-operator-webhook-server-5d49844ddb-kx4nw\" (UID: \"70059cf3-c8a9-4446-a88d-070f8e7bdf8d\") " pod="metallb-system/metallb-operator-webhook-server-5d49844ddb-kx4nw" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.567661 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/70059cf3-c8a9-4446-a88d-070f8e7bdf8d-apiservice-cert\") pod \"metallb-operator-webhook-server-5d49844ddb-kx4nw\" (UID: \"70059cf3-c8a9-4446-a88d-070f8e7bdf8d\") " pod="metallb-system/metallb-operator-webhook-server-5d49844ddb-kx4nw" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.567709 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfxzn\" (UniqueName: \"kubernetes.io/projected/70059cf3-c8a9-4446-a88d-070f8e7bdf8d-kube-api-access-zfxzn\") pod 
\"metallb-operator-webhook-server-5d49844ddb-kx4nw\" (UID: \"70059cf3-c8a9-4446-a88d-070f8e7bdf8d\") " pod="metallb-system/metallb-operator-webhook-server-5d49844ddb-kx4nw" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.572286 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/70059cf3-c8a9-4446-a88d-070f8e7bdf8d-apiservice-cert\") pod \"metallb-operator-webhook-server-5d49844ddb-kx4nw\" (UID: \"70059cf3-c8a9-4446-a88d-070f8e7bdf8d\") " pod="metallb-system/metallb-operator-webhook-server-5d49844ddb-kx4nw" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.577997 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/70059cf3-c8a9-4446-a88d-070f8e7bdf8d-webhook-cert\") pod \"metallb-operator-webhook-server-5d49844ddb-kx4nw\" (UID: \"70059cf3-c8a9-4446-a88d-070f8e7bdf8d\") " pod="metallb-system/metallb-operator-webhook-server-5d49844ddb-kx4nw" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.585823 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfxzn\" (UniqueName: \"kubernetes.io/projected/70059cf3-c8a9-4446-a88d-070f8e7bdf8d-kube-api-access-zfxzn\") pod \"metallb-operator-webhook-server-5d49844ddb-kx4nw\" (UID: \"70059cf3-c8a9-4446-a88d-070f8e7bdf8d\") " pod="metallb-system/metallb-operator-webhook-server-5d49844ddb-kx4nw" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.751494 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5d49844ddb-kx4nw" Mar 20 15:39:23 crc kubenswrapper[4779]: I0320 15:39:23.796406 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-bdf54d65-pvrsc"] Mar 20 15:39:24 crc kubenswrapper[4779]: I0320 15:39:24.068224 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-bdf54d65-pvrsc" event={"ID":"174f21d6-ef49-44fa-934c-f9823ffcc511","Type":"ContainerStarted","Data":"e490dadc7b191ab10c43a23d5562fc35134a2cab075088021c2abca3d2a1bf52"} Mar 20 15:39:24 crc kubenswrapper[4779]: I0320 15:39:24.213503 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5d49844ddb-kx4nw"] Mar 20 15:39:25 crc kubenswrapper[4779]: I0320 15:39:25.075269 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5d49844ddb-kx4nw" event={"ID":"70059cf3-c8a9-4446-a88d-070f8e7bdf8d","Type":"ContainerStarted","Data":"9d91396cb0c120a5e304cfb3c1e68174c892e5c270d7b9c9e0f421bf0ec68b1f"} Mar 20 15:39:27 crc kubenswrapper[4779]: I0320 15:39:27.092421 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-bdf54d65-pvrsc" event={"ID":"174f21d6-ef49-44fa-934c-f9823ffcc511","Type":"ContainerStarted","Data":"4cbbc8583ac19a7e79ead49a205384589835aff886bbd22d3195a31fdbbf3e68"} Mar 20 15:39:27 crc kubenswrapper[4779]: I0320 15:39:27.092968 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-bdf54d65-pvrsc" Mar 20 15:39:27 crc kubenswrapper[4779]: I0320 15:39:27.124308 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-bdf54d65-pvrsc" podStartSLOduration=1.12934186 podStartE2EDuration="4.124291945s" 
podCreationTimestamp="2026-03-20 15:39:23 +0000 UTC" firstStartedPulling="2026-03-20 15:39:23.813169503 +0000 UTC m=+980.775685303" lastFinishedPulling="2026-03-20 15:39:26.808119588 +0000 UTC m=+983.770635388" observedRunningTime="2026-03-20 15:39:27.121823113 +0000 UTC m=+984.084338913" watchObservedRunningTime="2026-03-20 15:39:27.124291945 +0000 UTC m=+984.086807745" Mar 20 15:39:29 crc kubenswrapper[4779]: I0320 15:39:29.799608 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vhcg6"] Mar 20 15:39:29 crc kubenswrapper[4779]: I0320 15:39:29.801191 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vhcg6" Mar 20 15:39:29 crc kubenswrapper[4779]: I0320 15:39:29.816312 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vhcg6"] Mar 20 15:39:29 crc kubenswrapper[4779]: I0320 15:39:29.888594 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfb83af5-2bb1-42b5-b97e-4b4028217c05-catalog-content\") pod \"community-operators-vhcg6\" (UID: \"bfb83af5-2bb1-42b5-b97e-4b4028217c05\") " pod="openshift-marketplace/community-operators-vhcg6" Mar 20 15:39:29 crc kubenswrapper[4779]: I0320 15:39:29.888644 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8p89\" (UniqueName: \"kubernetes.io/projected/bfb83af5-2bb1-42b5-b97e-4b4028217c05-kube-api-access-z8p89\") pod \"community-operators-vhcg6\" (UID: \"bfb83af5-2bb1-42b5-b97e-4b4028217c05\") " pod="openshift-marketplace/community-operators-vhcg6" Mar 20 15:39:29 crc kubenswrapper[4779]: I0320 15:39:29.888680 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/bfb83af5-2bb1-42b5-b97e-4b4028217c05-utilities\") pod \"community-operators-vhcg6\" (UID: \"bfb83af5-2bb1-42b5-b97e-4b4028217c05\") " pod="openshift-marketplace/community-operators-vhcg6" Mar 20 15:39:29 crc kubenswrapper[4779]: I0320 15:39:29.990089 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8p89\" (UniqueName: \"kubernetes.io/projected/bfb83af5-2bb1-42b5-b97e-4b4028217c05-kube-api-access-z8p89\") pod \"community-operators-vhcg6\" (UID: \"bfb83af5-2bb1-42b5-b97e-4b4028217c05\") " pod="openshift-marketplace/community-operators-vhcg6" Mar 20 15:39:29 crc kubenswrapper[4779]: I0320 15:39:29.990180 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfb83af5-2bb1-42b5-b97e-4b4028217c05-utilities\") pod \"community-operators-vhcg6\" (UID: \"bfb83af5-2bb1-42b5-b97e-4b4028217c05\") " pod="openshift-marketplace/community-operators-vhcg6" Mar 20 15:39:29 crc kubenswrapper[4779]: I0320 15:39:29.990279 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfb83af5-2bb1-42b5-b97e-4b4028217c05-catalog-content\") pod \"community-operators-vhcg6\" (UID: \"bfb83af5-2bb1-42b5-b97e-4b4028217c05\") " pod="openshift-marketplace/community-operators-vhcg6" Mar 20 15:39:29 crc kubenswrapper[4779]: I0320 15:39:29.990634 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfb83af5-2bb1-42b5-b97e-4b4028217c05-utilities\") pod \"community-operators-vhcg6\" (UID: \"bfb83af5-2bb1-42b5-b97e-4b4028217c05\") " pod="openshift-marketplace/community-operators-vhcg6" Mar 20 15:39:29 crc kubenswrapper[4779]: I0320 15:39:29.990694 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/bfb83af5-2bb1-42b5-b97e-4b4028217c05-catalog-content\") pod \"community-operators-vhcg6\" (UID: \"bfb83af5-2bb1-42b5-b97e-4b4028217c05\") " pod="openshift-marketplace/community-operators-vhcg6" Mar 20 15:39:30 crc kubenswrapper[4779]: I0320 15:39:30.011488 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8p89\" (UniqueName: \"kubernetes.io/projected/bfb83af5-2bb1-42b5-b97e-4b4028217c05-kube-api-access-z8p89\") pod \"community-operators-vhcg6\" (UID: \"bfb83af5-2bb1-42b5-b97e-4b4028217c05\") " pod="openshift-marketplace/community-operators-vhcg6" Mar 20 15:39:30 crc kubenswrapper[4779]: I0320 15:39:30.110995 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5d49844ddb-kx4nw" event={"ID":"70059cf3-c8a9-4446-a88d-070f8e7bdf8d","Type":"ContainerStarted","Data":"0ed4a0b3e9d7d6b048d17073ed7c43e003b4fdc962011e7450b0cb3c138cfee2"} Mar 20 15:39:30 crc kubenswrapper[4779]: I0320 15:39:30.111897 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5d49844ddb-kx4nw" Mar 20 15:39:30 crc kubenswrapper[4779]: I0320 15:39:30.123921 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vhcg6" Mar 20 15:39:30 crc kubenswrapper[4779]: I0320 15:39:30.142766 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5d49844ddb-kx4nw" podStartSLOduration=1.8246558240000001 podStartE2EDuration="7.142741537s" podCreationTimestamp="2026-03-20 15:39:23 +0000 UTC" firstStartedPulling="2026-03-20 15:39:24.224802661 +0000 UTC m=+981.187318461" lastFinishedPulling="2026-03-20 15:39:29.542888374 +0000 UTC m=+986.505404174" observedRunningTime="2026-03-20 15:39:30.135885706 +0000 UTC m=+987.098401506" watchObservedRunningTime="2026-03-20 15:39:30.142741537 +0000 UTC m=+987.105257357" Mar 20 15:39:30 crc kubenswrapper[4779]: I0320 15:39:30.403873 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vhcg6"] Mar 20 15:39:31 crc kubenswrapper[4779]: I0320 15:39:31.117212 4779 generic.go:334] "Generic (PLEG): container finished" podID="bfb83af5-2bb1-42b5-b97e-4b4028217c05" containerID="c93b70d0a2bec850f2bc3303cd59910129454ba6e0f0b63299fb81c85acdd76e" exitCode=0 Mar 20 15:39:31 crc kubenswrapper[4779]: I0320 15:39:31.117366 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vhcg6" event={"ID":"bfb83af5-2bb1-42b5-b97e-4b4028217c05","Type":"ContainerDied","Data":"c93b70d0a2bec850f2bc3303cd59910129454ba6e0f0b63299fb81c85acdd76e"} Mar 20 15:39:31 crc kubenswrapper[4779]: I0320 15:39:31.117537 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vhcg6" event={"ID":"bfb83af5-2bb1-42b5-b97e-4b4028217c05","Type":"ContainerStarted","Data":"437219fba50b8e7baed7b66a4ffb639fa1fcea10bf6ad9b84b54bc421aca1f9a"} Mar 20 15:39:32 crc kubenswrapper[4779]: I0320 15:39:32.136009 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vhcg6" 
event={"ID":"bfb83af5-2bb1-42b5-b97e-4b4028217c05","Type":"ContainerStarted","Data":"1db5b7bc612ffc0db9edb77d401fadc92ac1354b3557f7fa53aff211f1a27165"} Mar 20 15:39:33 crc kubenswrapper[4779]: I0320 15:39:33.144008 4779 generic.go:334] "Generic (PLEG): container finished" podID="bfb83af5-2bb1-42b5-b97e-4b4028217c05" containerID="1db5b7bc612ffc0db9edb77d401fadc92ac1354b3557f7fa53aff211f1a27165" exitCode=0 Mar 20 15:39:33 crc kubenswrapper[4779]: I0320 15:39:33.144069 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vhcg6" event={"ID":"bfb83af5-2bb1-42b5-b97e-4b4028217c05","Type":"ContainerDied","Data":"1db5b7bc612ffc0db9edb77d401fadc92ac1354b3557f7fa53aff211f1a27165"} Mar 20 15:39:34 crc kubenswrapper[4779]: I0320 15:39:34.167033 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vhcg6" event={"ID":"bfb83af5-2bb1-42b5-b97e-4b4028217c05","Type":"ContainerStarted","Data":"ef5e8cc65612d115095abb7ff8a3845a44e686980e06190f4c3ef56ce8057342"} Mar 20 15:39:40 crc kubenswrapper[4779]: I0320 15:39:40.124683 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vhcg6" Mar 20 15:39:40 crc kubenswrapper[4779]: I0320 15:39:40.125236 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vhcg6" Mar 20 15:39:40 crc kubenswrapper[4779]: I0320 15:39:40.166708 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vhcg6" Mar 20 15:39:40 crc kubenswrapper[4779]: I0320 15:39:40.186314 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vhcg6" podStartSLOduration=8.758308889 podStartE2EDuration="11.186292413s" podCreationTimestamp="2026-03-20 15:39:29 +0000 UTC" firstStartedPulling="2026-03-20 15:39:31.118560129 +0000 UTC 
m=+988.081075929" lastFinishedPulling="2026-03-20 15:39:33.546543653 +0000 UTC m=+990.509059453" observedRunningTime="2026-03-20 15:39:34.191580272 +0000 UTC m=+991.154096072" watchObservedRunningTime="2026-03-20 15:39:40.186292413 +0000 UTC m=+997.148808203" Mar 20 15:39:40 crc kubenswrapper[4779]: I0320 15:39:40.237285 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vhcg6" Mar 20 15:39:43 crc kubenswrapper[4779]: I0320 15:39:43.192461 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vhcg6"] Mar 20 15:39:43 crc kubenswrapper[4779]: I0320 15:39:43.193208 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vhcg6" podUID="bfb83af5-2bb1-42b5-b97e-4b4028217c05" containerName="registry-server" containerID="cri-o://ef5e8cc65612d115095abb7ff8a3845a44e686980e06190f4c3ef56ce8057342" gracePeriod=2 Mar 20 15:39:43 crc kubenswrapper[4779]: I0320 15:39:43.755603 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5d49844ddb-kx4nw" Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.011404 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5fhvp"] Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.013260 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5fhvp" Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.024686 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5fhvp"] Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.091887 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcb47\" (UniqueName: \"kubernetes.io/projected/b7e3dde3-71d2-48dc-a532-6b0290d31fa7-kube-api-access-gcb47\") pod \"certified-operators-5fhvp\" (UID: \"b7e3dde3-71d2-48dc-a532-6b0290d31fa7\") " pod="openshift-marketplace/certified-operators-5fhvp" Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.091948 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7e3dde3-71d2-48dc-a532-6b0290d31fa7-catalog-content\") pod \"certified-operators-5fhvp\" (UID: \"b7e3dde3-71d2-48dc-a532-6b0290d31fa7\") " pod="openshift-marketplace/certified-operators-5fhvp" Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.092013 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7e3dde3-71d2-48dc-a532-6b0290d31fa7-utilities\") pod \"certified-operators-5fhvp\" (UID: \"b7e3dde3-71d2-48dc-a532-6b0290d31fa7\") " pod="openshift-marketplace/certified-operators-5fhvp" Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.101704 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vhcg6" Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.194438 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfb83af5-2bb1-42b5-b97e-4b4028217c05-utilities\") pod \"bfb83af5-2bb1-42b5-b97e-4b4028217c05\" (UID: \"bfb83af5-2bb1-42b5-b97e-4b4028217c05\") " Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.194523 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8p89\" (UniqueName: \"kubernetes.io/projected/bfb83af5-2bb1-42b5-b97e-4b4028217c05-kube-api-access-z8p89\") pod \"bfb83af5-2bb1-42b5-b97e-4b4028217c05\" (UID: \"bfb83af5-2bb1-42b5-b97e-4b4028217c05\") " Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.194545 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfb83af5-2bb1-42b5-b97e-4b4028217c05-catalog-content\") pod \"bfb83af5-2bb1-42b5-b97e-4b4028217c05\" (UID: \"bfb83af5-2bb1-42b5-b97e-4b4028217c05\") " Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.194764 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7e3dde3-71d2-48dc-a532-6b0290d31fa7-utilities\") pod \"certified-operators-5fhvp\" (UID: \"b7e3dde3-71d2-48dc-a532-6b0290d31fa7\") " pod="openshift-marketplace/certified-operators-5fhvp" Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.194840 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7e3dde3-71d2-48dc-a532-6b0290d31fa7-catalog-content\") pod \"certified-operators-5fhvp\" (UID: \"b7e3dde3-71d2-48dc-a532-6b0290d31fa7\") " pod="openshift-marketplace/certified-operators-5fhvp" Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.194857 4779 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcb47\" (UniqueName: \"kubernetes.io/projected/b7e3dde3-71d2-48dc-a532-6b0290d31fa7-kube-api-access-gcb47\") pod \"certified-operators-5fhvp\" (UID: \"b7e3dde3-71d2-48dc-a532-6b0290d31fa7\") " pod="openshift-marketplace/certified-operators-5fhvp" Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.195647 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfb83af5-2bb1-42b5-b97e-4b4028217c05-utilities" (OuterVolumeSpecName: "utilities") pod "bfb83af5-2bb1-42b5-b97e-4b4028217c05" (UID: "bfb83af5-2bb1-42b5-b97e-4b4028217c05"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.195832 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7e3dde3-71d2-48dc-a532-6b0290d31fa7-utilities\") pod \"certified-operators-5fhvp\" (UID: \"b7e3dde3-71d2-48dc-a532-6b0290d31fa7\") " pod="openshift-marketplace/certified-operators-5fhvp" Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.197523 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7e3dde3-71d2-48dc-a532-6b0290d31fa7-catalog-content\") pod \"certified-operators-5fhvp\" (UID: \"b7e3dde3-71d2-48dc-a532-6b0290d31fa7\") " pod="openshift-marketplace/certified-operators-5fhvp" Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.202167 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfb83af5-2bb1-42b5-b97e-4b4028217c05-kube-api-access-z8p89" (OuterVolumeSpecName: "kube-api-access-z8p89") pod "bfb83af5-2bb1-42b5-b97e-4b4028217c05" (UID: "bfb83af5-2bb1-42b5-b97e-4b4028217c05"). InnerVolumeSpecName "kube-api-access-z8p89". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.215080 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcb47\" (UniqueName: \"kubernetes.io/projected/b7e3dde3-71d2-48dc-a532-6b0290d31fa7-kube-api-access-gcb47\") pod \"certified-operators-5fhvp\" (UID: \"b7e3dde3-71d2-48dc-a532-6b0290d31fa7\") " pod="openshift-marketplace/certified-operators-5fhvp" Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.229203 4779 generic.go:334] "Generic (PLEG): container finished" podID="bfb83af5-2bb1-42b5-b97e-4b4028217c05" containerID="ef5e8cc65612d115095abb7ff8a3845a44e686980e06190f4c3ef56ce8057342" exitCode=0 Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.229253 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vhcg6" event={"ID":"bfb83af5-2bb1-42b5-b97e-4b4028217c05","Type":"ContainerDied","Data":"ef5e8cc65612d115095abb7ff8a3845a44e686980e06190f4c3ef56ce8057342"} Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.229283 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vhcg6" event={"ID":"bfb83af5-2bb1-42b5-b97e-4b4028217c05","Type":"ContainerDied","Data":"437219fba50b8e7baed7b66a4ffb639fa1fcea10bf6ad9b84b54bc421aca1f9a"} Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.229302 4779 scope.go:117] "RemoveContainer" containerID="ef5e8cc65612d115095abb7ff8a3845a44e686980e06190f4c3ef56ce8057342" Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.229423 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vhcg6" Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.251286 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfb83af5-2bb1-42b5-b97e-4b4028217c05-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bfb83af5-2bb1-42b5-b97e-4b4028217c05" (UID: "bfb83af5-2bb1-42b5-b97e-4b4028217c05"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.266369 4779 scope.go:117] "RemoveContainer" containerID="1db5b7bc612ffc0db9edb77d401fadc92ac1354b3557f7fa53aff211f1a27165" Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.296629 4779 scope.go:117] "RemoveContainer" containerID="c93b70d0a2bec850f2bc3303cd59910129454ba6e0f0b63299fb81c85acdd76e" Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.297656 4779 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfb83af5-2bb1-42b5-b97e-4b4028217c05-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.297704 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8p89\" (UniqueName: \"kubernetes.io/projected/bfb83af5-2bb1-42b5-b97e-4b4028217c05-kube-api-access-z8p89\") on node \"crc\" DevicePath \"\"" Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.297719 4779 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfb83af5-2bb1-42b5-b97e-4b4028217c05-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.329262 4779 scope.go:117] "RemoveContainer" containerID="ef5e8cc65612d115095abb7ff8a3845a44e686980e06190f4c3ef56ce8057342" Mar 20 15:39:44 crc kubenswrapper[4779]: E0320 15:39:44.329652 4779 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"ef5e8cc65612d115095abb7ff8a3845a44e686980e06190f4c3ef56ce8057342\": container with ID starting with ef5e8cc65612d115095abb7ff8a3845a44e686980e06190f4c3ef56ce8057342 not found: ID does not exist" containerID="ef5e8cc65612d115095abb7ff8a3845a44e686980e06190f4c3ef56ce8057342" Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.329683 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef5e8cc65612d115095abb7ff8a3845a44e686980e06190f4c3ef56ce8057342"} err="failed to get container status \"ef5e8cc65612d115095abb7ff8a3845a44e686980e06190f4c3ef56ce8057342\": rpc error: code = NotFound desc = could not find container \"ef5e8cc65612d115095abb7ff8a3845a44e686980e06190f4c3ef56ce8057342\": container with ID starting with ef5e8cc65612d115095abb7ff8a3845a44e686980e06190f4c3ef56ce8057342 not found: ID does not exist" Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.329708 4779 scope.go:117] "RemoveContainer" containerID="1db5b7bc612ffc0db9edb77d401fadc92ac1354b3557f7fa53aff211f1a27165" Mar 20 15:39:44 crc kubenswrapper[4779]: E0320 15:39:44.330072 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1db5b7bc612ffc0db9edb77d401fadc92ac1354b3557f7fa53aff211f1a27165\": container with ID starting with 1db5b7bc612ffc0db9edb77d401fadc92ac1354b3557f7fa53aff211f1a27165 not found: ID does not exist" containerID="1db5b7bc612ffc0db9edb77d401fadc92ac1354b3557f7fa53aff211f1a27165" Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.330097 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1db5b7bc612ffc0db9edb77d401fadc92ac1354b3557f7fa53aff211f1a27165"} err="failed to get container status \"1db5b7bc612ffc0db9edb77d401fadc92ac1354b3557f7fa53aff211f1a27165\": rpc error: code = NotFound desc = could not find container 
\"1db5b7bc612ffc0db9edb77d401fadc92ac1354b3557f7fa53aff211f1a27165\": container with ID starting with 1db5b7bc612ffc0db9edb77d401fadc92ac1354b3557f7fa53aff211f1a27165 not found: ID does not exist" Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.330130 4779 scope.go:117] "RemoveContainer" containerID="c93b70d0a2bec850f2bc3303cd59910129454ba6e0f0b63299fb81c85acdd76e" Mar 20 15:39:44 crc kubenswrapper[4779]: E0320 15:39:44.330592 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c93b70d0a2bec850f2bc3303cd59910129454ba6e0f0b63299fb81c85acdd76e\": container with ID starting with c93b70d0a2bec850f2bc3303cd59910129454ba6e0f0b63299fb81c85acdd76e not found: ID does not exist" containerID="c93b70d0a2bec850f2bc3303cd59910129454ba6e0f0b63299fb81c85acdd76e" Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.330624 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c93b70d0a2bec850f2bc3303cd59910129454ba6e0f0b63299fb81c85acdd76e"} err="failed to get container status \"c93b70d0a2bec850f2bc3303cd59910129454ba6e0f0b63299fb81c85acdd76e\": rpc error: code = NotFound desc = could not find container \"c93b70d0a2bec850f2bc3303cd59910129454ba6e0f0b63299fb81c85acdd76e\": container with ID starting with c93b70d0a2bec850f2bc3303cd59910129454ba6e0f0b63299fb81c85acdd76e not found: ID does not exist" Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.337439 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5fhvp" Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.583252 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vhcg6"] Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.589103 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vhcg6"] Mar 20 15:39:44 crc kubenswrapper[4779]: I0320 15:39:44.617303 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5fhvp"] Mar 20 15:39:44 crc kubenswrapper[4779]: W0320 15:39:44.620883 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7e3dde3_71d2_48dc_a532_6b0290d31fa7.slice/crio-80e55fdbd2e3d75f9ca5fada8042eec5c8f02dbfb41b40c10af671618b561e83 WatchSource:0}: Error finding container 80e55fdbd2e3d75f9ca5fada8042eec5c8f02dbfb41b40c10af671618b561e83: Status 404 returned error can't find the container with id 80e55fdbd2e3d75f9ca5fada8042eec5c8f02dbfb41b40c10af671618b561e83 Mar 20 15:39:45 crc kubenswrapper[4779]: I0320 15:39:45.236690 4779 generic.go:334] "Generic (PLEG): container finished" podID="b7e3dde3-71d2-48dc-a532-6b0290d31fa7" containerID="5f870119c97c1b051e8b11894f532d5d31bc497945276cf8b2013ddf19cab82d" exitCode=0 Mar 20 15:39:45 crc kubenswrapper[4779]: I0320 15:39:45.236831 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5fhvp" event={"ID":"b7e3dde3-71d2-48dc-a532-6b0290d31fa7","Type":"ContainerDied","Data":"5f870119c97c1b051e8b11894f532d5d31bc497945276cf8b2013ddf19cab82d"} Mar 20 15:39:45 crc kubenswrapper[4779]: I0320 15:39:45.237911 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5fhvp" 
event={"ID":"b7e3dde3-71d2-48dc-a532-6b0290d31fa7","Type":"ContainerStarted","Data":"80e55fdbd2e3d75f9ca5fada8042eec5c8f02dbfb41b40c10af671618b561e83"} Mar 20 15:39:45 crc kubenswrapper[4779]: I0320 15:39:45.816189 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfb83af5-2bb1-42b5-b97e-4b4028217c05" path="/var/lib/kubelet/pods/bfb83af5-2bb1-42b5-b97e-4b4028217c05/volumes" Mar 20 15:39:46 crc kubenswrapper[4779]: I0320 15:39:46.246671 4779 generic.go:334] "Generic (PLEG): container finished" podID="b7e3dde3-71d2-48dc-a532-6b0290d31fa7" containerID="2bdedf26e9690612ca853ed37844c26f732813900e2668e652e296f2e0ca7d04" exitCode=0 Mar 20 15:39:46 crc kubenswrapper[4779]: I0320 15:39:46.246801 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5fhvp" event={"ID":"b7e3dde3-71d2-48dc-a532-6b0290d31fa7","Type":"ContainerDied","Data":"2bdedf26e9690612ca853ed37844c26f732813900e2668e652e296f2e0ca7d04"} Mar 20 15:39:47 crc kubenswrapper[4779]: I0320 15:39:47.255713 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5fhvp" event={"ID":"b7e3dde3-71d2-48dc-a532-6b0290d31fa7","Type":"ContainerStarted","Data":"a6c91b78b0ec98ecf00dc4a05c0647004cdfff6287bf2934d9eeaa040321e877"} Mar 20 15:39:47 crc kubenswrapper[4779]: I0320 15:39:47.272396 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5fhvp" podStartSLOduration=2.908386005 podStartE2EDuration="4.272380178s" podCreationTimestamp="2026-03-20 15:39:43 +0000 UTC" firstStartedPulling="2026-03-20 15:39:45.23829109 +0000 UTC m=+1002.200806890" lastFinishedPulling="2026-03-20 15:39:46.602285263 +0000 UTC m=+1003.564801063" observedRunningTime="2026-03-20 15:39:47.272350707 +0000 UTC m=+1004.234866507" watchObservedRunningTime="2026-03-20 15:39:47.272380178 +0000 UTC m=+1004.234895978" Mar 20 15:39:54 crc kubenswrapper[4779]: I0320 15:39:54.338172 
4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5fhvp" Mar 20 15:39:54 crc kubenswrapper[4779]: I0320 15:39:54.338731 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5fhvp" Mar 20 15:39:54 crc kubenswrapper[4779]: I0320 15:39:54.378227 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5fhvp" Mar 20 15:39:55 crc kubenswrapper[4779]: I0320 15:39:55.149987 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:39:55 crc kubenswrapper[4779]: I0320 15:39:55.150271 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:39:55 crc kubenswrapper[4779]: I0320 15:39:55.333026 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5fhvp" Mar 20 15:39:56 crc kubenswrapper[4779]: I0320 15:39:56.593214 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5fhvp"] Mar 20 15:39:57 crc kubenswrapper[4779]: I0320 15:39:57.306298 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5fhvp" podUID="b7e3dde3-71d2-48dc-a532-6b0290d31fa7" containerName="registry-server" containerID="cri-o://a6c91b78b0ec98ecf00dc4a05c0647004cdfff6287bf2934d9eeaa040321e877" gracePeriod=2 Mar 20 
15:39:58 crc kubenswrapper[4779]: I0320 15:39:58.160126 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5fhvp" Mar 20 15:39:58 crc kubenswrapper[4779]: I0320 15:39:58.272011 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcb47\" (UniqueName: \"kubernetes.io/projected/b7e3dde3-71d2-48dc-a532-6b0290d31fa7-kube-api-access-gcb47\") pod \"b7e3dde3-71d2-48dc-a532-6b0290d31fa7\" (UID: \"b7e3dde3-71d2-48dc-a532-6b0290d31fa7\") " Mar 20 15:39:58 crc kubenswrapper[4779]: I0320 15:39:58.272085 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7e3dde3-71d2-48dc-a532-6b0290d31fa7-catalog-content\") pod \"b7e3dde3-71d2-48dc-a532-6b0290d31fa7\" (UID: \"b7e3dde3-71d2-48dc-a532-6b0290d31fa7\") " Mar 20 15:39:58 crc kubenswrapper[4779]: I0320 15:39:58.272223 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7e3dde3-71d2-48dc-a532-6b0290d31fa7-utilities\") pod \"b7e3dde3-71d2-48dc-a532-6b0290d31fa7\" (UID: \"b7e3dde3-71d2-48dc-a532-6b0290d31fa7\") " Mar 20 15:39:58 crc kubenswrapper[4779]: I0320 15:39:58.273021 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7e3dde3-71d2-48dc-a532-6b0290d31fa7-utilities" (OuterVolumeSpecName: "utilities") pod "b7e3dde3-71d2-48dc-a532-6b0290d31fa7" (UID: "b7e3dde3-71d2-48dc-a532-6b0290d31fa7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:39:58 crc kubenswrapper[4779]: I0320 15:39:58.278286 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7e3dde3-71d2-48dc-a532-6b0290d31fa7-kube-api-access-gcb47" (OuterVolumeSpecName: "kube-api-access-gcb47") pod "b7e3dde3-71d2-48dc-a532-6b0290d31fa7" (UID: "b7e3dde3-71d2-48dc-a532-6b0290d31fa7"). InnerVolumeSpecName "kube-api-access-gcb47". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:39:58 crc kubenswrapper[4779]: I0320 15:39:58.313533 4779 generic.go:334] "Generic (PLEG): container finished" podID="b7e3dde3-71d2-48dc-a532-6b0290d31fa7" containerID="a6c91b78b0ec98ecf00dc4a05c0647004cdfff6287bf2934d9eeaa040321e877" exitCode=0 Mar 20 15:39:58 crc kubenswrapper[4779]: I0320 15:39:58.313571 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5fhvp" event={"ID":"b7e3dde3-71d2-48dc-a532-6b0290d31fa7","Type":"ContainerDied","Data":"a6c91b78b0ec98ecf00dc4a05c0647004cdfff6287bf2934d9eeaa040321e877"} Mar 20 15:39:58 crc kubenswrapper[4779]: I0320 15:39:58.313597 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5fhvp" event={"ID":"b7e3dde3-71d2-48dc-a532-6b0290d31fa7","Type":"ContainerDied","Data":"80e55fdbd2e3d75f9ca5fada8042eec5c8f02dbfb41b40c10af671618b561e83"} Mar 20 15:39:58 crc kubenswrapper[4779]: I0320 15:39:58.313612 4779 scope.go:117] "RemoveContainer" containerID="a6c91b78b0ec98ecf00dc4a05c0647004cdfff6287bf2934d9eeaa040321e877" Mar 20 15:39:58 crc kubenswrapper[4779]: I0320 15:39:58.313715 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5fhvp" Mar 20 15:39:58 crc kubenswrapper[4779]: I0320 15:39:58.320718 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7e3dde3-71d2-48dc-a532-6b0290d31fa7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b7e3dde3-71d2-48dc-a532-6b0290d31fa7" (UID: "b7e3dde3-71d2-48dc-a532-6b0290d31fa7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:39:58 crc kubenswrapper[4779]: I0320 15:39:58.333481 4779 scope.go:117] "RemoveContainer" containerID="2bdedf26e9690612ca853ed37844c26f732813900e2668e652e296f2e0ca7d04" Mar 20 15:39:58 crc kubenswrapper[4779]: I0320 15:39:58.349616 4779 scope.go:117] "RemoveContainer" containerID="5f870119c97c1b051e8b11894f532d5d31bc497945276cf8b2013ddf19cab82d" Mar 20 15:39:58 crc kubenswrapper[4779]: I0320 15:39:58.371871 4779 scope.go:117] "RemoveContainer" containerID="a6c91b78b0ec98ecf00dc4a05c0647004cdfff6287bf2934d9eeaa040321e877" Mar 20 15:39:58 crc kubenswrapper[4779]: E0320 15:39:58.372788 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6c91b78b0ec98ecf00dc4a05c0647004cdfff6287bf2934d9eeaa040321e877\": container with ID starting with a6c91b78b0ec98ecf00dc4a05c0647004cdfff6287bf2934d9eeaa040321e877 not found: ID does not exist" containerID="a6c91b78b0ec98ecf00dc4a05c0647004cdfff6287bf2934d9eeaa040321e877" Mar 20 15:39:58 crc kubenswrapper[4779]: I0320 15:39:58.372824 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6c91b78b0ec98ecf00dc4a05c0647004cdfff6287bf2934d9eeaa040321e877"} err="failed to get container status \"a6c91b78b0ec98ecf00dc4a05c0647004cdfff6287bf2934d9eeaa040321e877\": rpc error: code = NotFound desc = could not find container \"a6c91b78b0ec98ecf00dc4a05c0647004cdfff6287bf2934d9eeaa040321e877\": 
container with ID starting with a6c91b78b0ec98ecf00dc4a05c0647004cdfff6287bf2934d9eeaa040321e877 not found: ID does not exist" Mar 20 15:39:58 crc kubenswrapper[4779]: I0320 15:39:58.372844 4779 scope.go:117] "RemoveContainer" containerID="2bdedf26e9690612ca853ed37844c26f732813900e2668e652e296f2e0ca7d04" Mar 20 15:39:58 crc kubenswrapper[4779]: I0320 15:39:58.373316 4779 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7e3dde3-71d2-48dc-a532-6b0290d31fa7-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:39:58 crc kubenswrapper[4779]: I0320 15:39:58.373351 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcb47\" (UniqueName: \"kubernetes.io/projected/b7e3dde3-71d2-48dc-a532-6b0290d31fa7-kube-api-access-gcb47\") on node \"crc\" DevicePath \"\"" Mar 20 15:39:58 crc kubenswrapper[4779]: I0320 15:39:58.373362 4779 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7e3dde3-71d2-48dc-a532-6b0290d31fa7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 15:39:58 crc kubenswrapper[4779]: E0320 15:39:58.373649 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bdedf26e9690612ca853ed37844c26f732813900e2668e652e296f2e0ca7d04\": container with ID starting with 2bdedf26e9690612ca853ed37844c26f732813900e2668e652e296f2e0ca7d04 not found: ID does not exist" containerID="2bdedf26e9690612ca853ed37844c26f732813900e2668e652e296f2e0ca7d04" Mar 20 15:39:58 crc kubenswrapper[4779]: I0320 15:39:58.373695 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bdedf26e9690612ca853ed37844c26f732813900e2668e652e296f2e0ca7d04"} err="failed to get container status \"2bdedf26e9690612ca853ed37844c26f732813900e2668e652e296f2e0ca7d04\": rpc error: code = NotFound desc = could not find container 
\"2bdedf26e9690612ca853ed37844c26f732813900e2668e652e296f2e0ca7d04\": container with ID starting with 2bdedf26e9690612ca853ed37844c26f732813900e2668e652e296f2e0ca7d04 not found: ID does not exist" Mar 20 15:39:58 crc kubenswrapper[4779]: I0320 15:39:58.373725 4779 scope.go:117] "RemoveContainer" containerID="5f870119c97c1b051e8b11894f532d5d31bc497945276cf8b2013ddf19cab82d" Mar 20 15:39:58 crc kubenswrapper[4779]: E0320 15:39:58.374028 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f870119c97c1b051e8b11894f532d5d31bc497945276cf8b2013ddf19cab82d\": container with ID starting with 5f870119c97c1b051e8b11894f532d5d31bc497945276cf8b2013ddf19cab82d not found: ID does not exist" containerID="5f870119c97c1b051e8b11894f532d5d31bc497945276cf8b2013ddf19cab82d" Mar 20 15:39:58 crc kubenswrapper[4779]: I0320 15:39:58.374055 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f870119c97c1b051e8b11894f532d5d31bc497945276cf8b2013ddf19cab82d"} err="failed to get container status \"5f870119c97c1b051e8b11894f532d5d31bc497945276cf8b2013ddf19cab82d\": rpc error: code = NotFound desc = could not find container \"5f870119c97c1b051e8b11894f532d5d31bc497945276cf8b2013ddf19cab82d\": container with ID starting with 5f870119c97c1b051e8b11894f532d5d31bc497945276cf8b2013ddf19cab82d not found: ID does not exist" Mar 20 15:39:58 crc kubenswrapper[4779]: I0320 15:39:58.639077 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5fhvp"] Mar 20 15:39:58 crc kubenswrapper[4779]: I0320 15:39:58.644329 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5fhvp"] Mar 20 15:39:59 crc kubenswrapper[4779]: I0320 15:39:59.815486 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7e3dde3-71d2-48dc-a532-6b0290d31fa7" 
path="/var/lib/kubelet/pods/b7e3dde3-71d2-48dc-a532-6b0290d31fa7/volumes" Mar 20 15:40:00 crc kubenswrapper[4779]: I0320 15:40:00.129129 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567020-n5xz4"] Mar 20 15:40:00 crc kubenswrapper[4779]: E0320 15:40:00.129355 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb83af5-2bb1-42b5-b97e-4b4028217c05" containerName="extract-utilities" Mar 20 15:40:00 crc kubenswrapper[4779]: I0320 15:40:00.129369 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb83af5-2bb1-42b5-b97e-4b4028217c05" containerName="extract-utilities" Mar 20 15:40:00 crc kubenswrapper[4779]: E0320 15:40:00.129381 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7e3dde3-71d2-48dc-a532-6b0290d31fa7" containerName="registry-server" Mar 20 15:40:00 crc kubenswrapper[4779]: I0320 15:40:00.129387 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e3dde3-71d2-48dc-a532-6b0290d31fa7" containerName="registry-server" Mar 20 15:40:00 crc kubenswrapper[4779]: E0320 15:40:00.129402 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7e3dde3-71d2-48dc-a532-6b0290d31fa7" containerName="extract-content" Mar 20 15:40:00 crc kubenswrapper[4779]: I0320 15:40:00.129410 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e3dde3-71d2-48dc-a532-6b0290d31fa7" containerName="extract-content" Mar 20 15:40:00 crc kubenswrapper[4779]: E0320 15:40:00.129420 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7e3dde3-71d2-48dc-a532-6b0290d31fa7" containerName="extract-utilities" Mar 20 15:40:00 crc kubenswrapper[4779]: I0320 15:40:00.129428 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e3dde3-71d2-48dc-a532-6b0290d31fa7" containerName="extract-utilities" Mar 20 15:40:00 crc kubenswrapper[4779]: E0320 15:40:00.129438 4779 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bfb83af5-2bb1-42b5-b97e-4b4028217c05" containerName="registry-server" Mar 20 15:40:00 crc kubenswrapper[4779]: I0320 15:40:00.129444 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb83af5-2bb1-42b5-b97e-4b4028217c05" containerName="registry-server" Mar 20 15:40:00 crc kubenswrapper[4779]: E0320 15:40:00.129453 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb83af5-2bb1-42b5-b97e-4b4028217c05" containerName="extract-content" Mar 20 15:40:00 crc kubenswrapper[4779]: I0320 15:40:00.129459 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb83af5-2bb1-42b5-b97e-4b4028217c05" containerName="extract-content" Mar 20 15:40:00 crc kubenswrapper[4779]: I0320 15:40:00.129550 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7e3dde3-71d2-48dc-a532-6b0290d31fa7" containerName="registry-server" Mar 20 15:40:00 crc kubenswrapper[4779]: I0320 15:40:00.129566 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb83af5-2bb1-42b5-b97e-4b4028217c05" containerName="registry-server" Mar 20 15:40:00 crc kubenswrapper[4779]: I0320 15:40:00.129936 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567020-n5xz4" Mar 20 15:40:00 crc kubenswrapper[4779]: I0320 15:40:00.136540 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-92t9d" Mar 20 15:40:00 crc kubenswrapper[4779]: I0320 15:40:00.136717 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:40:00 crc kubenswrapper[4779]: I0320 15:40:00.136865 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:40:00 crc kubenswrapper[4779]: I0320 15:40:00.137743 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567020-n5xz4"] Mar 20 15:40:00 crc kubenswrapper[4779]: I0320 15:40:00.196285 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jg8d\" (UniqueName: \"kubernetes.io/projected/e34ee163-b675-44d0-bb6f-25c122afd2fc-kube-api-access-2jg8d\") pod \"auto-csr-approver-29567020-n5xz4\" (UID: \"e34ee163-b675-44d0-bb6f-25c122afd2fc\") " pod="openshift-infra/auto-csr-approver-29567020-n5xz4" Mar 20 15:40:00 crc kubenswrapper[4779]: I0320 15:40:00.297935 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jg8d\" (UniqueName: \"kubernetes.io/projected/e34ee163-b675-44d0-bb6f-25c122afd2fc-kube-api-access-2jg8d\") pod \"auto-csr-approver-29567020-n5xz4\" (UID: \"e34ee163-b675-44d0-bb6f-25c122afd2fc\") " pod="openshift-infra/auto-csr-approver-29567020-n5xz4" Mar 20 15:40:00 crc kubenswrapper[4779]: I0320 15:40:00.324162 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jg8d\" (UniqueName: \"kubernetes.io/projected/e34ee163-b675-44d0-bb6f-25c122afd2fc-kube-api-access-2jg8d\") pod \"auto-csr-approver-29567020-n5xz4\" (UID: \"e34ee163-b675-44d0-bb6f-25c122afd2fc\") " 
pod="openshift-infra/auto-csr-approver-29567020-n5xz4" Mar 20 15:40:00 crc kubenswrapper[4779]: I0320 15:40:00.447467 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567020-n5xz4" Mar 20 15:40:00 crc kubenswrapper[4779]: I0320 15:40:00.673190 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567020-n5xz4"] Mar 20 15:40:01 crc kubenswrapper[4779]: I0320 15:40:01.335428 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567020-n5xz4" event={"ID":"e34ee163-b675-44d0-bb6f-25c122afd2fc","Type":"ContainerStarted","Data":"68ee578411a90322e7429eeef82f3ee292f40d275e0a536db218ee200194cb84"} Mar 20 15:40:02 crc kubenswrapper[4779]: I0320 15:40:02.342970 4779 generic.go:334] "Generic (PLEG): container finished" podID="e34ee163-b675-44d0-bb6f-25c122afd2fc" containerID="429949b1be87c41cc93dc959fbb63b342cddf89af82c9cb86a7a4dd0323c8cc6" exitCode=0 Mar 20 15:40:02 crc kubenswrapper[4779]: I0320 15:40:02.343262 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567020-n5xz4" event={"ID":"e34ee163-b675-44d0-bb6f-25c122afd2fc","Type":"ContainerDied","Data":"429949b1be87c41cc93dc959fbb63b342cddf89af82c9cb86a7a4dd0323c8cc6"} Mar 20 15:40:03 crc kubenswrapper[4779]: I0320 15:40:03.544703 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-bdf54d65-pvrsc" Mar 20 15:40:03 crc kubenswrapper[4779]: I0320 15:40:03.660177 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567020-n5xz4" Mar 20 15:40:03 crc kubenswrapper[4779]: I0320 15:40:03.843285 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jg8d\" (UniqueName: \"kubernetes.io/projected/e34ee163-b675-44d0-bb6f-25c122afd2fc-kube-api-access-2jg8d\") pod \"e34ee163-b675-44d0-bb6f-25c122afd2fc\" (UID: \"e34ee163-b675-44d0-bb6f-25c122afd2fc\") " Mar 20 15:40:03 crc kubenswrapper[4779]: I0320 15:40:03.847835 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e34ee163-b675-44d0-bb6f-25c122afd2fc-kube-api-access-2jg8d" (OuterVolumeSpecName: "kube-api-access-2jg8d") pod "e34ee163-b675-44d0-bb6f-25c122afd2fc" (UID: "e34ee163-b675-44d0-bb6f-25c122afd2fc"). InnerVolumeSpecName "kube-api-access-2jg8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:40:03 crc kubenswrapper[4779]: I0320 15:40:03.944908 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jg8d\" (UniqueName: \"kubernetes.io/projected/e34ee163-b675-44d0-bb6f-25c122afd2fc-kube-api-access-2jg8d\") on node \"crc\" DevicePath \"\"" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.168138 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-cgcmm"] Mar 20 15:40:04 crc kubenswrapper[4779]: E0320 15:40:04.168628 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e34ee163-b675-44d0-bb6f-25c122afd2fc" containerName="oc" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.168641 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="e34ee163-b675-44d0-bb6f-25c122afd2fc" containerName="oc" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.168753 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="e34ee163-b675-44d0-bb6f-25c122afd2fc" containerName="oc" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.171610 4779 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-cgcmm" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.174994 4779 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.175217 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.175337 4779 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-b9xst" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.223202 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-7sp5x"] Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.224571 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7sp5x" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.229617 4779 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.246381 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-7sp5x"] Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.249320 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/18e6cd6e-0782-4b72-a968-bc2e1e6d027b-frr-sockets\") pod \"frr-k8s-cgcmm\" (UID: \"18e6cd6e-0782-4b72-a968-bc2e1e6d027b\") " pod="metallb-system/frr-k8s-cgcmm" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.249370 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/18e6cd6e-0782-4b72-a968-bc2e1e6d027b-frr-startup\") pod \"frr-k8s-cgcmm\" 
(UID: \"18e6cd6e-0782-4b72-a968-bc2e1e6d027b\") " pod="metallb-system/frr-k8s-cgcmm" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.249406 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78b75cd5-cd83-4a96-9762-95e11ae40119-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-7sp5x\" (UID: \"78b75cd5-cd83-4a96-9762-95e11ae40119\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7sp5x" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.249441 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/18e6cd6e-0782-4b72-a968-bc2e1e6d027b-metrics\") pod \"frr-k8s-cgcmm\" (UID: \"18e6cd6e-0782-4b72-a968-bc2e1e6d027b\") " pod="metallb-system/frr-k8s-cgcmm" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.249508 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18e6cd6e-0782-4b72-a968-bc2e1e6d027b-metrics-certs\") pod \"frr-k8s-cgcmm\" (UID: \"18e6cd6e-0782-4b72-a968-bc2e1e6d027b\") " pod="metallb-system/frr-k8s-cgcmm" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.249534 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2vv2\" (UniqueName: \"kubernetes.io/projected/78b75cd5-cd83-4a96-9762-95e11ae40119-kube-api-access-g2vv2\") pod \"frr-k8s-webhook-server-bcc4b6f68-7sp5x\" (UID: \"78b75cd5-cd83-4a96-9762-95e11ae40119\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7sp5x" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.249562 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz64z\" (UniqueName: \"kubernetes.io/projected/18e6cd6e-0782-4b72-a968-bc2e1e6d027b-kube-api-access-tz64z\") pod 
\"frr-k8s-cgcmm\" (UID: \"18e6cd6e-0782-4b72-a968-bc2e1e6d027b\") " pod="metallb-system/frr-k8s-cgcmm" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.249621 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/18e6cd6e-0782-4b72-a968-bc2e1e6d027b-reloader\") pod \"frr-k8s-cgcmm\" (UID: \"18e6cd6e-0782-4b72-a968-bc2e1e6d027b\") " pod="metallb-system/frr-k8s-cgcmm" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.249654 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/18e6cd6e-0782-4b72-a968-bc2e1e6d027b-frr-conf\") pod \"frr-k8s-cgcmm\" (UID: \"18e6cd6e-0782-4b72-a968-bc2e1e6d027b\") " pod="metallb-system/frr-k8s-cgcmm" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.268320 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-8h6gl"] Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.269582 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-8h6gl" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.275939 4779 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.276259 4779 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-k4sks" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.278187 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.279282 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-d7l7m"] Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.280468 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-d7l7m" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.280993 4779 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.281823 4779 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.295503 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-d7l7m"] Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.350324 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfppc\" (UniqueName: \"kubernetes.io/projected/6a069bff-512f-4d9a-931b-f558cea5f3b7-kube-api-access-xfppc\") pod \"controller-7bb4cc7c98-d7l7m\" (UID: \"6a069bff-512f-4d9a-931b-f558cea5f3b7\") " pod="metallb-system/controller-7bb4cc7c98-d7l7m" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.350391 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6a069bff-512f-4d9a-931b-f558cea5f3b7-metrics-certs\") pod \"controller-7bb4cc7c98-d7l7m\" (UID: \"6a069bff-512f-4d9a-931b-f558cea5f3b7\") " pod="metallb-system/controller-7bb4cc7c98-d7l7m" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.350419 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/38d17ea2-3f89-40c5-bf9d-e950220427b5-metallb-excludel2\") pod \"speaker-8h6gl\" (UID: \"38d17ea2-3f89-40c5-bf9d-e950220427b5\") " pod="metallb-system/speaker-8h6gl" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.350452 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/18e6cd6e-0782-4b72-a968-bc2e1e6d027b-metrics-certs\") pod \"frr-k8s-cgcmm\" (UID: \"18e6cd6e-0782-4b72-a968-bc2e1e6d027b\") " pod="metallb-system/frr-k8s-cgcmm" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.350480 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2vv2\" (UniqueName: \"kubernetes.io/projected/78b75cd5-cd83-4a96-9762-95e11ae40119-kube-api-access-g2vv2\") pod \"frr-k8s-webhook-server-bcc4b6f68-7sp5x\" (UID: \"78b75cd5-cd83-4a96-9762-95e11ae40119\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7sp5x" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.350520 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz64z\" (UniqueName: \"kubernetes.io/projected/18e6cd6e-0782-4b72-a968-bc2e1e6d027b-kube-api-access-tz64z\") pod \"frr-k8s-cgcmm\" (UID: \"18e6cd6e-0782-4b72-a968-bc2e1e6d027b\") " pod="metallb-system/frr-k8s-cgcmm" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.350543 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/18e6cd6e-0782-4b72-a968-bc2e1e6d027b-reloader\") pod \"frr-k8s-cgcmm\" (UID: \"18e6cd6e-0782-4b72-a968-bc2e1e6d027b\") " pod="metallb-system/frr-k8s-cgcmm" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.350576 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/18e6cd6e-0782-4b72-a968-bc2e1e6d027b-frr-conf\") pod \"frr-k8s-cgcmm\" (UID: \"18e6cd6e-0782-4b72-a968-bc2e1e6d027b\") " pod="metallb-system/frr-k8s-cgcmm" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.350596 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/38d17ea2-3f89-40c5-bf9d-e950220427b5-memberlist\") pod \"speaker-8h6gl\" (UID: 
\"38d17ea2-3f89-40c5-bf9d-e950220427b5\") " pod="metallb-system/speaker-8h6gl" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.350624 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/18e6cd6e-0782-4b72-a968-bc2e1e6d027b-frr-sockets\") pod \"frr-k8s-cgcmm\" (UID: \"18e6cd6e-0782-4b72-a968-bc2e1e6d027b\") " pod="metallb-system/frr-k8s-cgcmm" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.350649 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/18e6cd6e-0782-4b72-a968-bc2e1e6d027b-frr-startup\") pod \"frr-k8s-cgcmm\" (UID: \"18e6cd6e-0782-4b72-a968-bc2e1e6d027b\") " pod="metallb-system/frr-k8s-cgcmm" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.350677 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38d17ea2-3f89-40c5-bf9d-e950220427b5-metrics-certs\") pod \"speaker-8h6gl\" (UID: \"38d17ea2-3f89-40c5-bf9d-e950220427b5\") " pod="metallb-system/speaker-8h6gl" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.350697 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78b75cd5-cd83-4a96-9762-95e11ae40119-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-7sp5x\" (UID: \"78b75cd5-cd83-4a96-9762-95e11ae40119\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7sp5x" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.350722 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6a069bff-512f-4d9a-931b-f558cea5f3b7-cert\") pod \"controller-7bb4cc7c98-d7l7m\" (UID: \"6a069bff-512f-4d9a-931b-f558cea5f3b7\") " pod="metallb-system/controller-7bb4cc7c98-d7l7m" Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 
15:40:04.350750 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/18e6cd6e-0782-4b72-a968-bc2e1e6d027b-metrics\") pod \"frr-k8s-cgcmm\" (UID: \"18e6cd6e-0782-4b72-a968-bc2e1e6d027b\") " pod="metallb-system/frr-k8s-cgcmm"
Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.350774 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpv8b\" (UniqueName: \"kubernetes.io/projected/38d17ea2-3f89-40c5-bf9d-e950220427b5-kube-api-access-dpv8b\") pod \"speaker-8h6gl\" (UID: \"38d17ea2-3f89-40c5-bf9d-e950220427b5\") " pod="metallb-system/speaker-8h6gl"
Mar 20 15:40:04 crc kubenswrapper[4779]: E0320 15:40:04.350935 4779 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found
Mar 20 15:40:04 crc kubenswrapper[4779]: E0320 15:40:04.350985 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18e6cd6e-0782-4b72-a968-bc2e1e6d027b-metrics-certs podName:18e6cd6e-0782-4b72-a968-bc2e1e6d027b nodeName:}" failed. No retries permitted until 2026-03-20 15:40:04.850965684 +0000 UTC m=+1021.813481484 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/18e6cd6e-0782-4b72-a968-bc2e1e6d027b-metrics-certs") pod "frr-k8s-cgcmm" (UID: "18e6cd6e-0782-4b72-a968-bc2e1e6d027b") : secret "frr-k8s-certs-secret" not found
Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.351772 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/18e6cd6e-0782-4b72-a968-bc2e1e6d027b-reloader\") pod \"frr-k8s-cgcmm\" (UID: \"18e6cd6e-0782-4b72-a968-bc2e1e6d027b\") " pod="metallb-system/frr-k8s-cgcmm"
Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.351976 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/18e6cd6e-0782-4b72-a968-bc2e1e6d027b-frr-conf\") pod \"frr-k8s-cgcmm\" (UID: \"18e6cd6e-0782-4b72-a968-bc2e1e6d027b\") " pod="metallb-system/frr-k8s-cgcmm"
Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.352208 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/18e6cd6e-0782-4b72-a968-bc2e1e6d027b-frr-sockets\") pod \"frr-k8s-cgcmm\" (UID: \"18e6cd6e-0782-4b72-a968-bc2e1e6d027b\") " pod="metallb-system/frr-k8s-cgcmm"
Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.352961 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/18e6cd6e-0782-4b72-a968-bc2e1e6d027b-frr-startup\") pod \"frr-k8s-cgcmm\" (UID: \"18e6cd6e-0782-4b72-a968-bc2e1e6d027b\") " pod="metallb-system/frr-k8s-cgcmm"
Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.354158 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/18e6cd6e-0782-4b72-a968-bc2e1e6d027b-metrics\") pod \"frr-k8s-cgcmm\" (UID: \"18e6cd6e-0782-4b72-a968-bc2e1e6d027b\") " pod="metallb-system/frr-k8s-cgcmm"
Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.375137 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78b75cd5-cd83-4a96-9762-95e11ae40119-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-7sp5x\" (UID: \"78b75cd5-cd83-4a96-9762-95e11ae40119\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7sp5x"
Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.381933 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz64z\" (UniqueName: \"kubernetes.io/projected/18e6cd6e-0782-4b72-a968-bc2e1e6d027b-kube-api-access-tz64z\") pod \"frr-k8s-cgcmm\" (UID: \"18e6cd6e-0782-4b72-a968-bc2e1e6d027b\") " pod="metallb-system/frr-k8s-cgcmm"
Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.384124 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2vv2\" (UniqueName: \"kubernetes.io/projected/78b75cd5-cd83-4a96-9762-95e11ae40119-kube-api-access-g2vv2\") pod \"frr-k8s-webhook-server-bcc4b6f68-7sp5x\" (UID: \"78b75cd5-cd83-4a96-9762-95e11ae40119\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7sp5x"
Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.394294 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567020-n5xz4" event={"ID":"e34ee163-b675-44d0-bb6f-25c122afd2fc","Type":"ContainerDied","Data":"68ee578411a90322e7429eeef82f3ee292f40d275e0a536db218ee200194cb84"}
Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.394334 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68ee578411a90322e7429eeef82f3ee292f40d275e0a536db218ee200194cb84"
Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.394444 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567020-n5xz4"
Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.451146 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/38d17ea2-3f89-40c5-bf9d-e950220427b5-metallb-excludel2\") pod \"speaker-8h6gl\" (UID: \"38d17ea2-3f89-40c5-bf9d-e950220427b5\") " pod="metallb-system/speaker-8h6gl"
Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.451250 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/38d17ea2-3f89-40c5-bf9d-e950220427b5-memberlist\") pod \"speaker-8h6gl\" (UID: \"38d17ea2-3f89-40c5-bf9d-e950220427b5\") " pod="metallb-system/speaker-8h6gl"
Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.451298 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38d17ea2-3f89-40c5-bf9d-e950220427b5-metrics-certs\") pod \"speaker-8h6gl\" (UID: \"38d17ea2-3f89-40c5-bf9d-e950220427b5\") " pod="metallb-system/speaker-8h6gl"
Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.451328 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6a069bff-512f-4d9a-931b-f558cea5f3b7-cert\") pod \"controller-7bb4cc7c98-d7l7m\" (UID: \"6a069bff-512f-4d9a-931b-f558cea5f3b7\") " pod="metallb-system/controller-7bb4cc7c98-d7l7m"
Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.451364 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpv8b\" (UniqueName: \"kubernetes.io/projected/38d17ea2-3f89-40c5-bf9d-e950220427b5-kube-api-access-dpv8b\") pod \"speaker-8h6gl\" (UID: \"38d17ea2-3f89-40c5-bf9d-e950220427b5\") " pod="metallb-system/speaker-8h6gl"
Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.451394 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfppc\" (UniqueName: \"kubernetes.io/projected/6a069bff-512f-4d9a-931b-f558cea5f3b7-kube-api-access-xfppc\") pod \"controller-7bb4cc7c98-d7l7m\" (UID: \"6a069bff-512f-4d9a-931b-f558cea5f3b7\") " pod="metallb-system/controller-7bb4cc7c98-d7l7m"
Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.451439 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6a069bff-512f-4d9a-931b-f558cea5f3b7-metrics-certs\") pod \"controller-7bb4cc7c98-d7l7m\" (UID: \"6a069bff-512f-4d9a-931b-f558cea5f3b7\") " pod="metallb-system/controller-7bb4cc7c98-d7l7m"
Mar 20 15:40:04 crc kubenswrapper[4779]: E0320 15:40:04.451562 4779 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found
Mar 20 15:40:04 crc kubenswrapper[4779]: E0320 15:40:04.451619 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a069bff-512f-4d9a-931b-f558cea5f3b7-metrics-certs podName:6a069bff-512f-4d9a-931b-f558cea5f3b7 nodeName:}" failed. No retries permitted until 2026-03-20 15:40:04.951601103 +0000 UTC m=+1021.914116903 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6a069bff-512f-4d9a-931b-f558cea5f3b7-metrics-certs") pod "controller-7bb4cc7c98-d7l7m" (UID: "6a069bff-512f-4d9a-931b-f558cea5f3b7") : secret "controller-certs-secret" not found
Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.451876 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/38d17ea2-3f89-40c5-bf9d-e950220427b5-metallb-excludel2\") pod \"speaker-8h6gl\" (UID: \"38d17ea2-3f89-40c5-bf9d-e950220427b5\") " pod="metallb-system/speaker-8h6gl"
Mar 20 15:40:04 crc kubenswrapper[4779]: E0320 15:40:04.451908 4779 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Mar 20 15:40:04 crc kubenswrapper[4779]: E0320 15:40:04.451943 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38d17ea2-3f89-40c5-bf9d-e950220427b5-memberlist podName:38d17ea2-3f89-40c5-bf9d-e950220427b5 nodeName:}" failed. No retries permitted until 2026-03-20 15:40:04.951930862 +0000 UTC m=+1021.914446752 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/38d17ea2-3f89-40c5-bf9d-e950220427b5-memberlist") pod "speaker-8h6gl" (UID: "38d17ea2-3f89-40c5-bf9d-e950220427b5") : secret "metallb-memberlist" not found
Mar 20 15:40:04 crc kubenswrapper[4779]: E0320 15:40:04.452101 4779 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found
Mar 20 15:40:04 crc kubenswrapper[4779]: E0320 15:40:04.452173 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38d17ea2-3f89-40c5-bf9d-e950220427b5-metrics-certs podName:38d17ea2-3f89-40c5-bf9d-e950220427b5 nodeName:}" failed. No retries permitted until 2026-03-20 15:40:04.952156988 +0000 UTC m=+1021.914672788 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38d17ea2-3f89-40c5-bf9d-e950220427b5-metrics-certs") pod "speaker-8h6gl" (UID: "38d17ea2-3f89-40c5-bf9d-e950220427b5") : secret "speaker-certs-secret" not found
Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.453918 4779 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.465958 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6a069bff-512f-4d9a-931b-f558cea5f3b7-cert\") pod \"controller-7bb4cc7c98-d7l7m\" (UID: \"6a069bff-512f-4d9a-931b-f558cea5f3b7\") " pod="metallb-system/controller-7bb4cc7c98-d7l7m"
Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.471091 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfppc\" (UniqueName: \"kubernetes.io/projected/6a069bff-512f-4d9a-931b-f558cea5f3b7-kube-api-access-xfppc\") pod \"controller-7bb4cc7c98-d7l7m\" (UID: \"6a069bff-512f-4d9a-931b-f558cea5f3b7\") " pod="metallb-system/controller-7bb4cc7c98-d7l7m"
Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.471415 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpv8b\" (UniqueName: \"kubernetes.io/projected/38d17ea2-3f89-40c5-bf9d-e950220427b5-kube-api-access-dpv8b\") pod \"speaker-8h6gl\" (UID: \"38d17ea2-3f89-40c5-bf9d-e950220427b5\") " pod="metallb-system/speaker-8h6gl"
Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.573876 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7sp5x"
Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.721915 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567014-8w9sr"]
Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.728361 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567014-8w9sr"]
Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.856186 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18e6cd6e-0782-4b72-a968-bc2e1e6d027b-metrics-certs\") pod \"frr-k8s-cgcmm\" (UID: \"18e6cd6e-0782-4b72-a968-bc2e1e6d027b\") " pod="metallb-system/frr-k8s-cgcmm"
Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.862027 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18e6cd6e-0782-4b72-a968-bc2e1e6d027b-metrics-certs\") pod \"frr-k8s-cgcmm\" (UID: \"18e6cd6e-0782-4b72-a968-bc2e1e6d027b\") " pod="metallb-system/frr-k8s-cgcmm"
Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.957801 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/38d17ea2-3f89-40c5-bf9d-e950220427b5-memberlist\") pod \"speaker-8h6gl\" (UID: \"38d17ea2-3f89-40c5-bf9d-e950220427b5\") " pod="metallb-system/speaker-8h6gl"
Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.957857 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38d17ea2-3f89-40c5-bf9d-e950220427b5-metrics-certs\") pod \"speaker-8h6gl\" (UID: \"38d17ea2-3f89-40c5-bf9d-e950220427b5\") " pod="metallb-system/speaker-8h6gl"
Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.957910 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6a069bff-512f-4d9a-931b-f558cea5f3b7-metrics-certs\") pod \"controller-7bb4cc7c98-d7l7m\" (UID: \"6a069bff-512f-4d9a-931b-f558cea5f3b7\") " pod="metallb-system/controller-7bb4cc7c98-d7l7m"
Mar 20 15:40:04 crc kubenswrapper[4779]: E0320 15:40:04.957980 4779 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Mar 20 15:40:04 crc kubenswrapper[4779]: E0320 15:40:04.958054 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38d17ea2-3f89-40c5-bf9d-e950220427b5-memberlist podName:38d17ea2-3f89-40c5-bf9d-e950220427b5 nodeName:}" failed. No retries permitted until 2026-03-20 15:40:05.958036836 +0000 UTC m=+1022.920552636 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/38d17ea2-3f89-40c5-bf9d-e950220427b5-memberlist") pod "speaker-8h6gl" (UID: "38d17ea2-3f89-40c5-bf9d-e950220427b5") : secret "metallb-memberlist" not found
Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.965598 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38d17ea2-3f89-40c5-bf9d-e950220427b5-metrics-certs\") pod \"speaker-8h6gl\" (UID: \"38d17ea2-3f89-40c5-bf9d-e950220427b5\") " pod="metallb-system/speaker-8h6gl"
Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.965832 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6a069bff-512f-4d9a-931b-f558cea5f3b7-metrics-certs\") pod \"controller-7bb4cc7c98-d7l7m\" (UID: \"6a069bff-512f-4d9a-931b-f558cea5f3b7\") " pod="metallb-system/controller-7bb4cc7c98-d7l7m"
Mar 20 15:40:04 crc kubenswrapper[4779]: I0320 15:40:04.974349 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-7sp5x"]
Mar 20 15:40:04 crc kubenswrapper[4779]: W0320 15:40:04.979511 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78b75cd5_cd83_4a96_9762_95e11ae40119.slice/crio-ea6dc3cb5a574c3971acccefbc833d424087bcd1f16c949294433229f2fc2f17 WatchSource:0}: Error finding container ea6dc3cb5a574c3971acccefbc833d424087bcd1f16c949294433229f2fc2f17: Status 404 returned error can't find the container with id ea6dc3cb5a574c3971acccefbc833d424087bcd1f16c949294433229f2fc2f17
Mar 20 15:40:05 crc kubenswrapper[4779]: I0320 15:40:05.098425 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-cgcmm"
Mar 20 15:40:05 crc kubenswrapper[4779]: I0320 15:40:05.221409 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-d7l7m"
Mar 20 15:40:05 crc kubenswrapper[4779]: I0320 15:40:05.406030 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7sp5x" event={"ID":"78b75cd5-cd83-4a96-9762-95e11ae40119","Type":"ContainerStarted","Data":"ea6dc3cb5a574c3971acccefbc833d424087bcd1f16c949294433229f2fc2f17"}
Mar 20 15:40:05 crc kubenswrapper[4779]: I0320 15:40:05.407181 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cgcmm" event={"ID":"18e6cd6e-0782-4b72-a968-bc2e1e6d027b","Type":"ContainerStarted","Data":"2fd03dbe51f2f5dbba10dae1b0acf39be1c53a99b38be013ef3cc456271a43b5"}
Mar 20 15:40:05 crc kubenswrapper[4779]: I0320 15:40:05.604047 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-d7l7m"]
Mar 20 15:40:05 crc kubenswrapper[4779]: W0320 15:40:05.609405 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a069bff_512f_4d9a_931b_f558cea5f3b7.slice/crio-18cb1b22207ebef5c0952d55ed78b15bc4fe6cb42891e722f8ffadd70cfba201 WatchSource:0}: Error finding container 18cb1b22207ebef5c0952d55ed78b15bc4fe6cb42891e722f8ffadd70cfba201: Status 404 returned error can't find the container with id 18cb1b22207ebef5c0952d55ed78b15bc4fe6cb42891e722f8ffadd70cfba201
Mar 20 15:40:05 crc kubenswrapper[4779]: I0320 15:40:05.819388 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96fd8882-ac45-4e57-b6fb-c683861c992d" path="/var/lib/kubelet/pods/96fd8882-ac45-4e57-b6fb-c683861c992d/volumes"
Mar 20 15:40:05 crc kubenswrapper[4779]: I0320 15:40:05.974462 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/38d17ea2-3f89-40c5-bf9d-e950220427b5-memberlist\") pod \"speaker-8h6gl\" (UID: \"38d17ea2-3f89-40c5-bf9d-e950220427b5\") " pod="metallb-system/speaker-8h6gl"
Mar 20 15:40:05 crc kubenswrapper[4779]: I0320 15:40:05.979748 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/38d17ea2-3f89-40c5-bf9d-e950220427b5-memberlist\") pod \"speaker-8h6gl\" (UID: \"38d17ea2-3f89-40c5-bf9d-e950220427b5\") " pod="metallb-system/speaker-8h6gl"
Mar 20 15:40:06 crc kubenswrapper[4779]: I0320 15:40:06.103036 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-8h6gl"
Mar 20 15:40:06 crc kubenswrapper[4779]: W0320 15:40:06.122503 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38d17ea2_3f89_40c5_bf9d_e950220427b5.slice/crio-b3fa86459162c34bc449a13376de8c57e988e68ad889bffa6553ea0aa34a35ff WatchSource:0}: Error finding container b3fa86459162c34bc449a13376de8c57e988e68ad889bffa6553ea0aa34a35ff: Status 404 returned error can't find the container with id b3fa86459162c34bc449a13376de8c57e988e68ad889bffa6553ea0aa34a35ff
Mar 20 15:40:06 crc kubenswrapper[4779]: I0320 15:40:06.415055 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8h6gl" event={"ID":"38d17ea2-3f89-40c5-bf9d-e950220427b5","Type":"ContainerStarted","Data":"66842c3cdc8159cfd18d67301af7de602cde649f77a04227f12dcc305691b676"}
Mar 20 15:40:06 crc kubenswrapper[4779]: I0320 15:40:06.415428 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8h6gl" event={"ID":"38d17ea2-3f89-40c5-bf9d-e950220427b5","Type":"ContainerStarted","Data":"b3fa86459162c34bc449a13376de8c57e988e68ad889bffa6553ea0aa34a35ff"}
Mar 20 15:40:06 crc kubenswrapper[4779]: I0320 15:40:06.416866 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-d7l7m" event={"ID":"6a069bff-512f-4d9a-931b-f558cea5f3b7","Type":"ContainerStarted","Data":"2ac7b9b4beffef31d11cab8786f40e6886546d54b91fa2653b61f405ff0b863e"}
Mar 20 15:40:06 crc kubenswrapper[4779]: I0320 15:40:06.416892 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-d7l7m" event={"ID":"6a069bff-512f-4d9a-931b-f558cea5f3b7","Type":"ContainerStarted","Data":"e8c185df0af8e2a1cd0fea49d17140436f616105b95e0fde9deb7a4680b495ea"}
Mar 20 15:40:06 crc kubenswrapper[4779]: I0320 15:40:06.416901 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-d7l7m" event={"ID":"6a069bff-512f-4d9a-931b-f558cea5f3b7","Type":"ContainerStarted","Data":"18cb1b22207ebef5c0952d55ed78b15bc4fe6cb42891e722f8ffadd70cfba201"}
Mar 20 15:40:06 crc kubenswrapper[4779]: I0320 15:40:06.417692 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-d7l7m"
Mar 20 15:40:06 crc kubenswrapper[4779]: I0320 15:40:06.433854 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-d7l7m" podStartSLOduration=2.433838728 podStartE2EDuration="2.433838728s" podCreationTimestamp="2026-03-20 15:40:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:40:06.431917941 +0000 UTC m=+1023.394433751" watchObservedRunningTime="2026-03-20 15:40:06.433838728 +0000 UTC m=+1023.396354528"
Mar 20 15:40:07 crc kubenswrapper[4779]: I0320 15:40:07.440349 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8h6gl" event={"ID":"38d17ea2-3f89-40c5-bf9d-e950220427b5","Type":"ContainerStarted","Data":"b834ade351c284c882db202d71eb5219eba20c13f165b67a0a0c596d0540feff"}
Mar 20 15:40:07 crc kubenswrapper[4779]: I0320 15:40:07.440408 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-8h6gl"
Mar 20 15:40:07 crc kubenswrapper[4779]: I0320 15:40:07.462614 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-8h6gl" podStartSLOduration=3.46259685 podStartE2EDuration="3.46259685s" podCreationTimestamp="2026-03-20 15:40:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:40:07.460178559 +0000 UTC m=+1024.422694369" watchObservedRunningTime="2026-03-20 15:40:07.46259685 +0000 UTC m=+1024.425112640"
Mar 20 15:40:13 crc kubenswrapper[4779]: I0320 15:40:13.481579 4779 generic.go:334] "Generic (PLEG): container finished" podID="18e6cd6e-0782-4b72-a968-bc2e1e6d027b" containerID="fc99f013c4f6d7b09f53cb3a8e35634b41b92037672f697a039f4dbe2c2165dc" exitCode=0
Mar 20 15:40:13 crc kubenswrapper[4779]: I0320 15:40:13.481680 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cgcmm" event={"ID":"18e6cd6e-0782-4b72-a968-bc2e1e6d027b","Type":"ContainerDied","Data":"fc99f013c4f6d7b09f53cb3a8e35634b41b92037672f697a039f4dbe2c2165dc"}
Mar 20 15:40:13 crc kubenswrapper[4779]: I0320 15:40:13.483649 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7sp5x" event={"ID":"78b75cd5-cd83-4a96-9762-95e11ae40119","Type":"ContainerStarted","Data":"6ab6feede53c086acfc1faa1326ba09044b63ec53469d5ee15c7236730daa907"}
Mar 20 15:40:13 crc kubenswrapper[4779]: I0320 15:40:13.483791 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7sp5x"
Mar 20 15:40:13 crc kubenswrapper[4779]: I0320 15:40:13.523507 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7sp5x" podStartSLOduration=1.297238477 podStartE2EDuration="9.523486293s" podCreationTimestamp="2026-03-20 15:40:04 +0000 UTC" firstStartedPulling="2026-03-20 15:40:04.981887191 +0000 UTC m=+1021.944402991" lastFinishedPulling="2026-03-20 15:40:13.208135007 +0000 UTC m=+1030.170650807" observedRunningTime="2026-03-20 15:40:13.518328794 +0000 UTC m=+1030.480844594" watchObservedRunningTime="2026-03-20 15:40:13.523486293 +0000 UTC m=+1030.486002093"
Mar 20 15:40:14 crc kubenswrapper[4779]: I0320 15:40:14.501024 4779 generic.go:334] "Generic (PLEG): container finished" podID="18e6cd6e-0782-4b72-a968-bc2e1e6d027b" containerID="7016adca817ba7716b7feccca8ba37c3df0339609d8efdb9172cdc10c41d93aa" exitCode=0
Mar 20 15:40:14 crc kubenswrapper[4779]: I0320 15:40:14.501155 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cgcmm" event={"ID":"18e6cd6e-0782-4b72-a968-bc2e1e6d027b","Type":"ContainerDied","Data":"7016adca817ba7716b7feccca8ba37c3df0339609d8efdb9172cdc10c41d93aa"}
Mar 20 15:40:15 crc kubenswrapper[4779]: I0320 15:40:15.226228 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-d7l7m"
Mar 20 15:40:15 crc kubenswrapper[4779]: I0320 15:40:15.509054 4779 generic.go:334] "Generic (PLEG): container finished" podID="18e6cd6e-0782-4b72-a968-bc2e1e6d027b" containerID="abd6443016b4d6557689c252242d2464c2f9d4fe8ce2b28dbefd0e8ef32164c7" exitCode=0
Mar 20 15:40:15 crc kubenswrapper[4779]: I0320 15:40:15.509098 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cgcmm" event={"ID":"18e6cd6e-0782-4b72-a968-bc2e1e6d027b","Type":"ContainerDied","Data":"abd6443016b4d6557689c252242d2464c2f9d4fe8ce2b28dbefd0e8ef32164c7"}
Mar 20 15:40:16 crc kubenswrapper[4779]: I0320 15:40:16.107959 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-8h6gl"
Mar 20 15:40:16 crc kubenswrapper[4779]: I0320 15:40:16.521701 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cgcmm" event={"ID":"18e6cd6e-0782-4b72-a968-bc2e1e6d027b","Type":"ContainerStarted","Data":"051bcf1adfc47665e28db9e5ed6ec3c7c60101c82b250fe785fcffb6b026b2fd"}
Mar 20 15:40:16 crc kubenswrapper[4779]: I0320 15:40:16.521744 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cgcmm" event={"ID":"18e6cd6e-0782-4b72-a968-bc2e1e6d027b","Type":"ContainerStarted","Data":"dd2e15016769c84d3d4c086a777a1081255784e6176f40c943c91501803dff27"}
Mar 20 15:40:16 crc kubenswrapper[4779]: I0320 15:40:16.521754 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cgcmm" event={"ID":"18e6cd6e-0782-4b72-a968-bc2e1e6d027b","Type":"ContainerStarted","Data":"634ce2a4508e5461b9f3ebf5ee94f6d5c2c4fdeb196bcab9f723368ce4a06bc0"}
Mar 20 15:40:16 crc kubenswrapper[4779]: I0320 15:40:16.521762 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cgcmm" event={"ID":"18e6cd6e-0782-4b72-a968-bc2e1e6d027b","Type":"ContainerStarted","Data":"16dbe4063eec5303d89f6c2a050eac636e98801a943e152d883c2a5dc72ee215"}
Mar 20 15:40:16 crc kubenswrapper[4779]: I0320 15:40:16.521769 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cgcmm" event={"ID":"18e6cd6e-0782-4b72-a968-bc2e1e6d027b","Type":"ContainerStarted","Data":"edcedacbcb514f91aaca76a6ed66b5240de7adc92b4f2c68866a741d16e15fd3"}
Mar 20 15:40:16 crc kubenswrapper[4779]: I0320 15:40:16.521777 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cgcmm" event={"ID":"18e6cd6e-0782-4b72-a968-bc2e1e6d027b","Type":"ContainerStarted","Data":"3c8d75e835c37b2878cb79990c229f24362db329b5ae8490ebb7c73d7361346e"}
Mar 20 15:40:16 crc kubenswrapper[4779]: I0320 15:40:16.522643 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-cgcmm"
Mar 20 15:40:16 crc kubenswrapper[4779]: I0320 15:40:16.545291 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-cgcmm" podStartSLOduration=4.523268348 podStartE2EDuration="12.545269368s" podCreationTimestamp="2026-03-20 15:40:04 +0000 UTC" firstStartedPulling="2026-03-20 15:40:05.204891774 +0000 UTC m=+1022.167407574" lastFinishedPulling="2026-03-20 15:40:13.226892794 +0000 UTC m=+1030.189408594" observedRunningTime="2026-03-20 15:40:16.540836867 +0000 UTC m=+1033.503352677" watchObservedRunningTime="2026-03-20 15:40:16.545269368 +0000 UTC m=+1033.507785168"
Mar 20 15:40:19 crc kubenswrapper[4779]: I0320 15:40:19.135535 4779 scope.go:117] "RemoveContainer" containerID="0ffb20c433de882c373cf689e4f93f0b2efad7f41df5a0d180156d892c1db3be"
Mar 20 15:40:20 crc kubenswrapper[4779]: I0320 15:40:20.098939 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-cgcmm"
Mar 20 15:40:20 crc kubenswrapper[4779]: I0320 15:40:20.135205 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-cgcmm"
Mar 20 15:40:22 crc kubenswrapper[4779]: I0320 15:40:22.800066 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-2nnq6"]
Mar 20 15:40:22 crc kubenswrapper[4779]: I0320 15:40:22.800975 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-2nnq6"
Mar 20 15:40:22 crc kubenswrapper[4779]: I0320 15:40:22.803792 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-wlxdt"
Mar 20 15:40:22 crc kubenswrapper[4779]: I0320 15:40:22.803839 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Mar 20 15:40:22 crc kubenswrapper[4779]: I0320 15:40:22.813456 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Mar 20 15:40:22 crc kubenswrapper[4779]: I0320 15:40:22.827248 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-2nnq6"]
Mar 20 15:40:22 crc kubenswrapper[4779]: I0320 15:40:22.890718 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfjgp\" (UniqueName: \"kubernetes.io/projected/cf023fa6-f2a8-411b-820c-6b3822b8ce68-kube-api-access-lfjgp\") pod \"openstack-operator-index-2nnq6\" (UID: \"cf023fa6-f2a8-411b-820c-6b3822b8ce68\") " pod="openstack-operators/openstack-operator-index-2nnq6"
Mar 20 15:40:22 crc kubenswrapper[4779]: I0320 15:40:22.992211 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfjgp\" (UniqueName: \"kubernetes.io/projected/cf023fa6-f2a8-411b-820c-6b3822b8ce68-kube-api-access-lfjgp\") pod \"openstack-operator-index-2nnq6\" (UID: \"cf023fa6-f2a8-411b-820c-6b3822b8ce68\") " pod="openstack-operators/openstack-operator-index-2nnq6"
Mar 20 15:40:23 crc kubenswrapper[4779]: I0320 15:40:23.009342 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfjgp\" (UniqueName: \"kubernetes.io/projected/cf023fa6-f2a8-411b-820c-6b3822b8ce68-kube-api-access-lfjgp\") pod \"openstack-operator-index-2nnq6\" (UID: \"cf023fa6-f2a8-411b-820c-6b3822b8ce68\") " pod="openstack-operators/openstack-operator-index-2nnq6"
Mar 20 15:40:23 crc kubenswrapper[4779]: I0320 15:40:23.127806 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-2nnq6"
Mar 20 15:40:23 crc kubenswrapper[4779]: I0320 15:40:23.514343 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-2nnq6"]
Mar 20 15:40:23 crc kubenswrapper[4779]: I0320 15:40:23.595435 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2nnq6" event={"ID":"cf023fa6-f2a8-411b-820c-6b3822b8ce68","Type":"ContainerStarted","Data":"a3cbcbbc289b234deee2f611a8bc22133ea909b8039241e48682d17f02653a90"}
Mar 20 15:40:24 crc kubenswrapper[4779]: I0320 15:40:24.577483 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7sp5x"
Mar 20 15:40:25 crc kubenswrapper[4779]: I0320 15:40:25.104004 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-cgcmm"
Mar 20 15:40:25 crc kubenswrapper[4779]: I0320 15:40:25.150007 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 15:40:25 crc kubenswrapper[4779]: I0320 15:40:25.150076 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 15:40:26 crc kubenswrapper[4779]: I0320 15:40:26.630864 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2nnq6" event={"ID":"cf023fa6-f2a8-411b-820c-6b3822b8ce68","Type":"ContainerStarted","Data":"e551de0befa402a8dafbcd79a6f3060eca6f7878b28b1889dfe2de0845a267eb"}
Mar 20 15:40:26 crc kubenswrapper[4779]: I0320 15:40:26.648237 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-2nnq6" podStartSLOduration=2.55285677 podStartE2EDuration="4.648219393s" podCreationTimestamp="2026-03-20 15:40:22 +0000 UTC" firstStartedPulling="2026-03-20 15:40:23.521594907 +0000 UTC m=+1040.484110707" lastFinishedPulling="2026-03-20 15:40:25.61695753 +0000 UTC m=+1042.579473330" observedRunningTime="2026-03-20 15:40:26.646592611 +0000 UTC m=+1043.609108411" watchObservedRunningTime="2026-03-20 15:40:26.648219393 +0000 UTC m=+1043.610735193"
Mar 20 15:40:29 crc kubenswrapper[4779]: I0320 15:40:29.404933 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fdmsg"]
Mar 20 15:40:29 crc kubenswrapper[4779]: I0320 15:40:29.406745 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fdmsg"
Mar 20 15:40:29 crc kubenswrapper[4779]: I0320 15:40:29.416974 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fdmsg"]
Mar 20 15:40:29 crc kubenswrapper[4779]: I0320 15:40:29.589668 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71825291-b7e6-4155-8d83-b91fce80e461-utilities\") pod \"redhat-marketplace-fdmsg\" (UID: \"71825291-b7e6-4155-8d83-b91fce80e461\") " pod="openshift-marketplace/redhat-marketplace-fdmsg"
Mar 20 15:40:29 crc kubenswrapper[4779]: I0320 15:40:29.589743 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71825291-b7e6-4155-8d83-b91fce80e461-catalog-content\") pod \"redhat-marketplace-fdmsg\" (UID: \"71825291-b7e6-4155-8d83-b91fce80e461\") " pod="openshift-marketplace/redhat-marketplace-fdmsg"
Mar 20 15:40:29 crc kubenswrapper[4779]: I0320 15:40:29.589776 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjp69\" (UniqueName: \"kubernetes.io/projected/71825291-b7e6-4155-8d83-b91fce80e461-kube-api-access-jjp69\") pod \"redhat-marketplace-fdmsg\" (UID: \"71825291-b7e6-4155-8d83-b91fce80e461\") " pod="openshift-marketplace/redhat-marketplace-fdmsg"
Mar 20 15:40:29 crc kubenswrapper[4779]: I0320 15:40:29.691232 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71825291-b7e6-4155-8d83-b91fce80e461-utilities\") pod \"redhat-marketplace-fdmsg\" (UID: \"71825291-b7e6-4155-8d83-b91fce80e461\") " pod="openshift-marketplace/redhat-marketplace-fdmsg"
Mar 20 15:40:29 crc kubenswrapper[4779]: I0320 15:40:29.691369 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71825291-b7e6-4155-8d83-b91fce80e461-catalog-content\") pod \"redhat-marketplace-fdmsg\" (UID: \"71825291-b7e6-4155-8d83-b91fce80e461\") " pod="openshift-marketplace/redhat-marketplace-fdmsg"
Mar 20 15:40:29 crc kubenswrapper[4779]: I0320 15:40:29.691406 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjp69\" (UniqueName: \"kubernetes.io/projected/71825291-b7e6-4155-8d83-b91fce80e461-kube-api-access-jjp69\") pod \"redhat-marketplace-fdmsg\" (UID: \"71825291-b7e6-4155-8d83-b91fce80e461\") " pod="openshift-marketplace/redhat-marketplace-fdmsg"
Mar 20 15:40:29 crc kubenswrapper[4779]: I0320 15:40:29.691772 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71825291-b7e6-4155-8d83-b91fce80e461-utilities\") pod \"redhat-marketplace-fdmsg\" (UID: \"71825291-b7e6-4155-8d83-b91fce80e461\") " pod="openshift-marketplace/redhat-marketplace-fdmsg"
Mar 20 15:40:29 crc kubenswrapper[4779]: I0320 15:40:29.692225 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71825291-b7e6-4155-8d83-b91fce80e461-catalog-content\") pod \"redhat-marketplace-fdmsg\" (UID: \"71825291-b7e6-4155-8d83-b91fce80e461\") " pod="openshift-marketplace/redhat-marketplace-fdmsg"
Mar 20 15:40:29 crc kubenswrapper[4779]: I0320 15:40:29.710868 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjp69\" (UniqueName: \"kubernetes.io/projected/71825291-b7e6-4155-8d83-b91fce80e461-kube-api-access-jjp69\") pod \"redhat-marketplace-fdmsg\" (UID: \"71825291-b7e6-4155-8d83-b91fce80e461\") " pod="openshift-marketplace/redhat-marketplace-fdmsg"
Mar 20 15:40:29 crc kubenswrapper[4779]: I0320 15:40:29.724331 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fdmsg"
Mar 20 15:40:30 crc kubenswrapper[4779]: I0320 15:40:30.131380 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fdmsg"]
Mar 20 15:40:30 crc kubenswrapper[4779]: W0320 15:40:30.136575 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71825291_b7e6_4155_8d83_b91fce80e461.slice/crio-a7f63a854f56deaded60c68e57e19ad29c8dd985a3d2cfaf11f459b1e4d7c40e WatchSource:0}: Error finding container a7f63a854f56deaded60c68e57e19ad29c8dd985a3d2cfaf11f459b1e4d7c40e: Status 404 returned error can't find the container with id a7f63a854f56deaded60c68e57e19ad29c8dd985a3d2cfaf11f459b1e4d7c40e
Mar 20 15:40:30 crc kubenswrapper[4779]: I0320 15:40:30.654419 4779 generic.go:334] "Generic (PLEG): container finished" podID="71825291-b7e6-4155-8d83-b91fce80e461" containerID="fe08c0f209bcfe613314188c269aaa1d06149ee6b503a8c5f086a319e17ccfb7" exitCode=0
Mar 20 15:40:30 crc kubenswrapper[4779]: I0320 15:40:30.654477 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fdmsg" event={"ID":"71825291-b7e6-4155-8d83-b91fce80e461","Type":"ContainerDied","Data":"fe08c0f209bcfe613314188c269aaa1d06149ee6b503a8c5f086a319e17ccfb7"}
Mar 20 15:40:30 crc kubenswrapper[4779]: I0320 15:40:30.654550 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fdmsg" event={"ID":"71825291-b7e6-4155-8d83-b91fce80e461","Type":"ContainerStarted","Data":"a7f63a854f56deaded60c68e57e19ad29c8dd985a3d2cfaf11f459b1e4d7c40e"}
Mar 20 15:40:31 crc kubenswrapper[4779]: I0320 15:40:31.662172 4779 generic.go:334] "Generic (PLEG): container finished" podID="71825291-b7e6-4155-8d83-b91fce80e461" containerID="e832b048416e75e3bb345db4757c0273fd73146fe4d12d4f506f86d274150429" exitCode=0
Mar 20 15:40:31 crc kubenswrapper[4779]: I0320
15:40:31.662226 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fdmsg" event={"ID":"71825291-b7e6-4155-8d83-b91fce80e461","Type":"ContainerDied","Data":"e832b048416e75e3bb345db4757c0273fd73146fe4d12d4f506f86d274150429"} Mar 20 15:40:32 crc kubenswrapper[4779]: I0320 15:40:32.669746 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fdmsg" event={"ID":"71825291-b7e6-4155-8d83-b91fce80e461","Type":"ContainerStarted","Data":"bfd6105aa0071d8672405efda1da7138ef6f5b405155d4f1dead28114e64ab74"} Mar 20 15:40:32 crc kubenswrapper[4779]: I0320 15:40:32.690009 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fdmsg" podStartSLOduration=2.305520506 podStartE2EDuration="3.689989465s" podCreationTimestamp="2026-03-20 15:40:29 +0000 UTC" firstStartedPulling="2026-03-20 15:40:30.657874859 +0000 UTC m=+1047.620390649" lastFinishedPulling="2026-03-20 15:40:32.042343808 +0000 UTC m=+1049.004859608" observedRunningTime="2026-03-20 15:40:32.685086792 +0000 UTC m=+1049.647602592" watchObservedRunningTime="2026-03-20 15:40:32.689989465 +0000 UTC m=+1049.652505265" Mar 20 15:40:33 crc kubenswrapper[4779]: I0320 15:40:33.128832 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-2nnq6" Mar 20 15:40:33 crc kubenswrapper[4779]: I0320 15:40:33.128879 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-2nnq6" Mar 20 15:40:33 crc kubenswrapper[4779]: I0320 15:40:33.163922 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-2nnq6" Mar 20 15:40:33 crc kubenswrapper[4779]: I0320 15:40:33.699423 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-operator-index-2nnq6" Mar 20 15:40:36 crc kubenswrapper[4779]: I0320 15:40:36.636218 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/7a5d6df43e82bcd9f777035cdd0143ddec736ae8e19d633ac86bc7d1d1wfq6w"] Mar 20 15:40:36 crc kubenswrapper[4779]: I0320 15:40:36.637916 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7a5d6df43e82bcd9f777035cdd0143ddec736ae8e19d633ac86bc7d1d1wfq6w" Mar 20 15:40:36 crc kubenswrapper[4779]: I0320 15:40:36.639587 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-9xqbt" Mar 20 15:40:36 crc kubenswrapper[4779]: I0320 15:40:36.652500 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7a5d6df43e82bcd9f777035cdd0143ddec736ae8e19d633ac86bc7d1d1wfq6w"] Mar 20 15:40:36 crc kubenswrapper[4779]: I0320 15:40:36.791386 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d0f47857-a2c3-4fe2-9a59-2b178811fda5-util\") pod \"7a5d6df43e82bcd9f777035cdd0143ddec736ae8e19d633ac86bc7d1d1wfq6w\" (UID: \"d0f47857-a2c3-4fe2-9a59-2b178811fda5\") " pod="openstack-operators/7a5d6df43e82bcd9f777035cdd0143ddec736ae8e19d633ac86bc7d1d1wfq6w" Mar 20 15:40:36 crc kubenswrapper[4779]: I0320 15:40:36.791455 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88mkp\" (UniqueName: \"kubernetes.io/projected/d0f47857-a2c3-4fe2-9a59-2b178811fda5-kube-api-access-88mkp\") pod \"7a5d6df43e82bcd9f777035cdd0143ddec736ae8e19d633ac86bc7d1d1wfq6w\" (UID: \"d0f47857-a2c3-4fe2-9a59-2b178811fda5\") " pod="openstack-operators/7a5d6df43e82bcd9f777035cdd0143ddec736ae8e19d633ac86bc7d1d1wfq6w" Mar 20 15:40:36 crc kubenswrapper[4779]: I0320 15:40:36.791662 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d0f47857-a2c3-4fe2-9a59-2b178811fda5-bundle\") pod \"7a5d6df43e82bcd9f777035cdd0143ddec736ae8e19d633ac86bc7d1d1wfq6w\" (UID: \"d0f47857-a2c3-4fe2-9a59-2b178811fda5\") " pod="openstack-operators/7a5d6df43e82bcd9f777035cdd0143ddec736ae8e19d633ac86bc7d1d1wfq6w" Mar 20 15:40:36 crc kubenswrapper[4779]: I0320 15:40:36.893385 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d0f47857-a2c3-4fe2-9a59-2b178811fda5-util\") pod \"7a5d6df43e82bcd9f777035cdd0143ddec736ae8e19d633ac86bc7d1d1wfq6w\" (UID: \"d0f47857-a2c3-4fe2-9a59-2b178811fda5\") " pod="openstack-operators/7a5d6df43e82bcd9f777035cdd0143ddec736ae8e19d633ac86bc7d1d1wfq6w" Mar 20 15:40:36 crc kubenswrapper[4779]: I0320 15:40:36.893458 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88mkp\" (UniqueName: \"kubernetes.io/projected/d0f47857-a2c3-4fe2-9a59-2b178811fda5-kube-api-access-88mkp\") pod \"7a5d6df43e82bcd9f777035cdd0143ddec736ae8e19d633ac86bc7d1d1wfq6w\" (UID: \"d0f47857-a2c3-4fe2-9a59-2b178811fda5\") " pod="openstack-operators/7a5d6df43e82bcd9f777035cdd0143ddec736ae8e19d633ac86bc7d1d1wfq6w" Mar 20 15:40:36 crc kubenswrapper[4779]: I0320 15:40:36.893518 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d0f47857-a2c3-4fe2-9a59-2b178811fda5-bundle\") pod \"7a5d6df43e82bcd9f777035cdd0143ddec736ae8e19d633ac86bc7d1d1wfq6w\" (UID: \"d0f47857-a2c3-4fe2-9a59-2b178811fda5\") " pod="openstack-operators/7a5d6df43e82bcd9f777035cdd0143ddec736ae8e19d633ac86bc7d1d1wfq6w" Mar 20 15:40:36 crc kubenswrapper[4779]: I0320 15:40:36.893912 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d0f47857-a2c3-4fe2-9a59-2b178811fda5-util\") pod 
\"7a5d6df43e82bcd9f777035cdd0143ddec736ae8e19d633ac86bc7d1d1wfq6w\" (UID: \"d0f47857-a2c3-4fe2-9a59-2b178811fda5\") " pod="openstack-operators/7a5d6df43e82bcd9f777035cdd0143ddec736ae8e19d633ac86bc7d1d1wfq6w" Mar 20 15:40:36 crc kubenswrapper[4779]: I0320 15:40:36.893928 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d0f47857-a2c3-4fe2-9a59-2b178811fda5-bundle\") pod \"7a5d6df43e82bcd9f777035cdd0143ddec736ae8e19d633ac86bc7d1d1wfq6w\" (UID: \"d0f47857-a2c3-4fe2-9a59-2b178811fda5\") " pod="openstack-operators/7a5d6df43e82bcd9f777035cdd0143ddec736ae8e19d633ac86bc7d1d1wfq6w" Mar 20 15:40:36 crc kubenswrapper[4779]: I0320 15:40:36.913592 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88mkp\" (UniqueName: \"kubernetes.io/projected/d0f47857-a2c3-4fe2-9a59-2b178811fda5-kube-api-access-88mkp\") pod \"7a5d6df43e82bcd9f777035cdd0143ddec736ae8e19d633ac86bc7d1d1wfq6w\" (UID: \"d0f47857-a2c3-4fe2-9a59-2b178811fda5\") " pod="openstack-operators/7a5d6df43e82bcd9f777035cdd0143ddec736ae8e19d633ac86bc7d1d1wfq6w" Mar 20 15:40:36 crc kubenswrapper[4779]: I0320 15:40:36.954999 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7a5d6df43e82bcd9f777035cdd0143ddec736ae8e19d633ac86bc7d1d1wfq6w" Mar 20 15:40:37 crc kubenswrapper[4779]: I0320 15:40:37.358857 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7a5d6df43e82bcd9f777035cdd0143ddec736ae8e19d633ac86bc7d1d1wfq6w"] Mar 20 15:40:37 crc kubenswrapper[4779]: I0320 15:40:37.697293 4779 generic.go:334] "Generic (PLEG): container finished" podID="d0f47857-a2c3-4fe2-9a59-2b178811fda5" containerID="3c83fbb48785aefd92e437379e54330e0e943a7828bbefc34e2f9418dbb8b481" exitCode=0 Mar 20 15:40:37 crc kubenswrapper[4779]: I0320 15:40:37.697340 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7a5d6df43e82bcd9f777035cdd0143ddec736ae8e19d633ac86bc7d1d1wfq6w" event={"ID":"d0f47857-a2c3-4fe2-9a59-2b178811fda5","Type":"ContainerDied","Data":"3c83fbb48785aefd92e437379e54330e0e943a7828bbefc34e2f9418dbb8b481"} Mar 20 15:40:37 crc kubenswrapper[4779]: I0320 15:40:37.698252 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7a5d6df43e82bcd9f777035cdd0143ddec736ae8e19d633ac86bc7d1d1wfq6w" event={"ID":"d0f47857-a2c3-4fe2-9a59-2b178811fda5","Type":"ContainerStarted","Data":"538914a9efed7443cced96077c20ecb6e230fd4db357b7f69734b0d25eb493d8"} Mar 20 15:40:38 crc kubenswrapper[4779]: I0320 15:40:38.706753 4779 generic.go:334] "Generic (PLEG): container finished" podID="d0f47857-a2c3-4fe2-9a59-2b178811fda5" containerID="f10ca80265c1b4e2d573ad62336155dae8b18a5625e584f8a1edc14e2dd012c8" exitCode=0 Mar 20 15:40:38 crc kubenswrapper[4779]: I0320 15:40:38.706807 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7a5d6df43e82bcd9f777035cdd0143ddec736ae8e19d633ac86bc7d1d1wfq6w" event={"ID":"d0f47857-a2c3-4fe2-9a59-2b178811fda5","Type":"ContainerDied","Data":"f10ca80265c1b4e2d573ad62336155dae8b18a5625e584f8a1edc14e2dd012c8"} Mar 20 15:40:39 crc kubenswrapper[4779]: I0320 15:40:39.717581 4779 generic.go:334] 
"Generic (PLEG): container finished" podID="d0f47857-a2c3-4fe2-9a59-2b178811fda5" containerID="89b8a945e35d61bb5d53cdbcd97b13d3b18bc2f5aab31385e026d09ad52491c3" exitCode=0 Mar 20 15:40:39 crc kubenswrapper[4779]: I0320 15:40:39.717636 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7a5d6df43e82bcd9f777035cdd0143ddec736ae8e19d633ac86bc7d1d1wfq6w" event={"ID":"d0f47857-a2c3-4fe2-9a59-2b178811fda5","Type":"ContainerDied","Data":"89b8a945e35d61bb5d53cdbcd97b13d3b18bc2f5aab31385e026d09ad52491c3"} Mar 20 15:40:39 crc kubenswrapper[4779]: I0320 15:40:39.724578 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fdmsg" Mar 20 15:40:39 crc kubenswrapper[4779]: I0320 15:40:39.724722 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fdmsg" Mar 20 15:40:39 crc kubenswrapper[4779]: I0320 15:40:39.770528 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fdmsg" Mar 20 15:40:40 crc kubenswrapper[4779]: I0320 15:40:40.767257 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fdmsg" Mar 20 15:40:41 crc kubenswrapper[4779]: I0320 15:40:41.248491 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7a5d6df43e82bcd9f777035cdd0143ddec736ae8e19d633ac86bc7d1d1wfq6w" Mar 20 15:40:41 crc kubenswrapper[4779]: I0320 15:40:41.253464 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88mkp\" (UniqueName: \"kubernetes.io/projected/d0f47857-a2c3-4fe2-9a59-2b178811fda5-kube-api-access-88mkp\") pod \"d0f47857-a2c3-4fe2-9a59-2b178811fda5\" (UID: \"d0f47857-a2c3-4fe2-9a59-2b178811fda5\") " Mar 20 15:40:41 crc kubenswrapper[4779]: I0320 15:40:41.253496 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d0f47857-a2c3-4fe2-9a59-2b178811fda5-bundle\") pod \"d0f47857-a2c3-4fe2-9a59-2b178811fda5\" (UID: \"d0f47857-a2c3-4fe2-9a59-2b178811fda5\") " Mar 20 15:40:41 crc kubenswrapper[4779]: I0320 15:40:41.253521 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d0f47857-a2c3-4fe2-9a59-2b178811fda5-util\") pod \"d0f47857-a2c3-4fe2-9a59-2b178811fda5\" (UID: \"d0f47857-a2c3-4fe2-9a59-2b178811fda5\") " Mar 20 15:40:41 crc kubenswrapper[4779]: I0320 15:40:41.255463 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0f47857-a2c3-4fe2-9a59-2b178811fda5-bundle" (OuterVolumeSpecName: "bundle") pod "d0f47857-a2c3-4fe2-9a59-2b178811fda5" (UID: "d0f47857-a2c3-4fe2-9a59-2b178811fda5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:40:41 crc kubenswrapper[4779]: I0320 15:40:41.259774 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0f47857-a2c3-4fe2-9a59-2b178811fda5-kube-api-access-88mkp" (OuterVolumeSpecName: "kube-api-access-88mkp") pod "d0f47857-a2c3-4fe2-9a59-2b178811fda5" (UID: "d0f47857-a2c3-4fe2-9a59-2b178811fda5"). InnerVolumeSpecName "kube-api-access-88mkp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:40:41 crc kubenswrapper[4779]: I0320 15:40:41.278956 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0f47857-a2c3-4fe2-9a59-2b178811fda5-util" (OuterVolumeSpecName: "util") pod "d0f47857-a2c3-4fe2-9a59-2b178811fda5" (UID: "d0f47857-a2c3-4fe2-9a59-2b178811fda5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:40:41 crc kubenswrapper[4779]: I0320 15:40:41.354432 4779 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d0f47857-a2c3-4fe2-9a59-2b178811fda5-util\") on node \"crc\" DevicePath \"\"" Mar 20 15:40:41 crc kubenswrapper[4779]: I0320 15:40:41.354627 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88mkp\" (UniqueName: \"kubernetes.io/projected/d0f47857-a2c3-4fe2-9a59-2b178811fda5-kube-api-access-88mkp\") on node \"crc\" DevicePath \"\"" Mar 20 15:40:41 crc kubenswrapper[4779]: I0320 15:40:41.354687 4779 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d0f47857-a2c3-4fe2-9a59-2b178811fda5-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:40:41 crc kubenswrapper[4779]: I0320 15:40:41.394647 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fdmsg"] Mar 20 15:40:41 crc kubenswrapper[4779]: I0320 15:40:41.735593 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7a5d6df43e82bcd9f777035cdd0143ddec736ae8e19d633ac86bc7d1d1wfq6w" event={"ID":"d0f47857-a2c3-4fe2-9a59-2b178811fda5","Type":"ContainerDied","Data":"538914a9efed7443cced96077c20ecb6e230fd4db357b7f69734b0d25eb493d8"} Mar 20 15:40:41 crc kubenswrapper[4779]: I0320 15:40:41.735632 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="538914a9efed7443cced96077c20ecb6e230fd4db357b7f69734b0d25eb493d8" 
Mar 20 15:40:41 crc kubenswrapper[4779]: I0320 15:40:41.735652 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7a5d6df43e82bcd9f777035cdd0143ddec736ae8e19d633ac86bc7d1d1wfq6w" Mar 20 15:40:42 crc kubenswrapper[4779]: I0320 15:40:42.743320 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fdmsg" podUID="71825291-b7e6-4155-8d83-b91fce80e461" containerName="registry-server" containerID="cri-o://bfd6105aa0071d8672405efda1da7138ef6f5b405155d4f1dead28114e64ab74" gracePeriod=2 Mar 20 15:40:43 crc kubenswrapper[4779]: I0320 15:40:43.168828 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fdmsg" Mar 20 15:40:43 crc kubenswrapper[4779]: I0320 15:40:43.285639 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71825291-b7e6-4155-8d83-b91fce80e461-catalog-content\") pod \"71825291-b7e6-4155-8d83-b91fce80e461\" (UID: \"71825291-b7e6-4155-8d83-b91fce80e461\") " Mar 20 15:40:43 crc kubenswrapper[4779]: I0320 15:40:43.285681 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71825291-b7e6-4155-8d83-b91fce80e461-utilities\") pod \"71825291-b7e6-4155-8d83-b91fce80e461\" (UID: \"71825291-b7e6-4155-8d83-b91fce80e461\") " Mar 20 15:40:43 crc kubenswrapper[4779]: I0320 15:40:43.285753 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjp69\" (UniqueName: \"kubernetes.io/projected/71825291-b7e6-4155-8d83-b91fce80e461-kube-api-access-jjp69\") pod \"71825291-b7e6-4155-8d83-b91fce80e461\" (UID: \"71825291-b7e6-4155-8d83-b91fce80e461\") " Mar 20 15:40:43 crc kubenswrapper[4779]: I0320 15:40:43.286446 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/empty-dir/71825291-b7e6-4155-8d83-b91fce80e461-utilities" (OuterVolumeSpecName: "utilities") pod "71825291-b7e6-4155-8d83-b91fce80e461" (UID: "71825291-b7e6-4155-8d83-b91fce80e461"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:40:43 crc kubenswrapper[4779]: I0320 15:40:43.289448 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71825291-b7e6-4155-8d83-b91fce80e461-kube-api-access-jjp69" (OuterVolumeSpecName: "kube-api-access-jjp69") pod "71825291-b7e6-4155-8d83-b91fce80e461" (UID: "71825291-b7e6-4155-8d83-b91fce80e461"). InnerVolumeSpecName "kube-api-access-jjp69". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:40:43 crc kubenswrapper[4779]: I0320 15:40:43.309594 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71825291-b7e6-4155-8d83-b91fce80e461-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71825291-b7e6-4155-8d83-b91fce80e461" (UID: "71825291-b7e6-4155-8d83-b91fce80e461"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:40:43 crc kubenswrapper[4779]: I0320 15:40:43.387562 4779 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71825291-b7e6-4155-8d83-b91fce80e461-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 15:40:43 crc kubenswrapper[4779]: I0320 15:40:43.387626 4779 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71825291-b7e6-4155-8d83-b91fce80e461-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:40:43 crc kubenswrapper[4779]: I0320 15:40:43.387650 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjp69\" (UniqueName: \"kubernetes.io/projected/71825291-b7e6-4155-8d83-b91fce80e461-kube-api-access-jjp69\") on node \"crc\" DevicePath \"\"" Mar 20 15:40:43 crc kubenswrapper[4779]: I0320 15:40:43.752831 4779 generic.go:334] "Generic (PLEG): container finished" podID="71825291-b7e6-4155-8d83-b91fce80e461" containerID="bfd6105aa0071d8672405efda1da7138ef6f5b405155d4f1dead28114e64ab74" exitCode=0 Mar 20 15:40:43 crc kubenswrapper[4779]: I0320 15:40:43.752885 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fdmsg" Mar 20 15:40:43 crc kubenswrapper[4779]: I0320 15:40:43.752892 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fdmsg" event={"ID":"71825291-b7e6-4155-8d83-b91fce80e461","Type":"ContainerDied","Data":"bfd6105aa0071d8672405efda1da7138ef6f5b405155d4f1dead28114e64ab74"} Mar 20 15:40:43 crc kubenswrapper[4779]: I0320 15:40:43.752927 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fdmsg" event={"ID":"71825291-b7e6-4155-8d83-b91fce80e461","Type":"ContainerDied","Data":"a7f63a854f56deaded60c68e57e19ad29c8dd985a3d2cfaf11f459b1e4d7c40e"} Mar 20 15:40:43 crc kubenswrapper[4779]: I0320 15:40:43.752953 4779 scope.go:117] "RemoveContainer" containerID="bfd6105aa0071d8672405efda1da7138ef6f5b405155d4f1dead28114e64ab74" Mar 20 15:40:43 crc kubenswrapper[4779]: I0320 15:40:43.770323 4779 scope.go:117] "RemoveContainer" containerID="e832b048416e75e3bb345db4757c0273fd73146fe4d12d4f506f86d274150429" Mar 20 15:40:43 crc kubenswrapper[4779]: I0320 15:40:43.788244 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fdmsg"] Mar 20 15:40:43 crc kubenswrapper[4779]: I0320 15:40:43.788825 4779 scope.go:117] "RemoveContainer" containerID="fe08c0f209bcfe613314188c269aaa1d06149ee6b503a8c5f086a319e17ccfb7" Mar 20 15:40:43 crc kubenswrapper[4779]: I0320 15:40:43.796086 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fdmsg"] Mar 20 15:40:43 crc kubenswrapper[4779]: I0320 15:40:43.809278 4779 scope.go:117] "RemoveContainer" containerID="bfd6105aa0071d8672405efda1da7138ef6f5b405155d4f1dead28114e64ab74" Mar 20 15:40:43 crc kubenswrapper[4779]: E0320 15:40:43.809681 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"bfd6105aa0071d8672405efda1da7138ef6f5b405155d4f1dead28114e64ab74\": container with ID starting with bfd6105aa0071d8672405efda1da7138ef6f5b405155d4f1dead28114e64ab74 not found: ID does not exist" containerID="bfd6105aa0071d8672405efda1da7138ef6f5b405155d4f1dead28114e64ab74" Mar 20 15:40:43 crc kubenswrapper[4779]: I0320 15:40:43.809725 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfd6105aa0071d8672405efda1da7138ef6f5b405155d4f1dead28114e64ab74"} err="failed to get container status \"bfd6105aa0071d8672405efda1da7138ef6f5b405155d4f1dead28114e64ab74\": rpc error: code = NotFound desc = could not find container \"bfd6105aa0071d8672405efda1da7138ef6f5b405155d4f1dead28114e64ab74\": container with ID starting with bfd6105aa0071d8672405efda1da7138ef6f5b405155d4f1dead28114e64ab74 not found: ID does not exist" Mar 20 15:40:43 crc kubenswrapper[4779]: I0320 15:40:43.809755 4779 scope.go:117] "RemoveContainer" containerID="e832b048416e75e3bb345db4757c0273fd73146fe4d12d4f506f86d274150429" Mar 20 15:40:43 crc kubenswrapper[4779]: E0320 15:40:43.810254 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e832b048416e75e3bb345db4757c0273fd73146fe4d12d4f506f86d274150429\": container with ID starting with e832b048416e75e3bb345db4757c0273fd73146fe4d12d4f506f86d274150429 not found: ID does not exist" containerID="e832b048416e75e3bb345db4757c0273fd73146fe4d12d4f506f86d274150429" Mar 20 15:40:43 crc kubenswrapper[4779]: I0320 15:40:43.810287 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e832b048416e75e3bb345db4757c0273fd73146fe4d12d4f506f86d274150429"} err="failed to get container status \"e832b048416e75e3bb345db4757c0273fd73146fe4d12d4f506f86d274150429\": rpc error: code = NotFound desc = could not find container \"e832b048416e75e3bb345db4757c0273fd73146fe4d12d4f506f86d274150429\": container with ID 
starting with e832b048416e75e3bb345db4757c0273fd73146fe4d12d4f506f86d274150429 not found: ID does not exist" Mar 20 15:40:43 crc kubenswrapper[4779]: I0320 15:40:43.810308 4779 scope.go:117] "RemoveContainer" containerID="fe08c0f209bcfe613314188c269aaa1d06149ee6b503a8c5f086a319e17ccfb7" Mar 20 15:40:43 crc kubenswrapper[4779]: E0320 15:40:43.810532 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe08c0f209bcfe613314188c269aaa1d06149ee6b503a8c5f086a319e17ccfb7\": container with ID starting with fe08c0f209bcfe613314188c269aaa1d06149ee6b503a8c5f086a319e17ccfb7 not found: ID does not exist" containerID="fe08c0f209bcfe613314188c269aaa1d06149ee6b503a8c5f086a319e17ccfb7" Mar 20 15:40:43 crc kubenswrapper[4779]: I0320 15:40:43.810553 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe08c0f209bcfe613314188c269aaa1d06149ee6b503a8c5f086a319e17ccfb7"} err="failed to get container status \"fe08c0f209bcfe613314188c269aaa1d06149ee6b503a8c5f086a319e17ccfb7\": rpc error: code = NotFound desc = could not find container \"fe08c0f209bcfe613314188c269aaa1d06149ee6b503a8c5f086a319e17ccfb7\": container with ID starting with fe08c0f209bcfe613314188c269aaa1d06149ee6b503a8c5f086a319e17ccfb7 not found: ID does not exist" Mar 20 15:40:43 crc kubenswrapper[4779]: I0320 15:40:43.816494 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71825291-b7e6-4155-8d83-b91fce80e461" path="/var/lib/kubelet/pods/71825291-b7e6-4155-8d83-b91fce80e461/volumes" Mar 20 15:40:48 crc kubenswrapper[4779]: I0320 15:40:48.833631 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-65c968fb4-5gx4d"] Mar 20 15:40:48 crc kubenswrapper[4779]: E0320 15:40:48.834347 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0f47857-a2c3-4fe2-9a59-2b178811fda5" containerName="pull" Mar 20 15:40:48 
crc kubenswrapper[4779]: I0320 15:40:48.834359 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0f47857-a2c3-4fe2-9a59-2b178811fda5" containerName="pull" Mar 20 15:40:48 crc kubenswrapper[4779]: E0320 15:40:48.834371 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0f47857-a2c3-4fe2-9a59-2b178811fda5" containerName="extract" Mar 20 15:40:48 crc kubenswrapper[4779]: I0320 15:40:48.834379 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0f47857-a2c3-4fe2-9a59-2b178811fda5" containerName="extract" Mar 20 15:40:48 crc kubenswrapper[4779]: E0320 15:40:48.834394 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71825291-b7e6-4155-8d83-b91fce80e461" containerName="extract-content" Mar 20 15:40:48 crc kubenswrapper[4779]: I0320 15:40:48.834400 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="71825291-b7e6-4155-8d83-b91fce80e461" containerName="extract-content" Mar 20 15:40:48 crc kubenswrapper[4779]: E0320 15:40:48.834407 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71825291-b7e6-4155-8d83-b91fce80e461" containerName="extract-utilities" Mar 20 15:40:48 crc kubenswrapper[4779]: I0320 15:40:48.834413 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="71825291-b7e6-4155-8d83-b91fce80e461" containerName="extract-utilities" Mar 20 15:40:48 crc kubenswrapper[4779]: E0320 15:40:48.834425 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0f47857-a2c3-4fe2-9a59-2b178811fda5" containerName="util" Mar 20 15:40:48 crc kubenswrapper[4779]: I0320 15:40:48.834431 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0f47857-a2c3-4fe2-9a59-2b178811fda5" containerName="util" Mar 20 15:40:48 crc kubenswrapper[4779]: E0320 15:40:48.834437 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71825291-b7e6-4155-8d83-b91fce80e461" containerName="registry-server" Mar 20 15:40:48 crc kubenswrapper[4779]: I0320 15:40:48.834442 4779 
state_mem.go:107] "Deleted CPUSet assignment" podUID="71825291-b7e6-4155-8d83-b91fce80e461" containerName="registry-server" Mar 20 15:40:48 crc kubenswrapper[4779]: I0320 15:40:48.834537 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0f47857-a2c3-4fe2-9a59-2b178811fda5" containerName="extract" Mar 20 15:40:48 crc kubenswrapper[4779]: I0320 15:40:48.834555 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="71825291-b7e6-4155-8d83-b91fce80e461" containerName="registry-server" Mar 20 15:40:48 crc kubenswrapper[4779]: I0320 15:40:48.834950 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-65c968fb4-5gx4d" Mar 20 15:40:48 crc kubenswrapper[4779]: I0320 15:40:48.836705 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-vhm2d" Mar 20 15:40:48 crc kubenswrapper[4779]: I0320 15:40:48.864896 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-65c968fb4-5gx4d"] Mar 20 15:40:48 crc kubenswrapper[4779]: I0320 15:40:48.960243 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfpz7\" (UniqueName: \"kubernetes.io/projected/8b904c14-dc4f-41a0-afab-990ff14e74f7-kube-api-access-cfpz7\") pod \"openstack-operator-controller-init-65c968fb4-5gx4d\" (UID: \"8b904c14-dc4f-41a0-afab-990ff14e74f7\") " pod="openstack-operators/openstack-operator-controller-init-65c968fb4-5gx4d" Mar 20 15:40:49 crc kubenswrapper[4779]: I0320 15:40:49.062340 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfpz7\" (UniqueName: \"kubernetes.io/projected/8b904c14-dc4f-41a0-afab-990ff14e74f7-kube-api-access-cfpz7\") pod \"openstack-operator-controller-init-65c968fb4-5gx4d\" (UID: \"8b904c14-dc4f-41a0-afab-990ff14e74f7\") " 
pod="openstack-operators/openstack-operator-controller-init-65c968fb4-5gx4d" Mar 20 15:40:49 crc kubenswrapper[4779]: I0320 15:40:49.083650 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfpz7\" (UniqueName: \"kubernetes.io/projected/8b904c14-dc4f-41a0-afab-990ff14e74f7-kube-api-access-cfpz7\") pod \"openstack-operator-controller-init-65c968fb4-5gx4d\" (UID: \"8b904c14-dc4f-41a0-afab-990ff14e74f7\") " pod="openstack-operators/openstack-operator-controller-init-65c968fb4-5gx4d" Mar 20 15:40:49 crc kubenswrapper[4779]: I0320 15:40:49.152390 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-65c968fb4-5gx4d" Mar 20 15:40:49 crc kubenswrapper[4779]: I0320 15:40:49.397794 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-65c968fb4-5gx4d"] Mar 20 15:40:49 crc kubenswrapper[4779]: I0320 15:40:49.792613 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-65c968fb4-5gx4d" event={"ID":"8b904c14-dc4f-41a0-afab-990ff14e74f7","Type":"ContainerStarted","Data":"fec2fc18e93909798eeb2b413c551e0ae166562772de5bd9f1c0f296093f2c63"} Mar 20 15:40:53 crc kubenswrapper[4779]: I0320 15:40:53.825989 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-65c968fb4-5gx4d" event={"ID":"8b904c14-dc4f-41a0-afab-990ff14e74f7","Type":"ContainerStarted","Data":"b54ab707b73798baf50267daa2ea3c52efdd145a6d4cc1e99aae569edb6ef905"} Mar 20 15:40:53 crc kubenswrapper[4779]: I0320 15:40:53.826571 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-65c968fb4-5gx4d" Mar 20 15:40:53 crc kubenswrapper[4779]: I0320 15:40:53.886326 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-controller-init-65c968fb4-5gx4d" podStartSLOduration=2.627961914 podStartE2EDuration="5.886303797s" podCreationTimestamp="2026-03-20 15:40:48 +0000 UTC" firstStartedPulling="2026-03-20 15:40:49.415493949 +0000 UTC m=+1066.378009749" lastFinishedPulling="2026-03-20 15:40:52.673835832 +0000 UTC m=+1069.636351632" observedRunningTime="2026-03-20 15:40:53.877715613 +0000 UTC m=+1070.840231423" watchObservedRunningTime="2026-03-20 15:40:53.886303797 +0000 UTC m=+1070.848819607" Mar 20 15:40:55 crc kubenswrapper[4779]: I0320 15:40:55.149576 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:40:55 crc kubenswrapper[4779]: I0320 15:40:55.149846 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:40:55 crc kubenswrapper[4779]: I0320 15:40:55.149882 4779 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" Mar 20 15:40:55 crc kubenswrapper[4779]: I0320 15:40:55.150575 4779 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d329625e139412c780d56ee2166e7046f52ce6cce2fdaf917fadbaf979acbe6a"} pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 15:40:55 crc kubenswrapper[4779]: I0320 15:40:55.150630 4779 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" containerID="cri-o://d329625e139412c780d56ee2166e7046f52ce6cce2fdaf917fadbaf979acbe6a" gracePeriod=600 Mar 20 15:40:55 crc kubenswrapper[4779]: I0320 15:40:55.844049 4779 generic.go:334] "Generic (PLEG): container finished" podID="451fc579-db57-4b36-a775-6d2986de3efc" containerID="d329625e139412c780d56ee2166e7046f52ce6cce2fdaf917fadbaf979acbe6a" exitCode=0 Mar 20 15:40:55 crc kubenswrapper[4779]: I0320 15:40:55.844104 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" event={"ID":"451fc579-db57-4b36-a775-6d2986de3efc","Type":"ContainerDied","Data":"d329625e139412c780d56ee2166e7046f52ce6cce2fdaf917fadbaf979acbe6a"} Mar 20 15:40:55 crc kubenswrapper[4779]: I0320 15:40:55.844509 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" event={"ID":"451fc579-db57-4b36-a775-6d2986de3efc","Type":"ContainerStarted","Data":"fee94c09103d4b2b4bf88d7de3451b803a74dae16bc0136f3fa09b23c09cfe64"} Mar 20 15:40:55 crc kubenswrapper[4779]: I0320 15:40:55.844543 4779 scope.go:117] "RemoveContainer" containerID="6987ea5d5bc8dacf20b1f5a9f2e4c3b448070d7ac94c29c0b4e57a5ce5ecc6b5" Mar 20 15:40:59 crc kubenswrapper[4779]: I0320 15:40:59.154815 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-65c968fb4-5gx4d" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.065078 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-t6fr6"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.066383 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-t6fr6" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.068461 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-r2gfg" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.073790 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-t5d5w"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.074688 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-t5d5w" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.077884 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-nt8np" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.083763 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-5tdlb"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.084657 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-5tdlb" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.086423 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-4pxg5" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.094789 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-t6fr6"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.119601 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-t5d5w"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.139311 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-pt5hw"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.140415 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-pt5hw" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.144573 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-5f28w" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.157813 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-5tdlb"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.166794 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-qgxj7"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.167729 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-qgxj7" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.170974 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-xkfnc" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.183790 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-pt5hw"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.186853 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r96lt\" (UniqueName: \"kubernetes.io/projected/c3e44c59-8e80-4e8c-956f-10b8091f819f-kube-api-access-r96lt\") pod \"barbican-operator-controller-manager-59bc569d95-t6fr6\" (UID: \"c3e44c59-8e80-4e8c-956f-10b8091f819f\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-t6fr6" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.186947 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46znj\" (UniqueName: \"kubernetes.io/projected/3fd922a6-0fbd-4458-b83c-b599e2988c7a-kube-api-access-46znj\") pod \"cinder-operator-controller-manager-8d58dc466-t5d5w\" (UID: \"3fd922a6-0fbd-4458-b83c-b599e2988c7a\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-t5d5w" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.187011 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5zkh\" (UniqueName: \"kubernetes.io/projected/069366bd-aca3-43b9-9335-7d98fb20d4b7-kube-api-access-g5zkh\") pod \"designate-operator-controller-manager-588d4d986b-5tdlb\" (UID: \"069366bd-aca3-43b9-9335-7d98fb20d4b7\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-5tdlb" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 
15:41:18.187839 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-qgxj7"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.192320 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-c8w7p"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.193473 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-c8w7p" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.198867 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-hptmv" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.199356 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-n5x2d"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.200386 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-n5x2d" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.204861 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.205143 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-gkkzk" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.208186 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-c8w7p"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.208257 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-n5x2d"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.213456 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-qck9t"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.218661 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-qck9t" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.225430 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-l2mt8" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.226232 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-gqx5g"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.227043 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-gqx5g" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.228702 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-7t8l7" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.247162 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-gqx5g"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.256305 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-qck9t"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.270221 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-ggsnk"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.271278 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-ggsnk" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.283162 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-wj7qw" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.284246 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-zgd77"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.285186 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-zgd77" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.287551 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-d5ttf" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.288312 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5zkh\" (UniqueName: \"kubernetes.io/projected/069366bd-aca3-43b9-9335-7d98fb20d4b7-kube-api-access-g5zkh\") pod \"designate-operator-controller-manager-588d4d986b-5tdlb\" (UID: \"069366bd-aca3-43b9-9335-7d98fb20d4b7\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-5tdlb" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.288366 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vx9x\" (UniqueName: \"kubernetes.io/projected/ddee98c8-526f-459c-a9eb-ea96bc062ff5-kube-api-access-4vx9x\") pod \"glance-operator-controller-manager-79df6bcc97-pt5hw\" (UID: \"ddee98c8-526f-459c-a9eb-ea96bc062ff5\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-pt5hw" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.288423 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r96lt\" (UniqueName: \"kubernetes.io/projected/c3e44c59-8e80-4e8c-956f-10b8091f819f-kube-api-access-r96lt\") pod \"barbican-operator-controller-manager-59bc569d95-t6fr6\" (UID: \"c3e44c59-8e80-4e8c-956f-10b8091f819f\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-t6fr6" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.288457 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pv29\" (UniqueName: 
\"kubernetes.io/projected/679a5132-11e9-477f-954d-1eb244f67d9c-kube-api-access-6pv29\") pod \"heat-operator-controller-manager-67dd5f86f5-qgxj7\" (UID: \"679a5132-11e9-477f-954d-1eb244f67d9c\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-qgxj7" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.288474 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46znj\" (UniqueName: \"kubernetes.io/projected/3fd922a6-0fbd-4458-b83c-b599e2988c7a-kube-api-access-46znj\") pod \"cinder-operator-controller-manager-8d58dc466-t5d5w\" (UID: \"3fd922a6-0fbd-4458-b83c-b599e2988c7a\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-t5d5w" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.291158 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-ggsnk"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.300649 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-zgd77"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.319724 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46znj\" (UniqueName: \"kubernetes.io/projected/3fd922a6-0fbd-4458-b83c-b599e2988c7a-kube-api-access-46znj\") pod \"cinder-operator-controller-manager-8d58dc466-t5d5w\" (UID: \"3fd922a6-0fbd-4458-b83c-b599e2988c7a\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-t5d5w" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.325813 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r96lt\" (UniqueName: \"kubernetes.io/projected/c3e44c59-8e80-4e8c-956f-10b8091f819f-kube-api-access-r96lt\") pod \"barbican-operator-controller-manager-59bc569d95-t6fr6\" (UID: \"c3e44c59-8e80-4e8c-956f-10b8091f819f\") " 
pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-t6fr6" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.335329 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-8pcbl"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.336470 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-8pcbl" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.337334 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5zkh\" (UniqueName: \"kubernetes.io/projected/069366bd-aca3-43b9-9335-7d98fb20d4b7-kube-api-access-g5zkh\") pod \"designate-operator-controller-manager-588d4d986b-5tdlb\" (UID: \"069366bd-aca3-43b9-9335-7d98fb20d4b7\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-5tdlb" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.346059 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-cqzxh"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.346937 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-cqzxh" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.355345 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-8pcbl"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.355683 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-j8b9q" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.357058 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-wdhhf" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.362568 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-cqzxh"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.374625 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-h5k2b"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.386543 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-h5k2b" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.391102 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-hsf7s" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.398459 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/edf58803-5b67-47e2-a6f1-4998820acc34-cert\") pod \"infra-operator-controller-manager-7b9c774f96-n5x2d\" (UID: \"edf58803-5b67-47e2-a6f1-4998820acc34\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-n5x2d" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.398525 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkj97\" (UniqueName: \"kubernetes.io/projected/ccef47f8-908b-4765-abf1-d218024c98bf-kube-api-access-bkj97\") pod \"keystone-operator-controller-manager-768b96df4c-gqx5g\" (UID: \"ccef47f8-908b-4765-abf1-d218024c98bf\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-gqx5g" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.398554 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4q8d\" (UniqueName: \"kubernetes.io/projected/3b1813e8-5330-4de2-ad05-1f56fcc1cfac-kube-api-access-w4q8d\") pod \"horizon-operator-controller-manager-8464cc45fb-c8w7p\" (UID: \"3b1813e8-5330-4de2-ad05-1f56fcc1cfac\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-c8w7p" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.398594 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vx9x\" (UniqueName: \"kubernetes.io/projected/ddee98c8-526f-459c-a9eb-ea96bc062ff5-kube-api-access-4vx9x\") pod 
\"glance-operator-controller-manager-79df6bcc97-pt5hw\" (UID: \"ddee98c8-526f-459c-a9eb-ea96bc062ff5\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-pt5hw" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.398625 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhnmq\" (UniqueName: \"kubernetes.io/projected/d61cd2c5-5321-418f-b506-a14210a24e95-kube-api-access-lhnmq\") pod \"ironic-operator-controller-manager-6f787dddc9-qck9t\" (UID: \"d61cd2c5-5321-418f-b506-a14210a24e95\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-qck9t" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.398667 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9hbw\" (UniqueName: \"kubernetes.io/projected/84524c2d-5bf3-42ff-ac4e-7d5aea8c9772-kube-api-access-b9hbw\") pod \"mariadb-operator-controller-manager-67ccfc9778-zgd77\" (UID: \"84524c2d-5bf3-42ff-ac4e-7d5aea8c9772\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-zgd77" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.398724 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p9zz\" (UniqueName: \"kubernetes.io/projected/3ca155df-e0dd-42b3-92bf-98a7e8037f02-kube-api-access-8p9zz\") pod \"manila-operator-controller-manager-55f864c847-ggsnk\" (UID: \"3ca155df-e0dd-42b3-92bf-98a7e8037f02\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-ggsnk" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.398774 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tb8p\" (UniqueName: \"kubernetes.io/projected/edf58803-5b67-47e2-a6f1-4998820acc34-kube-api-access-6tb8p\") pod \"infra-operator-controller-manager-7b9c774f96-n5x2d\" (UID: 
\"edf58803-5b67-47e2-a6f1-4998820acc34\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-n5x2d" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.398907 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pv29\" (UniqueName: \"kubernetes.io/projected/679a5132-11e9-477f-954d-1eb244f67d9c-kube-api-access-6pv29\") pod \"heat-operator-controller-manager-67dd5f86f5-qgxj7\" (UID: \"679a5132-11e9-477f-954d-1eb244f67d9c\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-qgxj7" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.402634 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-h5k2b"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.411257 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-t6fr6" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.411478 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-t5d5w" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.428627 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-5tdlb" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.433153 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vx9x\" (UniqueName: \"kubernetes.io/projected/ddee98c8-526f-459c-a9eb-ea96bc062ff5-kube-api-access-4vx9x\") pod \"glance-operator-controller-manager-79df6bcc97-pt5hw\" (UID: \"ddee98c8-526f-459c-a9eb-ea96bc062ff5\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-pt5hw" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.436926 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-qbcd4"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.437714 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-qbcd4" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.439349 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-ns4g9" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.442999 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-qlcwt"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.445026 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-qlcwt" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.446147 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-q4662" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.448294 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.451935 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-xg8mw"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.454715 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pv29\" (UniqueName: \"kubernetes.io/projected/679a5132-11e9-477f-954d-1eb244f67d9c-kube-api-access-6pv29\") pod \"heat-operator-controller-manager-67dd5f86f5-qgxj7\" (UID: \"679a5132-11e9-477f-954d-1eb244f67d9c\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-qgxj7" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.455996 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-xg8mw" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.458417 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-cvf9x" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.461074 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-pt5hw" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.485661 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-qbcd4"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.494247 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-qgxj7" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.494721 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-qlcwt"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.502875 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-xg8mw"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.503646 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tb8p\" (UniqueName: \"kubernetes.io/projected/edf58803-5b67-47e2-a6f1-4998820acc34-kube-api-access-6tb8p\") pod \"infra-operator-controller-manager-7b9c774f96-n5x2d\" (UID: \"edf58803-5b67-47e2-a6f1-4998820acc34\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-n5x2d" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.503685 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8vbt\" (UniqueName: \"kubernetes.io/projected/87fb9e77-cea0-478a-8cd9-c8c15d42065e-kube-api-access-r8vbt\") pod \"octavia-operator-controller-manager-5b9f45d989-h5k2b\" (UID: \"87fb9e77-cea0-478a-8cd9-c8c15d42065e\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-h5k2b" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.503724 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"cert\" (UniqueName: \"kubernetes.io/secret/edf58803-5b67-47e2-a6f1-4998820acc34-cert\") pod \"infra-operator-controller-manager-7b9c774f96-n5x2d\" (UID: \"edf58803-5b67-47e2-a6f1-4998820acc34\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-n5x2d" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.503749 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkj97\" (UniqueName: \"kubernetes.io/projected/ccef47f8-908b-4765-abf1-d218024c98bf-kube-api-access-bkj97\") pod \"keystone-operator-controller-manager-768b96df4c-gqx5g\" (UID: \"ccef47f8-908b-4765-abf1-d218024c98bf\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-gqx5g" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.503769 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4q8d\" (UniqueName: \"kubernetes.io/projected/3b1813e8-5330-4de2-ad05-1f56fcc1cfac-kube-api-access-w4q8d\") pod \"horizon-operator-controller-manager-8464cc45fb-c8w7p\" (UID: \"3b1813e8-5330-4de2-ad05-1f56fcc1cfac\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-c8w7p" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.503797 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhnmq\" (UniqueName: \"kubernetes.io/projected/d61cd2c5-5321-418f-b506-a14210a24e95-kube-api-access-lhnmq\") pod \"ironic-operator-controller-manager-6f787dddc9-qck9t\" (UID: \"d61cd2c5-5321-418f-b506-a14210a24e95\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-qck9t" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.503833 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9hbw\" (UniqueName: \"kubernetes.io/projected/84524c2d-5bf3-42ff-ac4e-7d5aea8c9772-kube-api-access-b9hbw\") pod \"mariadb-operator-controller-manager-67ccfc9778-zgd77\" 
(UID: \"84524c2d-5bf3-42ff-ac4e-7d5aea8c9772\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-zgd77" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.503864 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8p9zz\" (UniqueName: \"kubernetes.io/projected/3ca155df-e0dd-42b3-92bf-98a7e8037f02-kube-api-access-8p9zz\") pod \"manila-operator-controller-manager-55f864c847-ggsnk\" (UID: \"3ca155df-e0dd-42b3-92bf-98a7e8037f02\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-ggsnk" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.503928 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99j58\" (UniqueName: \"kubernetes.io/projected/b2a2462d-b0d7-43e7-a7ef-93a8a4e84113-kube-api-access-99j58\") pod \"neutron-operator-controller-manager-767865f676-8pcbl\" (UID: \"b2a2462d-b0d7-43e7-a7ef-93a8a4e84113\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-8pcbl" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.503964 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79h9k\" (UniqueName: \"kubernetes.io/projected/4ee9a64d-677e-4d45-8032-f33a4e91ee2c-kube-api-access-79h9k\") pod \"nova-operator-controller-manager-5d488d59fb-cqzxh\" (UID: \"4ee9a64d-677e-4d45-8032-f33a4e91ee2c\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-cqzxh" Mar 20 15:41:18 crc kubenswrapper[4779]: E0320 15:41:18.504267 4779 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 15:41:18 crc kubenswrapper[4779]: E0320 15:41:18.504301 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edf58803-5b67-47e2-a6f1-4998820acc34-cert podName:edf58803-5b67-47e2-a6f1-4998820acc34 
nodeName:}" failed. No retries permitted until 2026-03-20 15:41:19.004288593 +0000 UTC m=+1095.966804393 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/edf58803-5b67-47e2-a6f1-4998820acc34-cert") pod "infra-operator-controller-manager-7b9c774f96-n5x2d" (UID: "edf58803-5b67-47e2-a6f1-4998820acc34") : secret "infra-operator-webhook-server-cert" not found Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.524906 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p9zz\" (UniqueName: \"kubernetes.io/projected/3ca155df-e0dd-42b3-92bf-98a7e8037f02-kube-api-access-8p9zz\") pod \"manila-operator-controller-manager-55f864c847-ggsnk\" (UID: \"3ca155df-e0dd-42b3-92bf-98a7e8037f02\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-ggsnk" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.526540 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9hbw\" (UniqueName: \"kubernetes.io/projected/84524c2d-5bf3-42ff-ac4e-7d5aea8c9772-kube-api-access-b9hbw\") pod \"mariadb-operator-controller-manager-67ccfc9778-zgd77\" (UID: \"84524c2d-5bf3-42ff-ac4e-7d5aea8c9772\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-zgd77" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.526651 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4q8d\" (UniqueName: \"kubernetes.io/projected/3b1813e8-5330-4de2-ad05-1f56fcc1cfac-kube-api-access-w4q8d\") pod \"horizon-operator-controller-manager-8464cc45fb-c8w7p\" (UID: \"3b1813e8-5330-4de2-ad05-1f56fcc1cfac\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-c8w7p" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.527064 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tb8p\" (UniqueName: 
\"kubernetes.io/projected/edf58803-5b67-47e2-a6f1-4998820acc34-kube-api-access-6tb8p\") pod \"infra-operator-controller-manager-7b9c774f96-n5x2d\" (UID: \"edf58803-5b67-47e2-a6f1-4998820acc34\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-n5x2d" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.527328 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-c8w7p" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.531455 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhnmq\" (UniqueName: \"kubernetes.io/projected/d61cd2c5-5321-418f-b506-a14210a24e95-kube-api-access-lhnmq\") pod \"ironic-operator-controller-manager-6f787dddc9-qck9t\" (UID: \"d61cd2c5-5321-418f-b506-a14210a24e95\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-qck9t" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.536348 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkj97\" (UniqueName: \"kubernetes.io/projected/ccef47f8-908b-4765-abf1-d218024c98bf-kube-api-access-bkj97\") pod \"keystone-operator-controller-manager-768b96df4c-gqx5g\" (UID: \"ccef47f8-908b-4765-abf1-d218024c98bf\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-gqx5g" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.548747 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-pm9mf"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.549572 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-pm9mf" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.552684 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-q4v22" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.563039 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-pm9mf"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.564509 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-qck9t" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.589131 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-gqx5g" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.600724 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-xtbhm"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.602073 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-xtbhm" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.605341 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22109be9-adb4-4573-9075-501c52043d47-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-qlcwt\" (UID: \"22109be9-adb4-4573-9075-501c52043d47\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-qlcwt" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.605409 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99j58\" (UniqueName: \"kubernetes.io/projected/b2a2462d-b0d7-43e7-a7ef-93a8a4e84113-kube-api-access-99j58\") pod \"neutron-operator-controller-manager-767865f676-8pcbl\" (UID: \"b2a2462d-b0d7-43e7-a7ef-93a8a4e84113\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-8pcbl" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.605433 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g4ql\" (UniqueName: \"kubernetes.io/projected/22109be9-adb4-4573-9075-501c52043d47-kube-api-access-5g4ql\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-qlcwt\" (UID: \"22109be9-adb4-4573-9075-501c52043d47\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-qlcwt" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.605463 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79h9k\" (UniqueName: \"kubernetes.io/projected/4ee9a64d-677e-4d45-8032-f33a4e91ee2c-kube-api-access-79h9k\") pod \"nova-operator-controller-manager-5d488d59fb-cqzxh\" (UID: \"4ee9a64d-677e-4d45-8032-f33a4e91ee2c\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-cqzxh" Mar 20 
15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.605611 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8vbt\" (UniqueName: \"kubernetes.io/projected/87fb9e77-cea0-478a-8cd9-c8c15d42065e-kube-api-access-r8vbt\") pod \"octavia-operator-controller-manager-5b9f45d989-h5k2b\" (UID: \"87fb9e77-cea0-478a-8cd9-c8c15d42065e\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-h5k2b" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.605696 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcpxp\" (UniqueName: \"kubernetes.io/projected/67a92119-db58-49a6-a7da-c63c135d6956-kube-api-access-lcpxp\") pod \"placement-operator-controller-manager-5784578c99-xg8mw\" (UID: \"67a92119-db58-49a6-a7da-c63c135d6956\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-xg8mw" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.605747 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gc65\" (UniqueName: \"kubernetes.io/projected/720d6c6c-aa5e-4fb3-b7a5-148224dff316-kube-api-access-7gc65\") pod \"ovn-operator-controller-manager-884679f54-qbcd4\" (UID: \"720d6c6c-aa5e-4fb3-b7a5-148224dff316\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-qbcd4" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.610714 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-ggsnk" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.613795 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-85rkx" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.619390 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-xtbhm"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.626931 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99j58\" (UniqueName: \"kubernetes.io/projected/b2a2462d-b0d7-43e7-a7ef-93a8a4e84113-kube-api-access-99j58\") pod \"neutron-operator-controller-manager-767865f676-8pcbl\" (UID: \"b2a2462d-b0d7-43e7-a7ef-93a8a4e84113\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-8pcbl" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.648909 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79h9k\" (UniqueName: \"kubernetes.io/projected/4ee9a64d-677e-4d45-8032-f33a4e91ee2c-kube-api-access-79h9k\") pod \"nova-operator-controller-manager-5d488d59fb-cqzxh\" (UID: \"4ee9a64d-677e-4d45-8032-f33a4e91ee2c\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-cqzxh" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.649393 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8vbt\" (UniqueName: \"kubernetes.io/projected/87fb9e77-cea0-478a-8cd9-c8c15d42065e-kube-api-access-r8vbt\") pod \"octavia-operator-controller-manager-5b9f45d989-h5k2b\" (UID: \"87fb9e77-cea0-478a-8cd9-c8c15d42065e\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-h5k2b" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.668544 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-zgd77" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.688367 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7vzrl"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.689455 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7vzrl" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.690310 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-8pcbl" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.694222 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-hqm2p" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.707259 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2596\" (UniqueName: \"kubernetes.io/projected/54f4f092-31d9-47b3-a791-b84c228f4024-kube-api-access-j2596\") pod \"telemetry-operator-controller-manager-d6b694c5-xtbhm\" (UID: \"54f4f092-31d9-47b3-a791-b84c228f4024\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-xtbhm" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.708497 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcpxp\" (UniqueName: \"kubernetes.io/projected/67a92119-db58-49a6-a7da-c63c135d6956-kube-api-access-lcpxp\") pod \"placement-operator-controller-manager-5784578c99-xg8mw\" (UID: \"67a92119-db58-49a6-a7da-c63c135d6956\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-xg8mw" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.708691 4779 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7gc65\" (UniqueName: \"kubernetes.io/projected/720d6c6c-aa5e-4fb3-b7a5-148224dff316-kube-api-access-7gc65\") pod \"ovn-operator-controller-manager-884679f54-qbcd4\" (UID: \"720d6c6c-aa5e-4fb3-b7a5-148224dff316\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-qbcd4" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.709008 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22109be9-adb4-4573-9075-501c52043d47-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-qlcwt\" (UID: \"22109be9-adb4-4573-9075-501c52043d47\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-qlcwt" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.709263 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx6f5\" (UniqueName: \"kubernetes.io/projected/6d5f5fdf-8757-47c9-9c64-376367ba2bfb-kube-api-access-lx6f5\") pod \"swift-operator-controller-manager-c674c5965-pm9mf\" (UID: \"6d5f5fdf-8757-47c9-9c64-376367ba2bfb\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-pm9mf" Mar 20 15:41:18 crc kubenswrapper[4779]: E0320 15:41:18.709639 4779 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 15:41:18 crc kubenswrapper[4779]: E0320 15:41:18.709703 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22109be9-adb4-4573-9075-501c52043d47-cert podName:22109be9-adb4-4573-9075-501c52043d47 nodeName:}" failed. No retries permitted until 2026-03-20 15:41:19.20968158 +0000 UTC m=+1096.172197450 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/22109be9-adb4-4573-9075-501c52043d47-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-qlcwt" (UID: "22109be9-adb4-4573-9075-501c52043d47") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.710269 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-cqzxh" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.713815 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5g4ql\" (UniqueName: \"kubernetes.io/projected/22109be9-adb4-4573-9075-501c52043d47-kube-api-access-5g4ql\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-qlcwt\" (UID: \"22109be9-adb4-4573-9075-501c52043d47\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-qlcwt" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.728242 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7vzrl"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.737836 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-h5k2b" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.744633 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gc65\" (UniqueName: \"kubernetes.io/projected/720d6c6c-aa5e-4fb3-b7a5-148224dff316-kube-api-access-7gc65\") pod \"ovn-operator-controller-manager-884679f54-qbcd4\" (UID: \"720d6c6c-aa5e-4fb3-b7a5-148224dff316\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-qbcd4" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.745019 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g4ql\" (UniqueName: \"kubernetes.io/projected/22109be9-adb4-4573-9075-501c52043d47-kube-api-access-5g4ql\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-qlcwt\" (UID: \"22109be9-adb4-4573-9075-501c52043d47\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-qlcwt" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.749282 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcpxp\" (UniqueName: \"kubernetes.io/projected/67a92119-db58-49a6-a7da-c63c135d6956-kube-api-access-lcpxp\") pod \"placement-operator-controller-manager-5784578c99-xg8mw\" (UID: \"67a92119-db58-49a6-a7da-c63c135d6956\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-xg8mw" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.769343 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-78b4b86d76-x75t6"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.771680 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-78b4b86d76-x75t6" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.775445 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-4f66p" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.781167 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-78b4b86d76-x75t6"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.815273 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t4mv\" (UniqueName: \"kubernetes.io/projected/a72e6f4d-3661-4b7e-92dc-7443c65304e6-kube-api-access-6t4mv\") pod \"test-operator-controller-manager-5c5cb9c4d7-7vzrl\" (UID: \"a72e6f4d-3661-4b7e-92dc-7443c65304e6\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7vzrl" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.815346 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lx6f5\" (UniqueName: \"kubernetes.io/projected/6d5f5fdf-8757-47c9-9c64-376367ba2bfb-kube-api-access-lx6f5\") pod \"swift-operator-controller-manager-c674c5965-pm9mf\" (UID: \"6d5f5fdf-8757-47c9-9c64-376367ba2bfb\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-pm9mf" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.815419 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2596\" (UniqueName: \"kubernetes.io/projected/54f4f092-31d9-47b3-a791-b84c228f4024-kube-api-access-j2596\") pod \"telemetry-operator-controller-manager-d6b694c5-xtbhm\" (UID: \"54f4f092-31d9-47b3-a791-b84c228f4024\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-xtbhm" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.828202 4779 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack-operators/openstack-operator-controller-manager-6b5784dbfc-h2nhw"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.829215 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6b5784dbfc-h2nhw" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.832558 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.832704 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-bdj2j" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.832799 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.844559 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6b5784dbfc-h2nhw"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.859014 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2596\" (UniqueName: \"kubernetes.io/projected/54f4f092-31d9-47b3-a791-b84c228f4024-kube-api-access-j2596\") pod \"telemetry-operator-controller-manager-d6b694c5-xtbhm\" (UID: \"54f4f092-31d9-47b3-a791-b84c228f4024\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-xtbhm" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.863521 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx6f5\" (UniqueName: \"kubernetes.io/projected/6d5f5fdf-8757-47c9-9c64-376367ba2bfb-kube-api-access-lx6f5\") pod \"swift-operator-controller-manager-c674c5965-pm9mf\" (UID: \"6d5f5fdf-8757-47c9-9c64-376367ba2bfb\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-pm9mf" Mar 20 15:41:18 crc 
kubenswrapper[4779]: I0320 15:41:18.867083 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-qbcd4" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.878191 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mq9mf"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.881740 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mq9mf" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.882773 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mq9mf"] Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.884901 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-mw99t" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.903468 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-xg8mw" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.950952 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q24vl\" (UniqueName: \"kubernetes.io/projected/39438f44-0aea-412c-84e3-4d013dadd573-kube-api-access-q24vl\") pod \"openstack-operator-controller-manager-6b5784dbfc-h2nhw\" (UID: \"39438f44-0aea-412c-84e3-4d013dadd573\") " pod="openstack-operators/openstack-operator-controller-manager-6b5784dbfc-h2nhw" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.951139 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wntrb\" (UniqueName: \"kubernetes.io/projected/f5288943-ed82-43b2-ab12-bab71febb2d9-kube-api-access-wntrb\") pod \"watcher-operator-controller-manager-78b4b86d76-x75t6\" (UID: \"f5288943-ed82-43b2-ab12-bab71febb2d9\") " pod="openstack-operators/watcher-operator-controller-manager-78b4b86d76-x75t6" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.951190 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-webhook-certs\") pod \"openstack-operator-controller-manager-6b5784dbfc-h2nhw\" (UID: \"39438f44-0aea-412c-84e3-4d013dadd573\") " pod="openstack-operators/openstack-operator-controller-manager-6b5784dbfc-h2nhw" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.951227 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-metrics-certs\") pod \"openstack-operator-controller-manager-6b5784dbfc-h2nhw\" (UID: \"39438f44-0aea-412c-84e3-4d013dadd573\") " pod="openstack-operators/openstack-operator-controller-manager-6b5784dbfc-h2nhw" 
Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.951261 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t4mv\" (UniqueName: \"kubernetes.io/projected/a72e6f4d-3661-4b7e-92dc-7443c65304e6-kube-api-access-6t4mv\") pod \"test-operator-controller-manager-5c5cb9c4d7-7vzrl\" (UID: \"a72e6f4d-3661-4b7e-92dc-7443c65304e6\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7vzrl" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.954659 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-xtbhm" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.956196 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-pm9mf" Mar 20 15:41:18 crc kubenswrapper[4779]: I0320 15:41:18.982170 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t4mv\" (UniqueName: \"kubernetes.io/projected/a72e6f4d-3661-4b7e-92dc-7443c65304e6-kube-api-access-6t4mv\") pod \"test-operator-controller-manager-5c5cb9c4d7-7vzrl\" (UID: \"a72e6f4d-3661-4b7e-92dc-7443c65304e6\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7vzrl" Mar 20 15:41:19 crc kubenswrapper[4779]: I0320 15:41:19.020756 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7vzrl" Mar 20 15:41:19 crc kubenswrapper[4779]: I0320 15:41:19.053600 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wntrb\" (UniqueName: \"kubernetes.io/projected/f5288943-ed82-43b2-ab12-bab71febb2d9-kube-api-access-wntrb\") pod \"watcher-operator-controller-manager-78b4b86d76-x75t6\" (UID: \"f5288943-ed82-43b2-ab12-bab71febb2d9\") " pod="openstack-operators/watcher-operator-controller-manager-78b4b86d76-x75t6" Mar 20 15:41:19 crc kubenswrapper[4779]: I0320 15:41:19.053658 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-webhook-certs\") pod \"openstack-operator-controller-manager-6b5784dbfc-h2nhw\" (UID: \"39438f44-0aea-412c-84e3-4d013dadd573\") " pod="openstack-operators/openstack-operator-controller-manager-6b5784dbfc-h2nhw" Mar 20 15:41:19 crc kubenswrapper[4779]: I0320 15:41:19.053683 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-metrics-certs\") pod \"openstack-operator-controller-manager-6b5784dbfc-h2nhw\" (UID: \"39438f44-0aea-412c-84e3-4d013dadd573\") " pod="openstack-operators/openstack-operator-controller-manager-6b5784dbfc-h2nhw" Mar 20 15:41:19 crc kubenswrapper[4779]: I0320 15:41:19.053759 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wn27\" (UniqueName: \"kubernetes.io/projected/7de4ed63-6a6f-420f-a769-eeafd4a87eef-kube-api-access-4wn27\") pod \"rabbitmq-cluster-operator-manager-668c99d594-mq9mf\" (UID: \"7de4ed63-6a6f-420f-a769-eeafd4a87eef\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mq9mf" Mar 20 15:41:19 crc kubenswrapper[4779]: I0320 15:41:19.053802 4779 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q24vl\" (UniqueName: \"kubernetes.io/projected/39438f44-0aea-412c-84e3-4d013dadd573-kube-api-access-q24vl\") pod \"openstack-operator-controller-manager-6b5784dbfc-h2nhw\" (UID: \"39438f44-0aea-412c-84e3-4d013dadd573\") " pod="openstack-operators/openstack-operator-controller-manager-6b5784dbfc-h2nhw" Mar 20 15:41:19 crc kubenswrapper[4779]: I0320 15:41:19.053830 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/edf58803-5b67-47e2-a6f1-4998820acc34-cert\") pod \"infra-operator-controller-manager-7b9c774f96-n5x2d\" (UID: \"edf58803-5b67-47e2-a6f1-4998820acc34\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-n5x2d" Mar 20 15:41:19 crc kubenswrapper[4779]: E0320 15:41:19.053951 4779 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 15:41:19 crc kubenswrapper[4779]: E0320 15:41:19.054027 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edf58803-5b67-47e2-a6f1-4998820acc34-cert podName:edf58803-5b67-47e2-a6f1-4998820acc34 nodeName:}" failed. No retries permitted until 2026-03-20 15:41:20.053993275 +0000 UTC m=+1097.016509075 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/edf58803-5b67-47e2-a6f1-4998820acc34-cert") pod "infra-operator-controller-manager-7b9c774f96-n5x2d" (UID: "edf58803-5b67-47e2-a6f1-4998820acc34") : secret "infra-operator-webhook-server-cert" not found Mar 20 15:41:19 crc kubenswrapper[4779]: E0320 15:41:19.054693 4779 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 15:41:19 crc kubenswrapper[4779]: E0320 15:41:19.054736 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-webhook-certs podName:39438f44-0aea-412c-84e3-4d013dadd573 nodeName:}" failed. No retries permitted until 2026-03-20 15:41:19.554724882 +0000 UTC m=+1096.517240682 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-webhook-certs") pod "openstack-operator-controller-manager-6b5784dbfc-h2nhw" (UID: "39438f44-0aea-412c-84e3-4d013dadd573") : secret "webhook-server-cert" not found Mar 20 15:41:19 crc kubenswrapper[4779]: E0320 15:41:19.054783 4779 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 15:41:19 crc kubenswrapper[4779]: E0320 15:41:19.054815 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-metrics-certs podName:39438f44-0aea-412c-84e3-4d013dadd573 nodeName:}" failed. No retries permitted until 2026-03-20 15:41:19.554802734 +0000 UTC m=+1096.517318544 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-metrics-certs") pod "openstack-operator-controller-manager-6b5784dbfc-h2nhw" (UID: "39438f44-0aea-412c-84e3-4d013dadd573") : secret "metrics-server-cert" not found Mar 20 15:41:19 crc kubenswrapper[4779]: I0320 15:41:19.082828 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q24vl\" (UniqueName: \"kubernetes.io/projected/39438f44-0aea-412c-84e3-4d013dadd573-kube-api-access-q24vl\") pod \"openstack-operator-controller-manager-6b5784dbfc-h2nhw\" (UID: \"39438f44-0aea-412c-84e3-4d013dadd573\") " pod="openstack-operators/openstack-operator-controller-manager-6b5784dbfc-h2nhw" Mar 20 15:41:19 crc kubenswrapper[4779]: I0320 15:41:19.093129 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-t5d5w"] Mar 20 15:41:19 crc kubenswrapper[4779]: I0320 15:41:19.097101 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wntrb\" (UniqueName: \"kubernetes.io/projected/f5288943-ed82-43b2-ab12-bab71febb2d9-kube-api-access-wntrb\") pod \"watcher-operator-controller-manager-78b4b86d76-x75t6\" (UID: \"f5288943-ed82-43b2-ab12-bab71febb2d9\") " pod="openstack-operators/watcher-operator-controller-manager-78b4b86d76-x75t6" Mar 20 15:41:19 crc kubenswrapper[4779]: I0320 15:41:19.154517 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wn27\" (UniqueName: \"kubernetes.io/projected/7de4ed63-6a6f-420f-a769-eeafd4a87eef-kube-api-access-4wn27\") pod \"rabbitmq-cluster-operator-manager-668c99d594-mq9mf\" (UID: \"7de4ed63-6a6f-420f-a769-eeafd4a87eef\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mq9mf" Mar 20 15:41:19 crc kubenswrapper[4779]: W0320 15:41:19.156385 4779 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fd922a6_0fbd_4458_b83c_b599e2988c7a.slice/crio-6f35f714d2ea248a25ca8c5ee4a8530f88964a2b255923104183ee96da56fd4c WatchSource:0}: Error finding container 6f35f714d2ea248a25ca8c5ee4a8530f88964a2b255923104183ee96da56fd4c: Status 404 returned error can't find the container with id 6f35f714d2ea248a25ca8c5ee4a8530f88964a2b255923104183ee96da56fd4c Mar 20 15:41:19 crc kubenswrapper[4779]: I0320 15:41:19.163736 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-78b4b86d76-x75t6" Mar 20 15:41:19 crc kubenswrapper[4779]: I0320 15:41:19.177853 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wn27\" (UniqueName: \"kubernetes.io/projected/7de4ed63-6a6f-420f-a769-eeafd4a87eef-kube-api-access-4wn27\") pod \"rabbitmq-cluster-operator-manager-668c99d594-mq9mf\" (UID: \"7de4ed63-6a6f-420f-a769-eeafd4a87eef\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mq9mf" Mar 20 15:41:19 crc kubenswrapper[4779]: I0320 15:41:19.211338 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mq9mf" Mar 20 15:41:19 crc kubenswrapper[4779]: I0320 15:41:19.256925 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22109be9-adb4-4573-9075-501c52043d47-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-qlcwt\" (UID: \"22109be9-adb4-4573-9075-501c52043d47\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-qlcwt" Mar 20 15:41:19 crc kubenswrapper[4779]: E0320 15:41:19.257074 4779 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 15:41:19 crc kubenswrapper[4779]: E0320 15:41:19.257151 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22109be9-adb4-4573-9075-501c52043d47-cert podName:22109be9-adb4-4573-9075-501c52043d47 nodeName:}" failed. No retries permitted until 2026-03-20 15:41:20.257133405 +0000 UTC m=+1097.219649205 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/22109be9-adb4-4573-9075-501c52043d47-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-qlcwt" (UID: "22109be9-adb4-4573-9075-501c52043d47") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 15:41:19 crc kubenswrapper[4779]: I0320 15:41:19.279100 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-t6fr6"] Mar 20 15:41:19 crc kubenswrapper[4779]: W0320 15:41:19.290189 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3e44c59_8e80_4e8c_956f_10b8091f819f.slice/crio-d7889df4a73a3b46e01eb985e69037938bc2c939af93b6b425ff9c18a283f504 WatchSource:0}: Error finding container d7889df4a73a3b46e01eb985e69037938bc2c939af93b6b425ff9c18a283f504: Status 404 returned error can't find the container with id d7889df4a73a3b46e01eb985e69037938bc2c939af93b6b425ff9c18a283f504 Mar 20 15:41:19 crc kubenswrapper[4779]: W0320 15:41:19.380953 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod069366bd_aca3_43b9_9335_7d98fb20d4b7.slice/crio-6dedb41f8f57d9e320963b84717ae4e8772767f4b26bb598440b7eff3619339e WatchSource:0}: Error finding container 6dedb41f8f57d9e320963b84717ae4e8772767f4b26bb598440b7eff3619339e: Status 404 returned error can't find the container with id 6dedb41f8f57d9e320963b84717ae4e8772767f4b26bb598440b7eff3619339e Mar 20 15:41:19 crc kubenswrapper[4779]: W0320 15:41:19.382883 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddee98c8_526f_459c_a9eb_ea96bc062ff5.slice/crio-8e5b5b0280076c91bde20e78fd0b4838745d00ad9982af8aa91fb2fd52623260 WatchSource:0}: Error finding container 
8e5b5b0280076c91bde20e78fd0b4838745d00ad9982af8aa91fb2fd52623260: Status 404 returned error can't find the container with id 8e5b5b0280076c91bde20e78fd0b4838745d00ad9982af8aa91fb2fd52623260 Mar 20 15:41:19 crc kubenswrapper[4779]: I0320 15:41:19.385895 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-5tdlb"] Mar 20 15:41:19 crc kubenswrapper[4779]: I0320 15:41:19.400127 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-pt5hw"] Mar 20 15:41:19 crc kubenswrapper[4779]: I0320 15:41:19.571842 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-webhook-certs\") pod \"openstack-operator-controller-manager-6b5784dbfc-h2nhw\" (UID: \"39438f44-0aea-412c-84e3-4d013dadd573\") " pod="openstack-operators/openstack-operator-controller-manager-6b5784dbfc-h2nhw" Mar 20 15:41:19 crc kubenswrapper[4779]: I0320 15:41:19.571895 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-metrics-certs\") pod \"openstack-operator-controller-manager-6b5784dbfc-h2nhw\" (UID: \"39438f44-0aea-412c-84e3-4d013dadd573\") " pod="openstack-operators/openstack-operator-controller-manager-6b5784dbfc-h2nhw" Mar 20 15:41:19 crc kubenswrapper[4779]: E0320 15:41:19.572056 4779 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 15:41:19 crc kubenswrapper[4779]: E0320 15:41:19.572136 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-metrics-certs podName:39438f44-0aea-412c-84e3-4d013dadd573 nodeName:}" failed. 
No retries permitted until 2026-03-20 15:41:20.572097787 +0000 UTC m=+1097.534613587 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-metrics-certs") pod "openstack-operator-controller-manager-6b5784dbfc-h2nhw" (UID: "39438f44-0aea-412c-84e3-4d013dadd573") : secret "metrics-server-cert" not found Mar 20 15:41:19 crc kubenswrapper[4779]: E0320 15:41:19.572248 4779 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 15:41:19 crc kubenswrapper[4779]: E0320 15:41:19.572317 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-webhook-certs podName:39438f44-0aea-412c-84e3-4d013dadd573 nodeName:}" failed. No retries permitted until 2026-03-20 15:41:20.572298612 +0000 UTC m=+1097.534814412 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-webhook-certs") pod "openstack-operator-controller-manager-6b5784dbfc-h2nhw" (UID: "39438f44-0aea-412c-84e3-4d013dadd573") : secret "webhook-server-cert" not found Mar 20 15:41:19 crc kubenswrapper[4779]: W0320 15:41:19.572877 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod679a5132_11e9_477f_954d_1eb244f67d9c.slice/crio-2aa24c2df118af11662397e15d34ad354d91c026bac904de7e111186e4bd49cb WatchSource:0}: Error finding container 2aa24c2df118af11662397e15d34ad354d91c026bac904de7e111186e4bd49cb: Status 404 returned error can't find the container with id 2aa24c2df118af11662397e15d34ad354d91c026bac904de7e111186e4bd49cb Mar 20 15:41:19 crc kubenswrapper[4779]: I0320 15:41:19.574029 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-qgxj7"] Mar 20 
15:41:19 crc kubenswrapper[4779]: I0320 15:41:19.580546 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-c8w7p"] Mar 20 15:41:19 crc kubenswrapper[4779]: I0320 15:41:19.588229 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-zgd77"] Mar 20 15:41:19 crc kubenswrapper[4779]: W0320 15:41:19.588293 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b1813e8_5330_4de2_ad05_1f56fcc1cfac.slice/crio-bfe846a73abf07c1fbf66bfdb91091ed75c0a458bc3907b678ea35adc0932ff1 WatchSource:0}: Error finding container bfe846a73abf07c1fbf66bfdb91091ed75c0a458bc3907b678ea35adc0932ff1: Status 404 returned error can't find the container with id bfe846a73abf07c1fbf66bfdb91091ed75c0a458bc3907b678ea35adc0932ff1 Mar 20 15:41:19 crc kubenswrapper[4779]: I0320 15:41:19.753572 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-qck9t"] Mar 20 15:41:19 crc kubenswrapper[4779]: I0320 15:41:19.771771 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-ggsnk"] Mar 20 15:41:19 crc kubenswrapper[4779]: I0320 15:41:19.802477 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-8pcbl"] Mar 20 15:41:19 crc kubenswrapper[4779]: I0320 15:41:19.809478 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-qbcd4"] Mar 20 15:41:19 crc kubenswrapper[4779]: W0320 15:41:19.809721 4779 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod720d6c6c_aa5e_4fb3_b7a5_148224dff316.slice/crio-be3e3e5ceb2c953ce7d0f7e7b317b7fb3e64b22e18632f349fe70e42eb9df8ae WatchSource:0}: Error finding container be3e3e5ceb2c953ce7d0f7e7b317b7fb3e64b22e18632f349fe70e42eb9df8ae: Status 404 returned error can't find the container with id be3e3e5ceb2c953ce7d0f7e7b317b7fb3e64b22e18632f349fe70e42eb9df8ae Mar 20 15:41:19 crc kubenswrapper[4779]: W0320 15:41:19.824702 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2a2462d_b0d7_43e7_a7ef_93a8a4e84113.slice/crio-8d536d9584f8dcfcda9d1a2a3f43566d485a00d289cd34d453935251c5b9e830 WatchSource:0}: Error finding container 8d536d9584f8dcfcda9d1a2a3f43566d485a00d289cd34d453935251c5b9e830: Status 404 returned error can't find the container with id 8d536d9584f8dcfcda9d1a2a3f43566d485a00d289cd34d453935251c5b9e830 Mar 20 15:41:19 crc kubenswrapper[4779]: I0320 15:41:19.831498 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-cqzxh"] Mar 20 15:41:19 crc kubenswrapper[4779]: E0320 15:41:19.846719 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bkj97,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-768b96df4c-gqx5g_openstack-operators(ccef47f8-908b-4765-abf1-d218024c98bf): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 15:41:19 crc kubenswrapper[4779]: E0320 15:41:19.847664 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lcpxp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5784578c99-xg8mw_openstack-operators(67a92119-db58-49a6-a7da-c63c135d6956): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 15:41:19 crc kubenswrapper[4779]: E0320 15:41:19.849420 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-xg8mw" podUID="67a92119-db58-49a6-a7da-c63c135d6956" Mar 20 15:41:19 crc kubenswrapper[4779]: E0320 15:41:19.849471 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-gqx5g" podUID="ccef47f8-908b-4765-abf1-d218024c98bf" Mar 20 15:41:19 crc kubenswrapper[4779]: I0320 15:41:19.858638 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-h5k2b"] Mar 20 15:41:19 crc kubenswrapper[4779]: I0320 15:41:19.866358 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-gqx5g"] Mar 20 15:41:19 crc kubenswrapper[4779]: I0320 15:41:19.871037 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-xg8mw"] Mar 20 15:41:19 crc kubenswrapper[4779]: I0320 15:41:19.934032 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-pm9mf"] Mar 20 15:41:19 crc kubenswrapper[4779]: W0320 15:41:19.938258 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5288943_ed82_43b2_ab12_bab71febb2d9.slice/crio-5040ee8c1fedf7b5272aa628e8476320b01d7cbe5611d96752491298218fbc8a WatchSource:0}: Error finding container 5040ee8c1fedf7b5272aa628e8476320b01d7cbe5611d96752491298218fbc8a: Status 404 returned error can't find the container with id 5040ee8c1fedf7b5272aa628e8476320b01d7cbe5611d96752491298218fbc8a Mar 20 15:41:19 crc kubenswrapper[4779]: I0320 15:41:19.939633 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mq9mf"] Mar 20 15:41:19 crc kubenswrapper[4779]: E0320 15:41:19.943519 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.222:5001/openstack-k8s-operators/watcher-operator:ee00c2d330b27d46c48ac29a20680b56ca50df3c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: 
{{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wntrb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-78b4b86d76-x75t6_openstack-operators(f5288943-ed82-43b2-ab12-bab71febb2d9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 15:41:19 crc kubenswrapper[4779]: E0320 15:41:19.944717 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" 
pod="openstack-operators/watcher-operator-controller-manager-78b4b86d76-x75t6" podUID="f5288943-ed82-43b2-ab12-bab71febb2d9" Mar 20 15:41:19 crc kubenswrapper[4779]: E0320 15:41:19.944996 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lx6f5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-c674c5965-pm9mf_openstack-operators(6d5f5fdf-8757-47c9-9c64-376367ba2bfb): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 15:41:19 crc kubenswrapper[4779]: I0320 15:41:19.945082 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-78b4b86d76-x75t6"] Mar 20 15:41:19 crc kubenswrapper[4779]: E0320 15:41:19.946097 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-pm9mf" podUID="6d5f5fdf-8757-47c9-9c64-376367ba2bfb" Mar 20 15:41:19 crc kubenswrapper[4779]: E0320 15:41:19.948925 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6t4mv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-7vzrl_openstack-operators(a72e6f4d-3661-4b7e-92dc-7443c65304e6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 15:41:19 crc kubenswrapper[4779]: E0320 15:41:19.949364 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 
500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4wn27,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-mq9mf_openstack-operators(7de4ed63-6a6f-420f-a769-eeafd4a87eef): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 15:41:19 crc kubenswrapper[4779]: E0320 15:41:19.950064 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7vzrl" podUID="a72e6f4d-3661-4b7e-92dc-7443c65304e6" Mar 20 15:41:19 crc kubenswrapper[4779]: E0320 15:41:19.950908 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mq9mf" podUID="7de4ed63-6a6f-420f-a769-eeafd4a87eef" Mar 20 15:41:19 crc kubenswrapper[4779]: I0320 15:41:19.951774 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7vzrl"] Mar 20 15:41:19 crc kubenswrapper[4779]: W0320 15:41:19.959703 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54f4f092_31d9_47b3_a791_b84c228f4024.slice/crio-c23131fd99ff52d5cd6adec19a0593302482c515a8776ffcd575f60ec7613411 WatchSource:0}: Error finding container c23131fd99ff52d5cd6adec19a0593302482c515a8776ffcd575f60ec7613411: Status 404 returned error can't find the container with id c23131fd99ff52d5cd6adec19a0593302482c515a8776ffcd575f60ec7613411 Mar 20 15:41:19 crc kubenswrapper[4779]: I0320 15:41:19.961663 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-xtbhm"] Mar 20 15:41:19 crc kubenswrapper[4779]: E0320 15:41:19.962405 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-j2596,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-d6b694c5-xtbhm_openstack-operators(54f4f092-31d9-47b3-a791-b84c228f4024): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 15:41:19 crc kubenswrapper[4779]: E0320 15:41:19.963525 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-xtbhm" podUID="54f4f092-31d9-47b3-a791-b84c228f4024" Mar 20 15:41:20 crc 
kubenswrapper[4779]: I0320 15:41:20.027597 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-qck9t" event={"ID":"d61cd2c5-5321-418f-b506-a14210a24e95","Type":"ContainerStarted","Data":"595a0b2e331fc596ac5802ca32366f4b538edf20d2dba6851171fc3f0013a1a7"} Mar 20 15:41:20 crc kubenswrapper[4779]: I0320 15:41:20.035232 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-t6fr6" event={"ID":"c3e44c59-8e80-4e8c-956f-10b8091f819f","Type":"ContainerStarted","Data":"d7889df4a73a3b46e01eb985e69037938bc2c939af93b6b425ff9c18a283f504"} Mar 20 15:41:20 crc kubenswrapper[4779]: I0320 15:41:20.056921 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-xtbhm" event={"ID":"54f4f092-31d9-47b3-a791-b84c228f4024","Type":"ContainerStarted","Data":"c23131fd99ff52d5cd6adec19a0593302482c515a8776ffcd575f60ec7613411"} Mar 20 15:41:20 crc kubenswrapper[4779]: E0320 15:41:20.061339 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-xtbhm" podUID="54f4f092-31d9-47b3-a791-b84c228f4024" Mar 20 15:41:20 crc kubenswrapper[4779]: I0320 15:41:20.065207 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-zgd77" event={"ID":"84524c2d-5bf3-42ff-ac4e-7d5aea8c9772","Type":"ContainerStarted","Data":"c32f4b2289bd257fa42e925b070cea744de6b65d6f8ecda05197ea21501327f3"} Mar 20 15:41:20 crc kubenswrapper[4779]: I0320 15:41:20.083231 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/edf58803-5b67-47e2-a6f1-4998820acc34-cert\") pod \"infra-operator-controller-manager-7b9c774f96-n5x2d\" (UID: \"edf58803-5b67-47e2-a6f1-4998820acc34\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-n5x2d" Mar 20 15:41:20 crc kubenswrapper[4779]: E0320 15:41:20.083429 4779 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 15:41:20 crc kubenswrapper[4779]: E0320 15:41:20.083482 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edf58803-5b67-47e2-a6f1-4998820acc34-cert podName:edf58803-5b67-47e2-a6f1-4998820acc34 nodeName:}" failed. No retries permitted until 2026-03-20 15:41:22.083467592 +0000 UTC m=+1099.045983392 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/edf58803-5b67-47e2-a6f1-4998820acc34-cert") pod "infra-operator-controller-manager-7b9c774f96-n5x2d" (UID: "edf58803-5b67-47e2-a6f1-4998820acc34") : secret "infra-operator-webhook-server-cert" not found Mar 20 15:41:20 crc kubenswrapper[4779]: I0320 15:41:20.091814 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-qgxj7" event={"ID":"679a5132-11e9-477f-954d-1eb244f67d9c","Type":"ContainerStarted","Data":"2aa24c2df118af11662397e15d34ad354d91c026bac904de7e111186e4bd49cb"} Mar 20 15:41:20 crc kubenswrapper[4779]: I0320 15:41:20.120735 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-ggsnk" event={"ID":"3ca155df-e0dd-42b3-92bf-98a7e8037f02","Type":"ContainerStarted","Data":"01a1bb02f35da43bf06cc2d07bf3af7dd5fd1812856738001b72c515dcd00f49"} Mar 20 15:41:20 crc kubenswrapper[4779]: I0320 15:41:20.133798 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ovn-operator-controller-manager-884679f54-qbcd4" event={"ID":"720d6c6c-aa5e-4fb3-b7a5-148224dff316","Type":"ContainerStarted","Data":"be3e3e5ceb2c953ce7d0f7e7b317b7fb3e64b22e18632f349fe70e42eb9df8ae"} Mar 20 15:41:20 crc kubenswrapper[4779]: I0320 15:41:20.135338 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-t5d5w" event={"ID":"3fd922a6-0fbd-4458-b83c-b599e2988c7a","Type":"ContainerStarted","Data":"6f35f714d2ea248a25ca8c5ee4a8530f88964a2b255923104183ee96da56fd4c"} Mar 20 15:41:20 crc kubenswrapper[4779]: I0320 15:41:20.136036 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mq9mf" event={"ID":"7de4ed63-6a6f-420f-a769-eeafd4a87eef","Type":"ContainerStarted","Data":"059862c5e900ac49db1ccada4c660dbf5e184863d2f755ddc74601be7a2af923"} Mar 20 15:41:20 crc kubenswrapper[4779]: E0320 15:41:20.141412 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mq9mf" podUID="7de4ed63-6a6f-420f-a769-eeafd4a87eef" Mar 20 15:41:20 crc kubenswrapper[4779]: I0320 15:41:20.146426 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-8pcbl" event={"ID":"b2a2462d-b0d7-43e7-a7ef-93a8a4e84113","Type":"ContainerStarted","Data":"8d536d9584f8dcfcda9d1a2a3f43566d485a00d289cd34d453935251c5b9e830"} Mar 20 15:41:20 crc kubenswrapper[4779]: I0320 15:41:20.160800 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-xg8mw" 
event={"ID":"67a92119-db58-49a6-a7da-c63c135d6956","Type":"ContainerStarted","Data":"e764c546c25059d86c640527af2b9d9c2a77a3e3efb365ded8d93af525fc8c30"} Mar 20 15:41:20 crc kubenswrapper[4779]: E0320 15:41:20.162353 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-xg8mw" podUID="67a92119-db58-49a6-a7da-c63c135d6956" Mar 20 15:41:20 crc kubenswrapper[4779]: I0320 15:41:20.162698 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-pt5hw" event={"ID":"ddee98c8-526f-459c-a9eb-ea96bc062ff5","Type":"ContainerStarted","Data":"8e5b5b0280076c91bde20e78fd0b4838745d00ad9982af8aa91fb2fd52623260"} Mar 20 15:41:20 crc kubenswrapper[4779]: I0320 15:41:20.175067 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-78b4b86d76-x75t6" event={"ID":"f5288943-ed82-43b2-ab12-bab71febb2d9","Type":"ContainerStarted","Data":"5040ee8c1fedf7b5272aa628e8476320b01d7cbe5611d96752491298218fbc8a"} Mar 20 15:41:20 crc kubenswrapper[4779]: E0320 15:41:20.176884 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.222:5001/openstack-k8s-operators/watcher-operator:ee00c2d330b27d46c48ac29a20680b56ca50df3c\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-78b4b86d76-x75t6" podUID="f5288943-ed82-43b2-ab12-bab71febb2d9" Mar 20 15:41:20 crc kubenswrapper[4779]: I0320 15:41:20.177003 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-h5k2b" 
event={"ID":"87fb9e77-cea0-478a-8cd9-c8c15d42065e","Type":"ContainerStarted","Data":"073f27cdf902688988e59596f5600b6ae54d502d0a1934b48672cc6862df4e6b"} Mar 20 15:41:20 crc kubenswrapper[4779]: I0320 15:41:20.179280 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-pm9mf" event={"ID":"6d5f5fdf-8757-47c9-9c64-376367ba2bfb","Type":"ContainerStarted","Data":"1a92f8259bad8574d2d14f34f7bf5202f30dbb9b40efc743b3db427aaabc9d9e"} Mar 20 15:41:20 crc kubenswrapper[4779]: I0320 15:41:20.180265 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-c8w7p" event={"ID":"3b1813e8-5330-4de2-ad05-1f56fcc1cfac","Type":"ContainerStarted","Data":"bfe846a73abf07c1fbf66bfdb91091ed75c0a458bc3907b678ea35adc0932ff1"} Mar 20 15:41:20 crc kubenswrapper[4779]: E0320 15:41:20.180593 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-pm9mf" podUID="6d5f5fdf-8757-47c9-9c64-376367ba2bfb" Mar 20 15:41:20 crc kubenswrapper[4779]: I0320 15:41:20.181702 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-5tdlb" event={"ID":"069366bd-aca3-43b9-9335-7d98fb20d4b7","Type":"ContainerStarted","Data":"6dedb41f8f57d9e320963b84717ae4e8772767f4b26bb598440b7eff3619339e"} Mar 20 15:41:20 crc kubenswrapper[4779]: I0320 15:41:20.183321 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-cqzxh" 
event={"ID":"4ee9a64d-677e-4d45-8032-f33a4e91ee2c","Type":"ContainerStarted","Data":"4af353f2632edfd081a914b2ca70bb0c846b70d4d279fbe52cc8942f28359772"} Mar 20 15:41:20 crc kubenswrapper[4779]: I0320 15:41:20.185014 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-gqx5g" event={"ID":"ccef47f8-908b-4765-abf1-d218024c98bf","Type":"ContainerStarted","Data":"3033efabcd91389a059c6d70b77df0ee2e92b69674767417402717056fd20df7"} Mar 20 15:41:20 crc kubenswrapper[4779]: I0320 15:41:20.186033 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7vzrl" event={"ID":"a72e6f4d-3661-4b7e-92dc-7443c65304e6","Type":"ContainerStarted","Data":"6436012f996378b5df28a54893089c4744e239c31d5cf7f30dec6baa04fbc40c"} Mar 20 15:41:20 crc kubenswrapper[4779]: E0320 15:41:20.186522 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-gqx5g" podUID="ccef47f8-908b-4765-abf1-d218024c98bf" Mar 20 15:41:20 crc kubenswrapper[4779]: E0320 15:41:20.187453 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7vzrl" podUID="a72e6f4d-3661-4b7e-92dc-7443c65304e6" Mar 20 15:41:20 crc kubenswrapper[4779]: E0320 15:41:20.286691 4779 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret 
"openstack-baremetal-operator-webhook-server-cert" not found Mar 20 15:41:20 crc kubenswrapper[4779]: E0320 15:41:20.286763 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22109be9-adb4-4573-9075-501c52043d47-cert podName:22109be9-adb4-4573-9075-501c52043d47 nodeName:}" failed. No retries permitted until 2026-03-20 15:41:22.286743576 +0000 UTC m=+1099.249259376 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/22109be9-adb4-4573-9075-501c52043d47-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-qlcwt" (UID: "22109be9-adb4-4573-9075-501c52043d47") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 15:41:20 crc kubenswrapper[4779]: I0320 15:41:20.286525 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22109be9-adb4-4573-9075-501c52043d47-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-qlcwt\" (UID: \"22109be9-adb4-4573-9075-501c52043d47\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-qlcwt" Mar 20 15:41:20 crc kubenswrapper[4779]: I0320 15:41:20.597773 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-webhook-certs\") pod \"openstack-operator-controller-manager-6b5784dbfc-h2nhw\" (UID: \"39438f44-0aea-412c-84e3-4d013dadd573\") " pod="openstack-operators/openstack-operator-controller-manager-6b5784dbfc-h2nhw" Mar 20 15:41:20 crc kubenswrapper[4779]: I0320 15:41:20.597822 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-metrics-certs\") pod \"openstack-operator-controller-manager-6b5784dbfc-h2nhw\" (UID: \"39438f44-0aea-412c-84e3-4d013dadd573\") " 
pod="openstack-operators/openstack-operator-controller-manager-6b5784dbfc-h2nhw" Mar 20 15:41:20 crc kubenswrapper[4779]: E0320 15:41:20.597943 4779 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 15:41:20 crc kubenswrapper[4779]: E0320 15:41:20.597978 4779 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 15:41:20 crc kubenswrapper[4779]: E0320 15:41:20.598015 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-metrics-certs podName:39438f44-0aea-412c-84e3-4d013dadd573 nodeName:}" failed. No retries permitted until 2026-03-20 15:41:22.598001955 +0000 UTC m=+1099.560517755 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-metrics-certs") pod "openstack-operator-controller-manager-6b5784dbfc-h2nhw" (UID: "39438f44-0aea-412c-84e3-4d013dadd573") : secret "metrics-server-cert" not found Mar 20 15:41:20 crc kubenswrapper[4779]: E0320 15:41:20.598035 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-webhook-certs podName:39438f44-0aea-412c-84e3-4d013dadd573 nodeName:}" failed. No retries permitted until 2026-03-20 15:41:22.598029895 +0000 UTC m=+1099.560545695 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-webhook-certs") pod "openstack-operator-controller-manager-6b5784dbfc-h2nhw" (UID: "39438f44-0aea-412c-84e3-4d013dadd573") : secret "webhook-server-cert" not found Mar 20 15:41:21 crc kubenswrapper[4779]: E0320 15:41:21.205768 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-xg8mw" podUID="67a92119-db58-49a6-a7da-c63c135d6956" Mar 20 15:41:21 crc kubenswrapper[4779]: E0320 15:41:21.205895 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.222:5001/openstack-k8s-operators/watcher-operator:ee00c2d330b27d46c48ac29a20680b56ca50df3c\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-78b4b86d76-x75t6" podUID="f5288943-ed82-43b2-ab12-bab71febb2d9" Mar 20 15:41:21 crc kubenswrapper[4779]: E0320 15:41:21.205948 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-pm9mf" podUID="6d5f5fdf-8757-47c9-9c64-376367ba2bfb" Mar 20 15:41:21 crc kubenswrapper[4779]: E0320 15:41:21.206154 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-gqx5g" podUID="ccef47f8-908b-4765-abf1-d218024c98bf" Mar 20 15:41:21 crc kubenswrapper[4779]: E0320 15:41:21.206228 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7vzrl" podUID="a72e6f4d-3661-4b7e-92dc-7443c65304e6" Mar 20 15:41:21 crc kubenswrapper[4779]: E0320 15:41:21.206448 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mq9mf" podUID="7de4ed63-6a6f-420f-a769-eeafd4a87eef" Mar 20 15:41:21 crc kubenswrapper[4779]: E0320 15:41:21.206528 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-xtbhm" podUID="54f4f092-31d9-47b3-a791-b84c228f4024" Mar 20 15:41:22 crc kubenswrapper[4779]: I0320 15:41:22.152928 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/edf58803-5b67-47e2-a6f1-4998820acc34-cert\") pod \"infra-operator-controller-manager-7b9c774f96-n5x2d\" (UID: 
\"edf58803-5b67-47e2-a6f1-4998820acc34\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-n5x2d" Mar 20 15:41:22 crc kubenswrapper[4779]: E0320 15:41:22.153178 4779 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 15:41:22 crc kubenswrapper[4779]: E0320 15:41:22.153246 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edf58803-5b67-47e2-a6f1-4998820acc34-cert podName:edf58803-5b67-47e2-a6f1-4998820acc34 nodeName:}" failed. No retries permitted until 2026-03-20 15:41:26.153228666 +0000 UTC m=+1103.115744466 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/edf58803-5b67-47e2-a6f1-4998820acc34-cert") pod "infra-operator-controller-manager-7b9c774f96-n5x2d" (UID: "edf58803-5b67-47e2-a6f1-4998820acc34") : secret "infra-operator-webhook-server-cert" not found Mar 20 15:41:22 crc kubenswrapper[4779]: I0320 15:41:22.354762 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22109be9-adb4-4573-9075-501c52043d47-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-qlcwt\" (UID: \"22109be9-adb4-4573-9075-501c52043d47\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-qlcwt" Mar 20 15:41:22 crc kubenswrapper[4779]: E0320 15:41:22.354964 4779 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 15:41:22 crc kubenswrapper[4779]: E0320 15:41:22.355007 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22109be9-adb4-4573-9075-501c52043d47-cert podName:22109be9-adb4-4573-9075-501c52043d47 nodeName:}" failed. 
No retries permitted until 2026-03-20 15:41:26.354994772 +0000 UTC m=+1103.317510572 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/22109be9-adb4-4573-9075-501c52043d47-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-qlcwt" (UID: "22109be9-adb4-4573-9075-501c52043d47") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 15:41:22 crc kubenswrapper[4779]: I0320 15:41:22.809542 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-webhook-certs\") pod \"openstack-operator-controller-manager-6b5784dbfc-h2nhw\" (UID: \"39438f44-0aea-412c-84e3-4d013dadd573\") " pod="openstack-operators/openstack-operator-controller-manager-6b5784dbfc-h2nhw" Mar 20 15:41:22 crc kubenswrapper[4779]: I0320 15:41:22.809600 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-metrics-certs\") pod \"openstack-operator-controller-manager-6b5784dbfc-h2nhw\" (UID: \"39438f44-0aea-412c-84e3-4d013dadd573\") " pod="openstack-operators/openstack-operator-controller-manager-6b5784dbfc-h2nhw" Mar 20 15:41:22 crc kubenswrapper[4779]: E0320 15:41:22.809822 4779 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 15:41:22 crc kubenswrapper[4779]: E0320 15:41:22.809876 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-metrics-certs podName:39438f44-0aea-412c-84e3-4d013dadd573 nodeName:}" failed. No retries permitted until 2026-03-20 15:41:26.809859457 +0000 UTC m=+1103.772375257 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-metrics-certs") pod "openstack-operator-controller-manager-6b5784dbfc-h2nhw" (UID: "39438f44-0aea-412c-84e3-4d013dadd573") : secret "metrics-server-cert" not found Mar 20 15:41:22 crc kubenswrapper[4779]: E0320 15:41:22.809932 4779 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 15:41:22 crc kubenswrapper[4779]: E0320 15:41:22.809960 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-webhook-certs podName:39438f44-0aea-412c-84e3-4d013dadd573 nodeName:}" failed. No retries permitted until 2026-03-20 15:41:26.809951579 +0000 UTC m=+1103.772467379 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-webhook-certs") pod "openstack-operator-controller-manager-6b5784dbfc-h2nhw" (UID: "39438f44-0aea-412c-84e3-4d013dadd573") : secret "webhook-server-cert" not found Mar 20 15:41:26 crc kubenswrapper[4779]: I0320 15:41:26.157363 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/edf58803-5b67-47e2-a6f1-4998820acc34-cert\") pod \"infra-operator-controller-manager-7b9c774f96-n5x2d\" (UID: \"edf58803-5b67-47e2-a6f1-4998820acc34\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-n5x2d" Mar 20 15:41:26 crc kubenswrapper[4779]: E0320 15:41:26.157525 4779 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 15:41:26 crc kubenswrapper[4779]: E0320 15:41:26.157852 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edf58803-5b67-47e2-a6f1-4998820acc34-cert 
podName:edf58803-5b67-47e2-a6f1-4998820acc34 nodeName:}" failed. No retries permitted until 2026-03-20 15:41:34.157834438 +0000 UTC m=+1111.120350238 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/edf58803-5b67-47e2-a6f1-4998820acc34-cert") pod "infra-operator-controller-manager-7b9c774f96-n5x2d" (UID: "edf58803-5b67-47e2-a6f1-4998820acc34") : secret "infra-operator-webhook-server-cert" not found
Mar 20 15:41:26 crc kubenswrapper[4779]: I0320 15:41:26.359819 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22109be9-adb4-4573-9075-501c52043d47-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-qlcwt\" (UID: \"22109be9-adb4-4573-9075-501c52043d47\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-qlcwt"
Mar 20 15:41:26 crc kubenswrapper[4779]: E0320 15:41:26.359980 4779 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 20 15:41:26 crc kubenswrapper[4779]: E0320 15:41:26.360048 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22109be9-adb4-4573-9075-501c52043d47-cert podName:22109be9-adb4-4573-9075-501c52043d47 nodeName:}" failed. No retries permitted until 2026-03-20 15:41:34.360030596 +0000 UTC m=+1111.322546386 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/22109be9-adb4-4573-9075-501c52043d47-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-qlcwt" (UID: "22109be9-adb4-4573-9075-501c52043d47") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 20 15:41:26 crc kubenswrapper[4779]: I0320 15:41:26.880357 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-webhook-certs\") pod \"openstack-operator-controller-manager-6b5784dbfc-h2nhw\" (UID: \"39438f44-0aea-412c-84e3-4d013dadd573\") " pod="openstack-operators/openstack-operator-controller-manager-6b5784dbfc-h2nhw"
Mar 20 15:41:26 crc kubenswrapper[4779]: I0320 15:41:26.880414 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-metrics-certs\") pod \"openstack-operator-controller-manager-6b5784dbfc-h2nhw\" (UID: \"39438f44-0aea-412c-84e3-4d013dadd573\") " pod="openstack-operators/openstack-operator-controller-manager-6b5784dbfc-h2nhw"
Mar 20 15:41:26 crc kubenswrapper[4779]: E0320 15:41:26.880504 4779 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 20 15:41:26 crc kubenswrapper[4779]: E0320 15:41:26.880502 4779 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 20 15:41:26 crc kubenswrapper[4779]: E0320 15:41:26.880558 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-metrics-certs podName:39438f44-0aea-412c-84e3-4d013dadd573 nodeName:}" failed. No retries permitted until 2026-03-20 15:41:34.880544435 +0000 UTC m=+1111.843060235 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-metrics-certs") pod "openstack-operator-controller-manager-6b5784dbfc-h2nhw" (UID: "39438f44-0aea-412c-84e3-4d013dadd573") : secret "metrics-server-cert" not found
Mar 20 15:41:26 crc kubenswrapper[4779]: E0320 15:41:26.880571 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-webhook-certs podName:39438f44-0aea-412c-84e3-4d013dadd573 nodeName:}" failed. No retries permitted until 2026-03-20 15:41:34.880566546 +0000 UTC m=+1111.843082346 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-webhook-certs") pod "openstack-operator-controller-manager-6b5784dbfc-h2nhw" (UID: "39438f44-0aea-412c-84e3-4d013dadd573") : secret "webhook-server-cert" not found
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.292416 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-c8w7p" event={"ID":"3b1813e8-5330-4de2-ad05-1f56fcc1cfac","Type":"ContainerStarted","Data":"5ee94bd6baf142420db92f07a831eb182b4a0e64c7f7e08c93c9b6f2fe96b4a1"}
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.293881 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-c8w7p"
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.293973 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-pt5hw"
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.294068 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-pt5hw" event={"ID":"ddee98c8-526f-459c-a9eb-ea96bc062ff5","Type":"ContainerStarted","Data":"513dbfe8a08d9b00aeb956bf040e3c98e195dcbebcb4de7338d2bc2473ec44c3"}
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.294905 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-qck9t" event={"ID":"d61cd2c5-5321-418f-b506-a14210a24e95","Type":"ContainerStarted","Data":"79db12fa6bc8e017dc427a82917d9ad25e6a3175f2584109612d3a6c85a90588"}
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.296328 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-zgd77" event={"ID":"84524c2d-5bf3-42ff-ac4e-7d5aea8c9772","Type":"ContainerStarted","Data":"754b6bdc376e1c8c49d4cbe90aa4e45e6825d57bcc02deaab32110f85b3a1c82"}
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.296425 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-zgd77"
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.297420 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-h5k2b" event={"ID":"87fb9e77-cea0-478a-8cd9-c8c15d42065e","Type":"ContainerStarted","Data":"ea9f923e813c91bc906c559af21044da38f49e5e2f301f9502ab54fcf02cbb67"}
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.297570 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-h5k2b"
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.298504 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-ggsnk" event={"ID":"3ca155df-e0dd-42b3-92bf-98a7e8037f02","Type":"ContainerStarted","Data":"e81087eed606e134019fe8a000c9912446a444d441845763ce9cebfa57c4ea2c"}
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.298916 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-ggsnk"
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.299902 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-t5d5w" event={"ID":"3fd922a6-0fbd-4458-b83c-b599e2988c7a","Type":"ContainerStarted","Data":"dedfc23f39f7eeeaec707cf033ad6eff37efad0031575594e2bc78ed38623533"}
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.300321 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-t5d5w"
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.301226 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-5tdlb" event={"ID":"069366bd-aca3-43b9-9335-7d98fb20d4b7","Type":"ContainerStarted","Data":"dcae2aa9d39c0a6ba3382ec52ea7155ae68cc9913f1daedb3a71f4cf0fee09a2"}
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.301636 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-5tdlb"
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.302541 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-8pcbl" event={"ID":"b2a2462d-b0d7-43e7-a7ef-93a8a4e84113","Type":"ContainerStarted","Data":"56564bf6139466f7f9355031eed7253e4ceb375a1e9b3edb22f0fb3abfa7a2ac"}
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.303429 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-8pcbl"
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.304376 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-cqzxh" event={"ID":"4ee9a64d-677e-4d45-8032-f33a4e91ee2c","Type":"ContainerStarted","Data":"828502a1adb1a0f0da23a8189a2a7c59a2f7b2497c1516db3356f7805c58e70a"}
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.304782 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-cqzxh"
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.306207 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-t6fr6" event={"ID":"c3e44c59-8e80-4e8c-956f-10b8091f819f","Type":"ContainerStarted","Data":"c11b18bfc4fc4782660d3bb107105309ec569e2c6e22910206813e0b250df69b"}
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.306638 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-t6fr6"
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.311744 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-qgxj7" event={"ID":"679a5132-11e9-477f-954d-1eb244f67d9c","Type":"ContainerStarted","Data":"ec62e60c6daadf107cbcc9624bf66d32e525d28467cc82c896e54823ab51a903"}
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.311812 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-qgxj7"
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.312586 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-c8w7p" podStartSLOduration=2.683169485 podStartE2EDuration="13.312574515s" podCreationTimestamp="2026-03-20 15:41:18 +0000 UTC" firstStartedPulling="2026-03-20 15:41:19.59227146 +0000 UTC m=+1096.554787260" lastFinishedPulling="2026-03-20 15:41:30.22167647 +0000 UTC m=+1107.184192290" observedRunningTime="2026-03-20 15:41:31.310507901 +0000 UTC m=+1108.273023701" watchObservedRunningTime="2026-03-20 15:41:31.312574515 +0000 UTC m=+1108.275090315"
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.312863 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-qbcd4" event={"ID":"720d6c6c-aa5e-4fb3-b7a5-148224dff316","Type":"ContainerStarted","Data":"14a20f3d80231d26e6fc9c6545bd7a019eced6191994647cffc082d8c4706c63"}
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.313328 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-qbcd4"
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.362991 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-t6fr6" podStartSLOduration=2.491636658 podStartE2EDuration="13.362974273s" podCreationTimestamp="2026-03-20 15:41:18 +0000 UTC" firstStartedPulling="2026-03-20 15:41:19.291910672 +0000 UTC m=+1096.254426472" lastFinishedPulling="2026-03-20 15:41:30.163248287 +0000 UTC m=+1107.125764087" observedRunningTime="2026-03-20 15:41:31.34331361 +0000 UTC m=+1108.305829410" watchObservedRunningTime="2026-03-20 15:41:31.362974273 +0000 UTC m=+1108.325490073"
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.393303 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-t5d5w" podStartSLOduration=2.381036881 podStartE2EDuration="13.393288687s" podCreationTimestamp="2026-03-20 15:41:18 +0000 UTC" firstStartedPulling="2026-03-20 15:41:19.176093582 +0000 UTC m=+1096.138609382" lastFinishedPulling="2026-03-20 15:41:30.188345388 +0000 UTC m=+1107.150861188" observedRunningTime="2026-03-20 15:41:31.389527221 +0000 UTC m=+1108.352043021" watchObservedRunningTime="2026-03-20 15:41:31.393288687 +0000 UTC m=+1108.355804487"
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.396287 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-5tdlb" podStartSLOduration=2.615967323 podStartE2EDuration="13.396277134s" podCreationTimestamp="2026-03-20 15:41:18 +0000 UTC" firstStartedPulling="2026-03-20 15:41:19.38352067 +0000 UTC m=+1096.346036470" lastFinishedPulling="2026-03-20 15:41:30.163830481 +0000 UTC m=+1107.126346281" observedRunningTime="2026-03-20 15:41:31.368584936 +0000 UTC m=+1108.331100736" watchObservedRunningTime="2026-03-20 15:41:31.396277134 +0000 UTC m=+1108.358792934"
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.409337 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-qck9t" podStartSLOduration=3.005707653 podStartE2EDuration="13.409318087s" podCreationTimestamp="2026-03-20 15:41:18 +0000 UTC" firstStartedPulling="2026-03-20 15:41:19.769753761 +0000 UTC m=+1096.732269551" lastFinishedPulling="2026-03-20 15:41:30.173364185 +0000 UTC m=+1107.135879985" observedRunningTime="2026-03-20 15:41:31.404647888 +0000 UTC m=+1108.367163678" watchObservedRunningTime="2026-03-20 15:41:31.409318087 +0000 UTC m=+1108.371833887"
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.425766 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-cqzxh" podStartSLOduration=3.069429275 podStartE2EDuration="13.425751887s" podCreationTimestamp="2026-03-20 15:41:18 +0000 UTC" firstStartedPulling="2026-03-20 15:41:19.830423555 +0000 UTC m=+1096.792939355" lastFinishedPulling="2026-03-20 15:41:30.186746167 +0000 UTC m=+1107.149261967" observedRunningTime="2026-03-20 15:41:31.42352847 +0000 UTC m=+1108.386044260" watchObservedRunningTime="2026-03-20 15:41:31.425751887 +0000 UTC m=+1108.388267687"
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.454779 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-zgd77" podStartSLOduration=2.881134434 podStartE2EDuration="13.454761067s" podCreationTimestamp="2026-03-20 15:41:18 +0000 UTC" firstStartedPulling="2026-03-20 15:41:19.589663785 +0000 UTC m=+1096.552179585" lastFinishedPulling="2026-03-20 15:41:30.163290418 +0000 UTC m=+1107.125806218" observedRunningTime="2026-03-20 15:41:31.449572396 +0000 UTC m=+1108.412088196" watchObservedRunningTime="2026-03-20 15:41:31.454761067 +0000 UTC m=+1108.417276867"
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.479862 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-pt5hw" podStartSLOduration=2.74852257 podStartE2EDuration="13.479847379s" podCreationTimestamp="2026-03-20 15:41:18 +0000 UTC" firstStartedPulling="2026-03-20 15:41:19.398879583 +0000 UTC m=+1096.361395383" lastFinishedPulling="2026-03-20 15:41:30.130204392 +0000 UTC m=+1107.092720192" observedRunningTime="2026-03-20 15:41:31.476267158 +0000 UTC m=+1108.438782958" watchObservedRunningTime="2026-03-20 15:41:31.479847379 +0000 UTC m=+1108.442363169"
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.498737 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-ggsnk" podStartSLOduration=3.106609875 podStartE2EDuration="13.498716501s" podCreationTimestamp="2026-03-20 15:41:18 +0000 UTC" firstStartedPulling="2026-03-20 15:41:19.795466652 +0000 UTC m=+1096.757982452" lastFinishedPulling="2026-03-20 15:41:30.187573278 +0000 UTC m=+1107.150089078" observedRunningTime="2026-03-20 15:41:31.493093418 +0000 UTC m=+1108.455609208" watchObservedRunningTime="2026-03-20 15:41:31.498716501 +0000 UTC m=+1108.461232301"
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.513172 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-8pcbl" podStartSLOduration=3.17018576 podStartE2EDuration="13.51315553s" podCreationTimestamp="2026-03-20 15:41:18 +0000 UTC" firstStartedPulling="2026-03-20 15:41:19.844995168 +0000 UTC m=+1096.807510968" lastFinishedPulling="2026-03-20 15:41:30.187964938 +0000 UTC m=+1107.150480738" observedRunningTime="2026-03-20 15:41:31.511036816 +0000 UTC m=+1108.473552636" watchObservedRunningTime="2026-03-20 15:41:31.51315553 +0000 UTC m=+1108.475671330"
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.539869 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-h5k2b" podStartSLOduration=3.225461786 podStartE2EDuration="13.539852932s" podCreationTimestamp="2026-03-20 15:41:18 +0000 UTC" firstStartedPulling="2026-03-20 15:41:19.840398364 +0000 UTC m=+1096.802914164" lastFinishedPulling="2026-03-20 15:41:30.15478951 +0000 UTC m=+1107.117305310" observedRunningTime="2026-03-20 15:41:31.530313058 +0000 UTC m=+1108.492828858" watchObservedRunningTime="2026-03-20 15:41:31.539852932 +0000 UTC m=+1108.502368732"
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.555280 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-qgxj7" podStartSLOduration=2.919638968 podStartE2EDuration="13.555264006s" podCreationTimestamp="2026-03-20 15:41:18 +0000 UTC" firstStartedPulling="2026-03-20 15:41:19.575386969 +0000 UTC m=+1096.537902769" lastFinishedPulling="2026-03-20 15:41:30.211012007 +0000 UTC m=+1107.173527807" observedRunningTime="2026-03-20 15:41:31.553985943 +0000 UTC m=+1108.516501743" watchObservedRunningTime="2026-03-20 15:41:31.555264006 +0000 UTC m=+1108.517779806"
Mar 20 15:41:31 crc kubenswrapper[4779]: I0320 15:41:31.581867 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-qbcd4" podStartSLOduration=3.187710525 podStartE2EDuration="13.581846355s" podCreationTimestamp="2026-03-20 15:41:18 +0000 UTC" firstStartedPulling="2026-03-20 15:41:19.816158609 +0000 UTC m=+1096.778674409" lastFinishedPulling="2026-03-20 15:41:30.210294439 +0000 UTC m=+1107.172810239" observedRunningTime="2026-03-20 15:41:31.577943635 +0000 UTC m=+1108.540459435" watchObservedRunningTime="2026-03-20 15:41:31.581846355 +0000 UTC m=+1108.544362155"
Mar 20 15:41:32 crc kubenswrapper[4779]: I0320 15:41:32.320954 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-qck9t"
Mar 20 15:41:34 crc kubenswrapper[4779]: I0320 15:41:34.197609 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/edf58803-5b67-47e2-a6f1-4998820acc34-cert\") pod \"infra-operator-controller-manager-7b9c774f96-n5x2d\" (UID: \"edf58803-5b67-47e2-a6f1-4998820acc34\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-n5x2d"
Mar 20 15:41:34 crc kubenswrapper[4779]: E0320 15:41:34.197784 4779 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 20 15:41:34 crc kubenswrapper[4779]: E0320 15:41:34.198218 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edf58803-5b67-47e2-a6f1-4998820acc34-cert podName:edf58803-5b67-47e2-a6f1-4998820acc34 nodeName:}" failed. No retries permitted until 2026-03-20 15:41:50.198197849 +0000 UTC m=+1127.160713669 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/edf58803-5b67-47e2-a6f1-4998820acc34-cert") pod "infra-operator-controller-manager-7b9c774f96-n5x2d" (UID: "edf58803-5b67-47e2-a6f1-4998820acc34") : secret "infra-operator-webhook-server-cert" not found
Mar 20 15:41:34 crc kubenswrapper[4779]: I0320 15:41:34.400097 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22109be9-adb4-4573-9075-501c52043d47-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-qlcwt\" (UID: \"22109be9-adb4-4573-9075-501c52043d47\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-qlcwt"
Mar 20 15:41:34 crc kubenswrapper[4779]: E0320 15:41:34.400259 4779 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 20 15:41:34 crc kubenswrapper[4779]: E0320 15:41:34.400307 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22109be9-adb4-4573-9075-501c52043d47-cert podName:22109be9-adb4-4573-9075-501c52043d47 nodeName:}" failed. No retries permitted until 2026-03-20 15:41:50.400291523 +0000 UTC m=+1127.362807323 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/22109be9-adb4-4573-9075-501c52043d47-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-qlcwt" (UID: "22109be9-adb4-4573-9075-501c52043d47") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 20 15:41:34 crc kubenswrapper[4779]: I0320 15:41:34.907005 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-webhook-certs\") pod \"openstack-operator-controller-manager-6b5784dbfc-h2nhw\" (UID: \"39438f44-0aea-412c-84e3-4d013dadd573\") " pod="openstack-operators/openstack-operator-controller-manager-6b5784dbfc-h2nhw"
Mar 20 15:41:34 crc kubenswrapper[4779]: I0320 15:41:34.907066 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-metrics-certs\") pod \"openstack-operator-controller-manager-6b5784dbfc-h2nhw\" (UID: \"39438f44-0aea-412c-84e3-4d013dadd573\") " pod="openstack-operators/openstack-operator-controller-manager-6b5784dbfc-h2nhw"
Mar 20 15:41:34 crc kubenswrapper[4779]: E0320 15:41:34.908071 4779 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 20 15:41:34 crc kubenswrapper[4779]: E0320 15:41:34.908156 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-webhook-certs podName:39438f44-0aea-412c-84e3-4d013dadd573 nodeName:}" failed. No retries permitted until 2026-03-20 15:41:50.908137361 +0000 UTC m=+1127.870653161 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-webhook-certs") pod "openstack-operator-controller-manager-6b5784dbfc-h2nhw" (UID: "39438f44-0aea-412c-84e3-4d013dadd573") : secret "webhook-server-cert" not found
Mar 20 15:41:34 crc kubenswrapper[4779]: E0320 15:41:34.908533 4779 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 20 15:41:34 crc kubenswrapper[4779]: E0320 15:41:34.908576 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-metrics-certs podName:39438f44-0aea-412c-84e3-4d013dadd573 nodeName:}" failed. No retries permitted until 2026-03-20 15:41:50.908564851 +0000 UTC m=+1127.871080651 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-metrics-certs") pod "openstack-operator-controller-manager-6b5784dbfc-h2nhw" (UID: "39438f44-0aea-412c-84e3-4d013dadd573") : secret "metrics-server-cert" not found
Mar 20 15:41:35 crc kubenswrapper[4779]: I0320 15:41:35.343570 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7vzrl" event={"ID":"a72e6f4d-3661-4b7e-92dc-7443c65304e6","Type":"ContainerStarted","Data":"3321f411f14c8677703ece43265096131572962ad279603686452766af41a8fd"}
Mar 20 15:41:35 crc kubenswrapper[4779]: I0320 15:41:35.344394 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7vzrl"
Mar 20 15:41:35 crc kubenswrapper[4779]: I0320 15:41:35.364210 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7vzrl" podStartSLOduration=2.867598149 podStartE2EDuration="17.364176443s" podCreationTimestamp="2026-03-20 15:41:18 +0000 UTC" firstStartedPulling="2026-03-20 15:41:19.94877828 +0000 UTC m=+1096.911294080" lastFinishedPulling="2026-03-20 15:41:34.445356574 +0000 UTC m=+1111.407872374" observedRunningTime="2026-03-20 15:41:35.358037017 +0000 UTC m=+1112.320552827" watchObservedRunningTime="2026-03-20 15:41:35.364176443 +0000 UTC m=+1112.326692243"
Mar 20 15:41:38 crc kubenswrapper[4779]: I0320 15:41:38.413627 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-t5d5w"
Mar 20 15:41:38 crc kubenswrapper[4779]: I0320 15:41:38.423703 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-t6fr6"
Mar 20 15:41:38 crc kubenswrapper[4779]: I0320 15:41:38.440482 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-5tdlb"
Mar 20 15:41:38 crc kubenswrapper[4779]: I0320 15:41:38.463835 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-pt5hw"
Mar 20 15:41:38 crc kubenswrapper[4779]: I0320 15:41:38.497624 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-qgxj7"
Mar 20 15:41:38 crc kubenswrapper[4779]: I0320 15:41:38.530666 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-c8w7p"
Mar 20 15:41:38 crc kubenswrapper[4779]: I0320 15:41:38.567556 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-qck9t"
Mar 20 15:41:38 crc kubenswrapper[4779]: I0320 15:41:38.613920 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-ggsnk"
Mar 20 15:41:38 crc kubenswrapper[4779]: I0320 15:41:38.671328 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-zgd77"
Mar 20 15:41:38 crc kubenswrapper[4779]: I0320 15:41:38.695788 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-8pcbl"
Mar 20 15:41:38 crc kubenswrapper[4779]: I0320 15:41:38.719177 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-cqzxh"
Mar 20 15:41:38 crc kubenswrapper[4779]: I0320 15:41:38.744737 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-h5k2b"
Mar 20 15:41:38 crc kubenswrapper[4779]: I0320 15:41:38.871302 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-qbcd4"
Mar 20 15:41:39 crc kubenswrapper[4779]: I0320 15:41:39.023513 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7vzrl"
Mar 20 15:41:39 crc kubenswrapper[4779]: I0320 15:41:39.373181 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-xtbhm" event={"ID":"54f4f092-31d9-47b3-a791-b84c228f4024","Type":"ContainerStarted","Data":"ead54b529d8669319629a47ec5cec81c38151e8c7020778ee6b99f9aa653249b"}
Mar 20 15:41:39 crc kubenswrapper[4779]: I0320 15:41:39.373397 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-xtbhm"
Mar 20 15:41:39 crc kubenswrapper[4779]: I0320 15:41:39.375090 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-pm9mf" event={"ID":"6d5f5fdf-8757-47c9-9c64-376367ba2bfb","Type":"ContainerStarted","Data":"45e0975fe0ae0f7132fd084df59878843c0f4946b548ea44ec560a40644e3a27"}
Mar 20 15:41:39 crc kubenswrapper[4779]: I0320 15:41:39.375234 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-pm9mf"
Mar 20 15:41:39 crc kubenswrapper[4779]: I0320 15:41:39.377558 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mq9mf" event={"ID":"7de4ed63-6a6f-420f-a769-eeafd4a87eef","Type":"ContainerStarted","Data":"8d9c15a73f43700d21c6f0e0b0366c4169340977689ac9aded60923859733520"}
Mar 20 15:41:39 crc kubenswrapper[4779]: I0320 15:41:39.378952 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-xg8mw" event={"ID":"67a92119-db58-49a6-a7da-c63c135d6956","Type":"ContainerStarted","Data":"d74e81b3e1e015e87c5025bf8307b0ace5406a0873191d2ad3ebc68122b30340"}
Mar 20 15:41:39 crc kubenswrapper[4779]: I0320 15:41:39.379146 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-xg8mw"
Mar 20 15:41:39 crc kubenswrapper[4779]: I0320 15:41:39.393317 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-xtbhm" podStartSLOduration=2.49699927 podStartE2EDuration="21.393299588s" podCreationTimestamp="2026-03-20 15:41:18 +0000 UTC" firstStartedPulling="2026-03-20 15:41:19.962249185 +0000 UTC m=+1096.924764985" lastFinishedPulling="2026-03-20 15:41:38.858549503 +0000 UTC m=+1115.821065303" observedRunningTime="2026-03-20 15:41:39.390298881 +0000 UTC m=+1116.352814681" watchObservedRunningTime="2026-03-20 15:41:39.393299588 +0000 UTC m=+1116.355815388"
Mar 20 15:41:39 crc kubenswrapper[4779]: I0320 15:41:39.404074 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mq9mf" podStartSLOduration=2.48102579 podStartE2EDuration="21.404054473s" podCreationTimestamp="2026-03-20 15:41:18 +0000 UTC" firstStartedPulling="2026-03-20 15:41:19.949240171 +0000 UTC m=+1096.911755971" lastFinishedPulling="2026-03-20 15:41:38.872268854 +0000 UTC m=+1115.834784654" observedRunningTime="2026-03-20 15:41:39.404042602 +0000 UTC m=+1116.366558402" watchObservedRunningTime="2026-03-20 15:41:39.404054473 +0000 UTC m=+1116.366570273"
Mar 20 15:41:39 crc kubenswrapper[4779]: I0320 15:41:39.420446 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-pm9mf" podStartSLOduration=2.526830692 podStartE2EDuration="21.4204271s" podCreationTimestamp="2026-03-20 15:41:18 +0000 UTC" firstStartedPulling="2026-03-20 15:41:19.944854912 +0000 UTC m=+1096.907370712" lastFinishedPulling="2026-03-20 15:41:38.83845133 +0000 UTC m=+1115.800967120" observedRunningTime="2026-03-20 15:41:39.41804449 +0000 UTC m=+1116.380560300" watchObservedRunningTime="2026-03-20 15:41:39.4204271 +0000 UTC m=+1116.382942900"
Mar 20 15:41:39 crc kubenswrapper[4779]: I0320 15:41:39.437784 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-xg8mw" podStartSLOduration=2.448835687 podStartE2EDuration="21.437764184s" podCreationTimestamp="2026-03-20 15:41:18 +0000 UTC" firstStartedPulling="2026-03-20 15:41:19.847429519 +0000 UTC m=+1096.809945319" lastFinishedPulling="2026-03-20 15:41:38.836358006 +0000 UTC m=+1115.798873816" observedRunningTime="2026-03-20 15:41:39.434352526 +0000 UTC m=+1116.396868326" watchObservedRunningTime="2026-03-20 15:41:39.437764184 +0000 UTC m=+1116.400279984"
Mar 20 15:41:41 crc kubenswrapper[4779]: I0320 15:41:41.393450 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-gqx5g" event={"ID":"ccef47f8-908b-4765-abf1-d218024c98bf","Type":"ContainerStarted","Data":"d32c867b260479f86674c70575a90c974cb6f511a63c83283c2fce8036a8255b"}
Mar 20 15:41:41 crc kubenswrapper[4779]: I0320 15:41:41.393987 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-gqx5g"
Mar 20 15:41:41 crc kubenswrapper[4779]: I0320 15:41:41.394999 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-78b4b86d76-x75t6" event={"ID":"f5288943-ed82-43b2-ab12-bab71febb2d9","Type":"ContainerStarted","Data":"377739ef97cff605f9a1d377ce5d5b23a6ac1fa72c52050659b36058575b2314"}
Mar 20 15:41:41 crc kubenswrapper[4779]: I0320 15:41:41.395257 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-78b4b86d76-x75t6"
Mar 20 15:41:41 crc kubenswrapper[4779]: I0320 15:41:41.408873 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-gqx5g" podStartSLOduration=2.738700724 podStartE2EDuration="23.40885771s" podCreationTimestamp="2026-03-20 15:41:18 +0000 UTC" firstStartedPulling="2026-03-20 15:41:19.846594169 +0000 UTC m=+1096.809109969" lastFinishedPulling="2026-03-20 15:41:40.516751155 +0000 UTC m=+1117.479266955" observedRunningTime="2026-03-20 15:41:41.408284525 +0000 UTC m=+1118.370800325" watchObservedRunningTime="2026-03-20 15:41:41.40885771 +0000 UTC m=+1118.371373510"
Mar 20 15:41:41 crc kubenswrapper[4779]: I0320 15:41:41.427122 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-78b4b86d76-x75t6" podStartSLOduration=2.857738349 podStartE2EDuration="23.427084996s" podCreationTimestamp="2026-03-20 15:41:18 +0000 UTC" firstStartedPulling="2026-03-20 15:41:19.943332304 +0000 UTC m=+1096.905848104" lastFinishedPulling="2026-03-20 15:41:40.512678951 +0000 UTC m=+1117.475194751" observedRunningTime="2026-03-20 15:41:41.424059909 +0000 UTC m=+1118.386575719" watchObservedRunningTime="2026-03-20 15:41:41.427084996 +0000 UTC m=+1118.389600796"
Mar 20 15:41:48 crc kubenswrapper[4779]: I0320 15:41:48.592127 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-gqx5g"
Mar 20 15:41:48 crc kubenswrapper[4779]: I0320 15:41:48.906605 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-xg8mw"
Mar 20 15:41:48 crc kubenswrapper[4779]: I0320 15:41:48.958780 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-xtbhm"
Mar 20 15:41:48 crc kubenswrapper[4779]: I0320 15:41:48.959655 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-pm9mf"
Mar 20 15:41:49 crc kubenswrapper[4779]: I0320 15:41:49.166766 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-78b4b86d76-x75t6"
Mar 20 15:41:50 crc kubenswrapper[4779]: I0320 15:41:50.250236 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/edf58803-5b67-47e2-a6f1-4998820acc34-cert\") pod \"infra-operator-controller-manager-7b9c774f96-n5x2d\" (UID: \"edf58803-5b67-47e2-a6f1-4998820acc34\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-n5x2d"
Mar 20 15:41:50 crc kubenswrapper[4779]: I0320 15:41:50.261055 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/edf58803-5b67-47e2-a6f1-4998820acc34-cert\") pod \"infra-operator-controller-manager-7b9c774f96-n5x2d\" (UID: \"edf58803-5b67-47e2-a6f1-4998820acc34\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-n5x2d"
Mar 20 15:41:50 crc kubenswrapper[4779]: I0320 15:41:50.341120 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-n5x2d"
Mar 20 15:41:50 crc kubenswrapper[4779]: I0320 15:41:50.457032 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22109be9-adb4-4573-9075-501c52043d47-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-qlcwt\" (UID: \"22109be9-adb4-4573-9075-501c52043d47\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-qlcwt"
Mar 20 15:41:50 crc kubenswrapper[4779]: I0320 15:41:50.463241 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22109be9-adb4-4573-9075-501c52043d47-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-qlcwt\" (UID: \"22109be9-adb4-4573-9075-501c52043d47\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-qlcwt"
Mar 20 15:41:50 crc kubenswrapper[4779]: I0320 15:41:50.689406 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-qlcwt" Mar 20 15:41:50 crc kubenswrapper[4779]: I0320 15:41:50.726867 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-n5x2d"] Mar 20 15:41:50 crc kubenswrapper[4779]: I0320 15:41:50.965056 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-webhook-certs\") pod \"openstack-operator-controller-manager-6b5784dbfc-h2nhw\" (UID: \"39438f44-0aea-412c-84e3-4d013dadd573\") " pod="openstack-operators/openstack-operator-controller-manager-6b5784dbfc-h2nhw" Mar 20 15:41:50 crc kubenswrapper[4779]: I0320 15:41:50.965165 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-metrics-certs\") pod \"openstack-operator-controller-manager-6b5784dbfc-h2nhw\" (UID: \"39438f44-0aea-412c-84e3-4d013dadd573\") " pod="openstack-operators/openstack-operator-controller-manager-6b5784dbfc-h2nhw" Mar 20 15:41:50 crc kubenswrapper[4779]: I0320 15:41:50.968907 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-metrics-certs\") pod \"openstack-operator-controller-manager-6b5784dbfc-h2nhw\" (UID: \"39438f44-0aea-412c-84e3-4d013dadd573\") " pod="openstack-operators/openstack-operator-controller-manager-6b5784dbfc-h2nhw" Mar 20 15:41:50 crc kubenswrapper[4779]: I0320 15:41:50.969400 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/39438f44-0aea-412c-84e3-4d013dadd573-webhook-certs\") pod \"openstack-operator-controller-manager-6b5784dbfc-h2nhw\" (UID: \"39438f44-0aea-412c-84e3-4d013dadd573\") " 
pod="openstack-operators/openstack-operator-controller-manager-6b5784dbfc-h2nhw" Mar 20 15:41:50 crc kubenswrapper[4779]: I0320 15:41:50.995103 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6b5784dbfc-h2nhw" Mar 20 15:41:51 crc kubenswrapper[4779]: I0320 15:41:51.103918 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-qlcwt"] Mar 20 15:41:51 crc kubenswrapper[4779]: W0320 15:41:51.111806 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22109be9_adb4_4573_9075_501c52043d47.slice/crio-f183f67315fef7cd4f5b2d92008420f01f8e6ac12e9e47c0b72fba91f3e51da0 WatchSource:0}: Error finding container f183f67315fef7cd4f5b2d92008420f01f8e6ac12e9e47c0b72fba91f3e51da0: Status 404 returned error can't find the container with id f183f67315fef7cd4f5b2d92008420f01f8e6ac12e9e47c0b72fba91f3e51da0 Mar 20 15:41:51 crc kubenswrapper[4779]: I0320 15:41:51.409638 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6b5784dbfc-h2nhw"] Mar 20 15:41:51 crc kubenswrapper[4779]: I0320 15:41:51.463436 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-qlcwt" event={"ID":"22109be9-adb4-4573-9075-501c52043d47","Type":"ContainerStarted","Data":"f183f67315fef7cd4f5b2d92008420f01f8e6ac12e9e47c0b72fba91f3e51da0"} Mar 20 15:41:51 crc kubenswrapper[4779]: I0320 15:41:51.464770 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-n5x2d" event={"ID":"edf58803-5b67-47e2-a6f1-4998820acc34","Type":"ContainerStarted","Data":"ff5122b704fa99bb56fdec0c381bbed411db325d9137a6ca5fa0535e2baac635"} Mar 20 15:41:51 crc kubenswrapper[4779]: 
I0320 15:41:51.470512 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6b5784dbfc-h2nhw" event={"ID":"39438f44-0aea-412c-84e3-4d013dadd573","Type":"ContainerStarted","Data":"fa35ce88f49735a90dfc9736949ef307b16a7ccf736fe4c8cf83deaf9d07ed49"} Mar 20 15:41:52 crc kubenswrapper[4779]: I0320 15:41:52.479092 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6b5784dbfc-h2nhw" event={"ID":"39438f44-0aea-412c-84e3-4d013dadd573","Type":"ContainerStarted","Data":"db0eea5902ab71c734a5edf1141c8be01578256e00aef9e3f31fef66070ee7b1"} Mar 20 15:41:53 crc kubenswrapper[4779]: I0320 15:41:53.488402 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6b5784dbfc-h2nhw" Mar 20 15:41:53 crc kubenswrapper[4779]: I0320 15:41:53.834828 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6b5784dbfc-h2nhw" podStartSLOduration=35.834808655 podStartE2EDuration="35.834808655s" podCreationTimestamp="2026-03-20 15:41:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:41:53.517304402 +0000 UTC m=+1130.479820242" watchObservedRunningTime="2026-03-20 15:41:53.834808655 +0000 UTC m=+1130.797324455" Mar 20 15:41:56 crc kubenswrapper[4779]: I0320 15:41:56.509001 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-n5x2d" event={"ID":"edf58803-5b67-47e2-a6f1-4998820acc34","Type":"ContainerStarted","Data":"846fc03b1dc8613155a018e525f290f26e3fc37cdea31262f7279d424b005d7b"} Mar 20 15:41:56 crc kubenswrapper[4779]: I0320 15:41:56.509779 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-n5x2d" Mar 20 15:41:56 crc kubenswrapper[4779]: I0320 15:41:56.511648 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-qlcwt" event={"ID":"22109be9-adb4-4573-9075-501c52043d47","Type":"ContainerStarted","Data":"834adf189402f16de65094995ec52babe1e427cd09911ccf67bb6c70dca173f7"} Mar 20 15:41:56 crc kubenswrapper[4779]: I0320 15:41:56.511782 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-qlcwt" Mar 20 15:41:56 crc kubenswrapper[4779]: I0320 15:41:56.529541 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-n5x2d" podStartSLOduration=33.344572403 podStartE2EDuration="38.529527261s" podCreationTimestamp="2026-03-20 15:41:18 +0000 UTC" firstStartedPulling="2026-03-20 15:41:50.733680394 +0000 UTC m=+1127.696196194" lastFinishedPulling="2026-03-20 15:41:55.918635252 +0000 UTC m=+1132.881151052" observedRunningTime="2026-03-20 15:41:56.526530795 +0000 UTC m=+1133.489046595" watchObservedRunningTime="2026-03-20 15:41:56.529527261 +0000 UTC m=+1133.492043061" Mar 20 15:41:56 crc kubenswrapper[4779]: I0320 15:41:56.556808 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-qlcwt" podStartSLOduration=33.733339748 podStartE2EDuration="38.556784478s" podCreationTimestamp="2026-03-20 15:41:18 +0000 UTC" firstStartedPulling="2026-03-20 15:41:51.113480419 +0000 UTC m=+1128.075996219" lastFinishedPulling="2026-03-20 15:41:55.936925149 +0000 UTC m=+1132.899440949" observedRunningTime="2026-03-20 15:41:56.550776754 +0000 UTC m=+1133.513292554" watchObservedRunningTime="2026-03-20 15:41:56.556784478 +0000 UTC m=+1133.519300278" Mar 20 
15:42:00 crc kubenswrapper[4779]: I0320 15:42:00.136612 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567022-8lmhg"] Mar 20 15:42:00 crc kubenswrapper[4779]: I0320 15:42:00.137449 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567022-8lmhg" Mar 20 15:42:00 crc kubenswrapper[4779]: I0320 15:42:00.139302 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-92t9d" Mar 20 15:42:00 crc kubenswrapper[4779]: I0320 15:42:00.139856 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:42:00 crc kubenswrapper[4779]: I0320 15:42:00.141408 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:42:00 crc kubenswrapper[4779]: I0320 15:42:00.145855 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567022-8lmhg"] Mar 20 15:42:00 crc kubenswrapper[4779]: I0320 15:42:00.189167 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg8bn\" (UniqueName: \"kubernetes.io/projected/df06bf8e-6468-4c2e-bb61-2aa16f6a7caa-kube-api-access-tg8bn\") pod \"auto-csr-approver-29567022-8lmhg\" (UID: \"df06bf8e-6468-4c2e-bb61-2aa16f6a7caa\") " pod="openshift-infra/auto-csr-approver-29567022-8lmhg" Mar 20 15:42:00 crc kubenswrapper[4779]: I0320 15:42:00.290855 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg8bn\" (UniqueName: \"kubernetes.io/projected/df06bf8e-6468-4c2e-bb61-2aa16f6a7caa-kube-api-access-tg8bn\") pod \"auto-csr-approver-29567022-8lmhg\" (UID: \"df06bf8e-6468-4c2e-bb61-2aa16f6a7caa\") " pod="openshift-infra/auto-csr-approver-29567022-8lmhg" Mar 20 15:42:00 crc kubenswrapper[4779]: I0320 15:42:00.309133 4779 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg8bn\" (UniqueName: \"kubernetes.io/projected/df06bf8e-6468-4c2e-bb61-2aa16f6a7caa-kube-api-access-tg8bn\") pod \"auto-csr-approver-29567022-8lmhg\" (UID: \"df06bf8e-6468-4c2e-bb61-2aa16f6a7caa\") " pod="openshift-infra/auto-csr-approver-29567022-8lmhg" Mar 20 15:42:00 crc kubenswrapper[4779]: I0320 15:42:00.453732 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567022-8lmhg" Mar 20 15:42:00 crc kubenswrapper[4779]: I0320 15:42:00.886193 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567022-8lmhg"] Mar 20 15:42:01 crc kubenswrapper[4779]: I0320 15:42:01.002737 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6b5784dbfc-h2nhw" Mar 20 15:42:01 crc kubenswrapper[4779]: I0320 15:42:01.574196 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567022-8lmhg" event={"ID":"df06bf8e-6468-4c2e-bb61-2aa16f6a7caa","Type":"ContainerStarted","Data":"6f43ee220771b4ca2a4ff127da9a7c4b5dfc3df1a345f0552da3fe8222abee13"} Mar 20 15:42:02 crc kubenswrapper[4779]: I0320 15:42:02.582481 4779 generic.go:334] "Generic (PLEG): container finished" podID="df06bf8e-6468-4c2e-bb61-2aa16f6a7caa" containerID="398f8029790e97bdb52d60a3e2fa8e76826115ffbdf452916afd083a637188be" exitCode=0 Mar 20 15:42:02 crc kubenswrapper[4779]: I0320 15:42:02.582590 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567022-8lmhg" event={"ID":"df06bf8e-6468-4c2e-bb61-2aa16f6a7caa","Type":"ContainerDied","Data":"398f8029790e97bdb52d60a3e2fa8e76826115ffbdf452916afd083a637188be"} Mar 20 15:42:03 crc kubenswrapper[4779]: I0320 15:42:03.970333 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567022-8lmhg" Mar 20 15:42:04 crc kubenswrapper[4779]: I0320 15:42:04.153310 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tg8bn\" (UniqueName: \"kubernetes.io/projected/df06bf8e-6468-4c2e-bb61-2aa16f6a7caa-kube-api-access-tg8bn\") pod \"df06bf8e-6468-4c2e-bb61-2aa16f6a7caa\" (UID: \"df06bf8e-6468-4c2e-bb61-2aa16f6a7caa\") " Mar 20 15:42:04 crc kubenswrapper[4779]: I0320 15:42:04.158765 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df06bf8e-6468-4c2e-bb61-2aa16f6a7caa-kube-api-access-tg8bn" (OuterVolumeSpecName: "kube-api-access-tg8bn") pod "df06bf8e-6468-4c2e-bb61-2aa16f6a7caa" (UID: "df06bf8e-6468-4c2e-bb61-2aa16f6a7caa"). InnerVolumeSpecName "kube-api-access-tg8bn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:42:04 crc kubenswrapper[4779]: I0320 15:42:04.254990 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tg8bn\" (UniqueName: \"kubernetes.io/projected/df06bf8e-6468-4c2e-bb61-2aa16f6a7caa-kube-api-access-tg8bn\") on node \"crc\" DevicePath \"\"" Mar 20 15:42:04 crc kubenswrapper[4779]: I0320 15:42:04.607924 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567022-8lmhg" event={"ID":"df06bf8e-6468-4c2e-bb61-2aa16f6a7caa","Type":"ContainerDied","Data":"6f43ee220771b4ca2a4ff127da9a7c4b5dfc3df1a345f0552da3fe8222abee13"} Mar 20 15:42:04 crc kubenswrapper[4779]: I0320 15:42:04.608194 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f43ee220771b4ca2a4ff127da9a7c4b5dfc3df1a345f0552da3fe8222abee13" Mar 20 15:42:04 crc kubenswrapper[4779]: I0320 15:42:04.607972 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567022-8lmhg" Mar 20 15:42:05 crc kubenswrapper[4779]: I0320 15:42:05.067001 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567016-9qvlz"] Mar 20 15:42:05 crc kubenswrapper[4779]: I0320 15:42:05.072927 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567016-9qvlz"] Mar 20 15:42:05 crc kubenswrapper[4779]: I0320 15:42:05.818805 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20e7966a-75f6-48c2-9003-6fea92b1edb1" path="/var/lib/kubelet/pods/20e7966a-75f6-48c2-9003-6fea92b1edb1/volumes" Mar 20 15:42:10 crc kubenswrapper[4779]: I0320 15:42:10.347493 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-n5x2d" Mar 20 15:42:10 crc kubenswrapper[4779]: I0320 15:42:10.696345 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-qlcwt" Mar 20 15:42:19 crc kubenswrapper[4779]: I0320 15:42:19.255784 4779 scope.go:117] "RemoveContainer" containerID="9c8ffde1bab2148cfd49f0273d0c2bf07338fc8c4f43ff0fa376ad594fadd4fd" Mar 20 15:42:33 crc kubenswrapper[4779]: I0320 15:42:33.562884 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-csvht"] Mar 20 15:42:33 crc kubenswrapper[4779]: E0320 15:42:33.563664 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df06bf8e-6468-4c2e-bb61-2aa16f6a7caa" containerName="oc" Mar 20 15:42:33 crc kubenswrapper[4779]: I0320 15:42:33.563676 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="df06bf8e-6468-4c2e-bb61-2aa16f6a7caa" containerName="oc" Mar 20 15:42:33 crc kubenswrapper[4779]: I0320 15:42:33.563808 4779 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="df06bf8e-6468-4c2e-bb61-2aa16f6a7caa" containerName="oc" Mar 20 15:42:33 crc kubenswrapper[4779]: I0320 15:42:33.564594 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-csvht" Mar 20 15:42:33 crc kubenswrapper[4779]: I0320 15:42:33.568702 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 20 15:42:33 crc kubenswrapper[4779]: I0320 15:42:33.568975 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-k9q8j" Mar 20 15:42:33 crc kubenswrapper[4779]: I0320 15:42:33.573788 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 20 15:42:33 crc kubenswrapper[4779]: I0320 15:42:33.583946 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 20 15:42:33 crc kubenswrapper[4779]: I0320 15:42:33.588215 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-csvht"] Mar 20 15:42:33 crc kubenswrapper[4779]: I0320 15:42:33.618013 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-j5jxp"] Mar 20 15:42:33 crc kubenswrapper[4779]: I0320 15:42:33.622691 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-j5jxp" Mar 20 15:42:33 crc kubenswrapper[4779]: I0320 15:42:33.629698 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 20 15:42:33 crc kubenswrapper[4779]: I0320 15:42:33.727988 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-j5jxp"] Mar 20 15:42:33 crc kubenswrapper[4779]: I0320 15:42:33.758194 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0beff5b4-9b70-48b2-85df-85a3800232c8-config\") pod \"dnsmasq-dns-675f4bcbfc-csvht\" (UID: \"0beff5b4-9b70-48b2-85df-85a3800232c8\") " pod="openstack/dnsmasq-dns-675f4bcbfc-csvht" Mar 20 15:42:33 crc kubenswrapper[4779]: I0320 15:42:33.758265 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnpwm\" (UniqueName: \"kubernetes.io/projected/3bb71724-d533-4ba4-97f5-2092d5b9bf37-kube-api-access-xnpwm\") pod \"dnsmasq-dns-78dd6ddcc-j5jxp\" (UID: \"3bb71724-d533-4ba4-97f5-2092d5b9bf37\") " pod="openstack/dnsmasq-dns-78dd6ddcc-j5jxp" Mar 20 15:42:33 crc kubenswrapper[4779]: I0320 15:42:33.758292 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bb71724-d533-4ba4-97f5-2092d5b9bf37-config\") pod \"dnsmasq-dns-78dd6ddcc-j5jxp\" (UID: \"3bb71724-d533-4ba4-97f5-2092d5b9bf37\") " pod="openstack/dnsmasq-dns-78dd6ddcc-j5jxp" Mar 20 15:42:33 crc kubenswrapper[4779]: I0320 15:42:33.759204 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bb71724-d533-4ba4-97f5-2092d5b9bf37-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-j5jxp\" (UID: \"3bb71724-d533-4ba4-97f5-2092d5b9bf37\") " pod="openstack/dnsmasq-dns-78dd6ddcc-j5jxp" Mar 20 15:42:33 
crc kubenswrapper[4779]: I0320 15:42:33.759279 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5zwp\" (UniqueName: \"kubernetes.io/projected/0beff5b4-9b70-48b2-85df-85a3800232c8-kube-api-access-s5zwp\") pod \"dnsmasq-dns-675f4bcbfc-csvht\" (UID: \"0beff5b4-9b70-48b2-85df-85a3800232c8\") " pod="openstack/dnsmasq-dns-675f4bcbfc-csvht" Mar 20 15:42:33 crc kubenswrapper[4779]: I0320 15:42:33.860207 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnpwm\" (UniqueName: \"kubernetes.io/projected/3bb71724-d533-4ba4-97f5-2092d5b9bf37-kube-api-access-xnpwm\") pod \"dnsmasq-dns-78dd6ddcc-j5jxp\" (UID: \"3bb71724-d533-4ba4-97f5-2092d5b9bf37\") " pod="openstack/dnsmasq-dns-78dd6ddcc-j5jxp" Mar 20 15:42:33 crc kubenswrapper[4779]: I0320 15:42:33.860269 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bb71724-d533-4ba4-97f5-2092d5b9bf37-config\") pod \"dnsmasq-dns-78dd6ddcc-j5jxp\" (UID: \"3bb71724-d533-4ba4-97f5-2092d5b9bf37\") " pod="openstack/dnsmasq-dns-78dd6ddcc-j5jxp" Mar 20 15:42:33 crc kubenswrapper[4779]: I0320 15:42:33.860340 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bb71724-d533-4ba4-97f5-2092d5b9bf37-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-j5jxp\" (UID: \"3bb71724-d533-4ba4-97f5-2092d5b9bf37\") " pod="openstack/dnsmasq-dns-78dd6ddcc-j5jxp" Mar 20 15:42:33 crc kubenswrapper[4779]: I0320 15:42:33.860390 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5zwp\" (UniqueName: \"kubernetes.io/projected/0beff5b4-9b70-48b2-85df-85a3800232c8-kube-api-access-s5zwp\") pod \"dnsmasq-dns-675f4bcbfc-csvht\" (UID: \"0beff5b4-9b70-48b2-85df-85a3800232c8\") " pod="openstack/dnsmasq-dns-675f4bcbfc-csvht" Mar 20 15:42:33 crc kubenswrapper[4779]: 
I0320 15:42:33.860491 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0beff5b4-9b70-48b2-85df-85a3800232c8-config\") pod \"dnsmasq-dns-675f4bcbfc-csvht\" (UID: \"0beff5b4-9b70-48b2-85df-85a3800232c8\") " pod="openstack/dnsmasq-dns-675f4bcbfc-csvht" Mar 20 15:42:33 crc kubenswrapper[4779]: I0320 15:42:33.861810 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0beff5b4-9b70-48b2-85df-85a3800232c8-config\") pod \"dnsmasq-dns-675f4bcbfc-csvht\" (UID: \"0beff5b4-9b70-48b2-85df-85a3800232c8\") " pod="openstack/dnsmasq-dns-675f4bcbfc-csvht" Mar 20 15:42:33 crc kubenswrapper[4779]: I0320 15:42:33.862054 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bb71724-d533-4ba4-97f5-2092d5b9bf37-config\") pod \"dnsmasq-dns-78dd6ddcc-j5jxp\" (UID: \"3bb71724-d533-4ba4-97f5-2092d5b9bf37\") " pod="openstack/dnsmasq-dns-78dd6ddcc-j5jxp" Mar 20 15:42:33 crc kubenswrapper[4779]: I0320 15:42:33.862462 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bb71724-d533-4ba4-97f5-2092d5b9bf37-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-j5jxp\" (UID: \"3bb71724-d533-4ba4-97f5-2092d5b9bf37\") " pod="openstack/dnsmasq-dns-78dd6ddcc-j5jxp" Mar 20 15:42:33 crc kubenswrapper[4779]: I0320 15:42:33.883356 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnpwm\" (UniqueName: \"kubernetes.io/projected/3bb71724-d533-4ba4-97f5-2092d5b9bf37-kube-api-access-xnpwm\") pod \"dnsmasq-dns-78dd6ddcc-j5jxp\" (UID: \"3bb71724-d533-4ba4-97f5-2092d5b9bf37\") " pod="openstack/dnsmasq-dns-78dd6ddcc-j5jxp" Mar 20 15:42:33 crc kubenswrapper[4779]: I0320 15:42:33.884019 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5zwp\" (UniqueName: 
\"kubernetes.io/projected/0beff5b4-9b70-48b2-85df-85a3800232c8-kube-api-access-s5zwp\") pod \"dnsmasq-dns-675f4bcbfc-csvht\" (UID: \"0beff5b4-9b70-48b2-85df-85a3800232c8\") " pod="openstack/dnsmasq-dns-675f4bcbfc-csvht" Mar 20 15:42:33 crc kubenswrapper[4779]: I0320 15:42:33.887425 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-csvht" Mar 20 15:42:33 crc kubenswrapper[4779]: I0320 15:42:33.939770 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-j5jxp" Mar 20 15:42:34 crc kubenswrapper[4779]: I0320 15:42:34.325964 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-csvht"] Mar 20 15:42:34 crc kubenswrapper[4779]: I0320 15:42:34.336663 4779 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 15:42:34 crc kubenswrapper[4779]: I0320 15:42:34.408375 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-j5jxp"] Mar 20 15:42:34 crc kubenswrapper[4779]: W0320 15:42:34.412313 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bb71724_d533_4ba4_97f5_2092d5b9bf37.slice/crio-7800a1339a1d945f1e29f90bb243df5b46c080320f7a4cb1c8b1a64d572b38d5 WatchSource:0}: Error finding container 7800a1339a1d945f1e29f90bb243df5b46c080320f7a4cb1c8b1a64d572b38d5: Status 404 returned error can't find the container with id 7800a1339a1d945f1e29f90bb243df5b46c080320f7a4cb1c8b1a64d572b38d5 Mar 20 15:42:34 crc kubenswrapper[4779]: I0320 15:42:34.848554 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-csvht" event={"ID":"0beff5b4-9b70-48b2-85df-85a3800232c8","Type":"ContainerStarted","Data":"f7a4ef2f0fed1cc6a98b188303cb5f82309b911ce7b10264082c82e1cb1271fb"} Mar 20 15:42:34 crc kubenswrapper[4779]: I0320 15:42:34.850136 
4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-j5jxp" event={"ID":"3bb71724-d533-4ba4-97f5-2092d5b9bf37","Type":"ContainerStarted","Data":"7800a1339a1d945f1e29f90bb243df5b46c080320f7a4cb1c8b1a64d572b38d5"} Mar 20 15:42:36 crc kubenswrapper[4779]: I0320 15:42:36.467244 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-csvht"] Mar 20 15:42:36 crc kubenswrapper[4779]: I0320 15:42:36.494691 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-m68jj"] Mar 20 15:42:36 crc kubenswrapper[4779]: I0320 15:42:36.496097 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-m68jj" Mar 20 15:42:36 crc kubenswrapper[4779]: I0320 15:42:36.505232 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-m68jj"] Mar 20 15:42:36 crc kubenswrapper[4779]: I0320 15:42:36.507846 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwh26\" (UniqueName: \"kubernetes.io/projected/7c73163f-7930-48aa-991a-e7e8910c069a-kube-api-access-hwh26\") pod \"dnsmasq-dns-5ccc8479f9-m68jj\" (UID: \"7c73163f-7930-48aa-991a-e7e8910c069a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-m68jj" Mar 20 15:42:36 crc kubenswrapper[4779]: I0320 15:42:36.507916 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c73163f-7930-48aa-991a-e7e8910c069a-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-m68jj\" (UID: \"7c73163f-7930-48aa-991a-e7e8910c069a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-m68jj" Mar 20 15:42:36 crc kubenswrapper[4779]: I0320 15:42:36.508005 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c73163f-7930-48aa-991a-e7e8910c069a-config\") 
pod \"dnsmasq-dns-5ccc8479f9-m68jj\" (UID: \"7c73163f-7930-48aa-991a-e7e8910c069a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-m68jj" Mar 20 15:42:36 crc kubenswrapper[4779]: I0320 15:42:36.609961 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwh26\" (UniqueName: \"kubernetes.io/projected/7c73163f-7930-48aa-991a-e7e8910c069a-kube-api-access-hwh26\") pod \"dnsmasq-dns-5ccc8479f9-m68jj\" (UID: \"7c73163f-7930-48aa-991a-e7e8910c069a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-m68jj" Mar 20 15:42:36 crc kubenswrapper[4779]: I0320 15:42:36.610040 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c73163f-7930-48aa-991a-e7e8910c069a-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-m68jj\" (UID: \"7c73163f-7930-48aa-991a-e7e8910c069a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-m68jj" Mar 20 15:42:36 crc kubenswrapper[4779]: I0320 15:42:36.610072 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c73163f-7930-48aa-991a-e7e8910c069a-config\") pod \"dnsmasq-dns-5ccc8479f9-m68jj\" (UID: \"7c73163f-7930-48aa-991a-e7e8910c069a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-m68jj" Mar 20 15:42:36 crc kubenswrapper[4779]: I0320 15:42:36.611177 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c73163f-7930-48aa-991a-e7e8910c069a-config\") pod \"dnsmasq-dns-5ccc8479f9-m68jj\" (UID: \"7c73163f-7930-48aa-991a-e7e8910c069a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-m68jj" Mar 20 15:42:36 crc kubenswrapper[4779]: I0320 15:42:36.611224 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c73163f-7930-48aa-991a-e7e8910c069a-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-m68jj\" (UID: \"7c73163f-7930-48aa-991a-e7e8910c069a\") " 
pod="openstack/dnsmasq-dns-5ccc8479f9-m68jj" Mar 20 15:42:36 crc kubenswrapper[4779]: I0320 15:42:36.658096 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwh26\" (UniqueName: \"kubernetes.io/projected/7c73163f-7930-48aa-991a-e7e8910c069a-kube-api-access-hwh26\") pod \"dnsmasq-dns-5ccc8479f9-m68jj\" (UID: \"7c73163f-7930-48aa-991a-e7e8910c069a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-m68jj" Mar 20 15:42:36 crc kubenswrapper[4779]: I0320 15:42:36.756685 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-j5jxp"] Mar 20 15:42:36 crc kubenswrapper[4779]: I0320 15:42:36.797748 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dp74h"] Mar 20 15:42:36 crc kubenswrapper[4779]: I0320 15:42:36.799301 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-dp74h" Mar 20 15:42:36 crc kubenswrapper[4779]: I0320 15:42:36.806598 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dp74h"] Mar 20 15:42:36 crc kubenswrapper[4779]: I0320 15:42:36.814317 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6e253fe-fec4-49b3-a9a6-22a7da20f3bd-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-dp74h\" (UID: \"b6e253fe-fec4-49b3-a9a6-22a7da20f3bd\") " pod="openstack/dnsmasq-dns-57d769cc4f-dp74h" Mar 20 15:42:36 crc kubenswrapper[4779]: I0320 15:42:36.814417 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bczzl\" (UniqueName: \"kubernetes.io/projected/b6e253fe-fec4-49b3-a9a6-22a7da20f3bd-kube-api-access-bczzl\") pod \"dnsmasq-dns-57d769cc4f-dp74h\" (UID: \"b6e253fe-fec4-49b3-a9a6-22a7da20f3bd\") " pod="openstack/dnsmasq-dns-57d769cc4f-dp74h" Mar 20 15:42:36 crc kubenswrapper[4779]: I0320 15:42:36.814448 4779 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6e253fe-fec4-49b3-a9a6-22a7da20f3bd-config\") pod \"dnsmasq-dns-57d769cc4f-dp74h\" (UID: \"b6e253fe-fec4-49b3-a9a6-22a7da20f3bd\") " pod="openstack/dnsmasq-dns-57d769cc4f-dp74h" Mar 20 15:42:36 crc kubenswrapper[4779]: I0320 15:42:36.828685 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-m68jj" Mar 20 15:42:36 crc kubenswrapper[4779]: I0320 15:42:36.917091 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bczzl\" (UniqueName: \"kubernetes.io/projected/b6e253fe-fec4-49b3-a9a6-22a7da20f3bd-kube-api-access-bczzl\") pod \"dnsmasq-dns-57d769cc4f-dp74h\" (UID: \"b6e253fe-fec4-49b3-a9a6-22a7da20f3bd\") " pod="openstack/dnsmasq-dns-57d769cc4f-dp74h" Mar 20 15:42:36 crc kubenswrapper[4779]: I0320 15:42:36.917225 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6e253fe-fec4-49b3-a9a6-22a7da20f3bd-config\") pod \"dnsmasq-dns-57d769cc4f-dp74h\" (UID: \"b6e253fe-fec4-49b3-a9a6-22a7da20f3bd\") " pod="openstack/dnsmasq-dns-57d769cc4f-dp74h" Mar 20 15:42:36 crc kubenswrapper[4779]: I0320 15:42:36.917356 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6e253fe-fec4-49b3-a9a6-22a7da20f3bd-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-dp74h\" (UID: \"b6e253fe-fec4-49b3-a9a6-22a7da20f3bd\") " pod="openstack/dnsmasq-dns-57d769cc4f-dp74h" Mar 20 15:42:36 crc kubenswrapper[4779]: I0320 15:42:36.920046 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6e253fe-fec4-49b3-a9a6-22a7da20f3bd-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-dp74h\" (UID: \"b6e253fe-fec4-49b3-a9a6-22a7da20f3bd\") " 
pod="openstack/dnsmasq-dns-57d769cc4f-dp74h" Mar 20 15:42:36 crc kubenswrapper[4779]: I0320 15:42:36.920320 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6e253fe-fec4-49b3-a9a6-22a7da20f3bd-config\") pod \"dnsmasq-dns-57d769cc4f-dp74h\" (UID: \"b6e253fe-fec4-49b3-a9a6-22a7da20f3bd\") " pod="openstack/dnsmasq-dns-57d769cc4f-dp74h" Mar 20 15:42:36 crc kubenswrapper[4779]: I0320 15:42:36.954464 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bczzl\" (UniqueName: \"kubernetes.io/projected/b6e253fe-fec4-49b3-a9a6-22a7da20f3bd-kube-api-access-bczzl\") pod \"dnsmasq-dns-57d769cc4f-dp74h\" (UID: \"b6e253fe-fec4-49b3-a9a6-22a7da20f3bd\") " pod="openstack/dnsmasq-dns-57d769cc4f-dp74h" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.138860 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-dp74h" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.378349 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-m68jj"] Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.481570 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dp74h"] Mar 20 15:42:37 crc kubenswrapper[4779]: W0320 15:42:37.495433 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6e253fe_fec4_49b3_a9a6_22a7da20f3bd.slice/crio-625734d28371900a5166885e249b9dde8ad9d23229e39c193792ad6bb4f1b3d3 WatchSource:0}: Error finding container 625734d28371900a5166885e249b9dde8ad9d23229e39c193792ad6bb4f1b3d3: Status 404 returned error can't find the container with id 625734d28371900a5166885e249b9dde8ad9d23229e39c193792ad6bb4f1b3d3 Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.636992 4779 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.638730 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.644712 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.645000 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.645676 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.647848 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.648671 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.648707 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.648890 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.649261 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-5wld4" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.755637 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3016debc-9603-417f-8ff1-6fd3934cd17e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 
15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.755718 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3016debc-9603-417f-8ff1-6fd3934cd17e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.755754 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.755785 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3016debc-9603-417f-8ff1-6fd3934cd17e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.755808 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3016debc-9603-417f-8ff1-6fd3934cd17e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.755844 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3016debc-9603-417f-8ff1-6fd3934cd17e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 
15:42:37.755885 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flbbj\" (UniqueName: \"kubernetes.io/projected/3016debc-9603-417f-8ff1-6fd3934cd17e-kube-api-access-flbbj\") pod \"rabbitmq-cell1-server-0\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.755912 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3016debc-9603-417f-8ff1-6fd3934cd17e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.756005 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3016debc-9603-417f-8ff1-6fd3934cd17e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.756069 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3016debc-9603-417f-8ff1-6fd3934cd17e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.756094 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3016debc-9603-417f-8ff1-6fd3934cd17e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:42:37 
crc kubenswrapper[4779]: I0320 15:42:37.857949 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flbbj\" (UniqueName: \"kubernetes.io/projected/3016debc-9603-417f-8ff1-6fd3934cd17e-kube-api-access-flbbj\") pod \"rabbitmq-cell1-server-0\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.858018 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3016debc-9603-417f-8ff1-6fd3934cd17e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.858250 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3016debc-9603-417f-8ff1-6fd3934cd17e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.858295 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3016debc-9603-417f-8ff1-6fd3934cd17e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.858317 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3016debc-9603-417f-8ff1-6fd3934cd17e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.858367 4779 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3016debc-9603-417f-8ff1-6fd3934cd17e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.858402 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3016debc-9603-417f-8ff1-6fd3934cd17e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.858434 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.858461 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3016debc-9603-417f-8ff1-6fd3934cd17e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.858481 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3016debc-9603-417f-8ff1-6fd3934cd17e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.858517 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/3016debc-9603-417f-8ff1-6fd3934cd17e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.860458 4779 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.861740 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3016debc-9603-417f-8ff1-6fd3934cd17e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.862039 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3016debc-9603-417f-8ff1-6fd3934cd17e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.862850 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3016debc-9603-417f-8ff1-6fd3934cd17e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.862873 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3016debc-9603-417f-8ff1-6fd3934cd17e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"3016debc-9603-417f-8ff1-6fd3934cd17e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.863198 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3016debc-9603-417f-8ff1-6fd3934cd17e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.869140 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3016debc-9603-417f-8ff1-6fd3934cd17e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.879543 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3016debc-9603-417f-8ff1-6fd3934cd17e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.890010 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3016debc-9603-417f-8ff1-6fd3934cd17e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.891014 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flbbj\" (UniqueName: \"kubernetes.io/projected/3016debc-9603-417f-8ff1-6fd3934cd17e-kube-api-access-flbbj\") pod \"rabbitmq-cell1-server-0\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: 
I0320 15:42:37.892330 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3016debc-9603-417f-8ff1-6fd3934cd17e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.908048 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-m68jj" event={"ID":"7c73163f-7930-48aa-991a-e7e8910c069a","Type":"ContainerStarted","Data":"356dd8fd11cb9951f200e2a036b91d86c99f146c266dd3590a143ab689c72e70"} Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.910169 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.910228 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-dp74h" event={"ID":"b6e253fe-fec4-49b3-a9a6-22a7da20f3bd","Type":"ContainerStarted","Data":"625734d28371900a5166885e249b9dde8ad9d23229e39c193792ad6bb4f1b3d3"} Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.919579 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.921532 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.925757 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-7p9wv" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.925792 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.925914 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.925971 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.926018 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.926187 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.926314 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.934576 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.961011 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f74af0a5-e3c3-4569-bece-db2e25e9b79d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " pod="openstack/rabbitmq-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.961055 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/f74af0a5-e3c3-4569-bece-db2e25e9b79d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " pod="openstack/rabbitmq-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.961243 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f74af0a5-e3c3-4569-bece-db2e25e9b79d-config-data\") pod \"rabbitmq-server-0\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " pod="openstack/rabbitmq-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.961321 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f74af0a5-e3c3-4569-bece-db2e25e9b79d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " pod="openstack/rabbitmq-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.961403 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f74af0a5-e3c3-4569-bece-db2e25e9b79d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " pod="openstack/rabbitmq-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.961455 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f74af0a5-e3c3-4569-bece-db2e25e9b79d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " pod="openstack/rabbitmq-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.961484 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/f74af0a5-e3c3-4569-bece-db2e25e9b79d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " pod="openstack/rabbitmq-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.961542 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " pod="openstack/rabbitmq-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.961637 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx7n5\" (UniqueName: \"kubernetes.io/projected/f74af0a5-e3c3-4569-bece-db2e25e9b79d-kube-api-access-nx7n5\") pod \"rabbitmq-server-0\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " pod="openstack/rabbitmq-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.961670 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f74af0a5-e3c3-4569-bece-db2e25e9b79d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " pod="openstack/rabbitmq-server-0" Mar 20 15:42:37 crc kubenswrapper[4779]: I0320 15:42:37.961744 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f74af0a5-e3c3-4569-bece-db2e25e9b79d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " pod="openstack/rabbitmq-server-0" Mar 20 15:42:38 crc kubenswrapper[4779]: I0320 15:42:38.001779 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:42:38 crc kubenswrapper[4779]: I0320 15:42:38.062944 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f74af0a5-e3c3-4569-bece-db2e25e9b79d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " pod="openstack/rabbitmq-server-0" Mar 20 15:42:38 crc kubenswrapper[4779]: I0320 15:42:38.062988 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f74af0a5-e3c3-4569-bece-db2e25e9b79d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " pod="openstack/rabbitmq-server-0" Mar 20 15:42:38 crc kubenswrapper[4779]: I0320 15:42:38.063021 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f74af0a5-e3c3-4569-bece-db2e25e9b79d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " pod="openstack/rabbitmq-server-0" Mar 20 15:42:38 crc kubenswrapper[4779]: I0320 15:42:38.063043 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " pod="openstack/rabbitmq-server-0" Mar 20 15:42:38 crc kubenswrapper[4779]: I0320 15:42:38.063075 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx7n5\" (UniqueName: \"kubernetes.io/projected/f74af0a5-e3c3-4569-bece-db2e25e9b79d-kube-api-access-nx7n5\") pod \"rabbitmq-server-0\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " pod="openstack/rabbitmq-server-0" Mar 20 15:42:38 crc kubenswrapper[4779]: I0320 15:42:38.063092 4779 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f74af0a5-e3c3-4569-bece-db2e25e9b79d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " pod="openstack/rabbitmq-server-0" Mar 20 15:42:38 crc kubenswrapper[4779]: I0320 15:42:38.063136 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f74af0a5-e3c3-4569-bece-db2e25e9b79d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " pod="openstack/rabbitmq-server-0" Mar 20 15:42:38 crc kubenswrapper[4779]: I0320 15:42:38.063169 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f74af0a5-e3c3-4569-bece-db2e25e9b79d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " pod="openstack/rabbitmq-server-0" Mar 20 15:42:38 crc kubenswrapper[4779]: I0320 15:42:38.063191 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f74af0a5-e3c3-4569-bece-db2e25e9b79d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " pod="openstack/rabbitmq-server-0" Mar 20 15:42:38 crc kubenswrapper[4779]: I0320 15:42:38.063217 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f74af0a5-e3c3-4569-bece-db2e25e9b79d-config-data\") pod \"rabbitmq-server-0\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " pod="openstack/rabbitmq-server-0" Mar 20 15:42:38 crc kubenswrapper[4779]: I0320 15:42:38.063236 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f74af0a5-e3c3-4569-bece-db2e25e9b79d-server-conf\") pod 
\"rabbitmq-server-0\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " pod="openstack/rabbitmq-server-0" Mar 20 15:42:38 crc kubenswrapper[4779]: I0320 15:42:38.064349 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f74af0a5-e3c3-4569-bece-db2e25e9b79d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " pod="openstack/rabbitmq-server-0" Mar 20 15:42:38 crc kubenswrapper[4779]: I0320 15:42:38.064509 4779 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Mar 20 15:42:38 crc kubenswrapper[4779]: I0320 15:42:38.064808 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f74af0a5-e3c3-4569-bece-db2e25e9b79d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " pod="openstack/rabbitmq-server-0" Mar 20 15:42:38 crc kubenswrapper[4779]: I0320 15:42:38.065264 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f74af0a5-e3c3-4569-bece-db2e25e9b79d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " pod="openstack/rabbitmq-server-0" Mar 20 15:42:38 crc kubenswrapper[4779]: I0320 15:42:38.065360 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f74af0a5-e3c3-4569-bece-db2e25e9b79d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " pod="openstack/rabbitmq-server-0" Mar 20 15:42:38 crc kubenswrapper[4779]: I0320 15:42:38.065615 
4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f74af0a5-e3c3-4569-bece-db2e25e9b79d-config-data\") pod \"rabbitmq-server-0\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " pod="openstack/rabbitmq-server-0" Mar 20 15:42:38 crc kubenswrapper[4779]: I0320 15:42:38.069401 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f74af0a5-e3c3-4569-bece-db2e25e9b79d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " pod="openstack/rabbitmq-server-0" Mar 20 15:42:38 crc kubenswrapper[4779]: I0320 15:42:38.069842 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f74af0a5-e3c3-4569-bece-db2e25e9b79d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " pod="openstack/rabbitmq-server-0" Mar 20 15:42:38 crc kubenswrapper[4779]: I0320 15:42:38.072573 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f74af0a5-e3c3-4569-bece-db2e25e9b79d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " pod="openstack/rabbitmq-server-0" Mar 20 15:42:38 crc kubenswrapper[4779]: I0320 15:42:38.081669 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx7n5\" (UniqueName: \"kubernetes.io/projected/f74af0a5-e3c3-4569-bece-db2e25e9b79d-kube-api-access-nx7n5\") pod \"rabbitmq-server-0\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " pod="openstack/rabbitmq-server-0" Mar 20 15:42:38 crc kubenswrapper[4779]: I0320 15:42:38.089961 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f74af0a5-e3c3-4569-bece-db2e25e9b79d-rabbitmq-tls\") pod 
\"rabbitmq-server-0\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " pod="openstack/rabbitmq-server-0" Mar 20 15:42:38 crc kubenswrapper[4779]: I0320 15:42:38.095954 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " pod="openstack/rabbitmq-server-0" Mar 20 15:42:38 crc kubenswrapper[4779]: I0320 15:42:38.294800 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 15:42:39 crc kubenswrapper[4779]: I0320 15:42:39.002954 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 20 15:42:39 crc kubenswrapper[4779]: I0320 15:42:39.004112 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 15:42:39 crc kubenswrapper[4779]: I0320 15:42:39.006409 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 20 15:42:39 crc kubenswrapper[4779]: I0320 15:42:39.006463 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 20 15:42:39 crc kubenswrapper[4779]: I0320 15:42:39.006662 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-4z5fp" Mar 20 15:42:39 crc kubenswrapper[4779]: I0320 15:42:39.006753 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 20 15:42:39 crc kubenswrapper[4779]: I0320 15:42:39.013092 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 20 15:42:39 crc kubenswrapper[4779]: I0320 15:42:39.019990 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 15:42:39 crc kubenswrapper[4779]: I0320 15:42:39.090144 
4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b8436ea-e4a7-4a6b-a6a2-d9282bda9696-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"9b8436ea-e4a7-4a6b-a6a2-d9282bda9696\") " pod="openstack/openstack-galera-0" Mar 20 15:42:39 crc kubenswrapper[4779]: I0320 15:42:39.090187 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b8436ea-e4a7-4a6b-a6a2-d9282bda9696-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9b8436ea-e4a7-4a6b-a6a2-d9282bda9696\") " pod="openstack/openstack-galera-0" Mar 20 15:42:39 crc kubenswrapper[4779]: I0320 15:42:39.090207 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lhzq\" (UniqueName: \"kubernetes.io/projected/9b8436ea-e4a7-4a6b-a6a2-d9282bda9696-kube-api-access-9lhzq\") pod \"openstack-galera-0\" (UID: \"9b8436ea-e4a7-4a6b-a6a2-d9282bda9696\") " pod="openstack/openstack-galera-0" Mar 20 15:42:39 crc kubenswrapper[4779]: I0320 15:42:39.090230 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9b8436ea-e4a7-4a6b-a6a2-d9282bda9696-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9b8436ea-e4a7-4a6b-a6a2-d9282bda9696\") " pod="openstack/openstack-galera-0" Mar 20 15:42:39 crc kubenswrapper[4779]: I0320 15:42:39.090289 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9b8436ea-e4a7-4a6b-a6a2-d9282bda9696-config-data-default\") pod \"openstack-galera-0\" (UID: \"9b8436ea-e4a7-4a6b-a6a2-d9282bda9696\") " pod="openstack/openstack-galera-0" Mar 20 15:42:39 crc kubenswrapper[4779]: I0320 15:42:39.090312 4779 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"9b8436ea-e4a7-4a6b-a6a2-d9282bda9696\") " pod="openstack/openstack-galera-0" Mar 20 15:42:39 crc kubenswrapper[4779]: I0320 15:42:39.090332 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9b8436ea-e4a7-4a6b-a6a2-d9282bda9696-kolla-config\") pod \"openstack-galera-0\" (UID: \"9b8436ea-e4a7-4a6b-a6a2-d9282bda9696\") " pod="openstack/openstack-galera-0" Mar 20 15:42:39 crc kubenswrapper[4779]: I0320 15:42:39.090370 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b8436ea-e4a7-4a6b-a6a2-d9282bda9696-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"9b8436ea-e4a7-4a6b-a6a2-d9282bda9696\") " pod="openstack/openstack-galera-0" Mar 20 15:42:39 crc kubenswrapper[4779]: I0320 15:42:39.192163 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9b8436ea-e4a7-4a6b-a6a2-d9282bda9696-config-data-default\") pod \"openstack-galera-0\" (UID: \"9b8436ea-e4a7-4a6b-a6a2-d9282bda9696\") " pod="openstack/openstack-galera-0" Mar 20 15:42:39 crc kubenswrapper[4779]: I0320 15:42:39.192228 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"9b8436ea-e4a7-4a6b-a6a2-d9282bda9696\") " pod="openstack/openstack-galera-0" Mar 20 15:42:39 crc kubenswrapper[4779]: I0320 15:42:39.192258 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/9b8436ea-e4a7-4a6b-a6a2-d9282bda9696-kolla-config\") pod \"openstack-galera-0\" (UID: \"9b8436ea-e4a7-4a6b-a6a2-d9282bda9696\") " pod="openstack/openstack-galera-0" Mar 20 15:42:39 crc kubenswrapper[4779]: I0320 15:42:39.192312 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b8436ea-e4a7-4a6b-a6a2-d9282bda9696-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"9b8436ea-e4a7-4a6b-a6a2-d9282bda9696\") " pod="openstack/openstack-galera-0" Mar 20 15:42:39 crc kubenswrapper[4779]: I0320 15:42:39.192348 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b8436ea-e4a7-4a6b-a6a2-d9282bda9696-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"9b8436ea-e4a7-4a6b-a6a2-d9282bda9696\") " pod="openstack/openstack-galera-0" Mar 20 15:42:39 crc kubenswrapper[4779]: I0320 15:42:39.192374 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b8436ea-e4a7-4a6b-a6a2-d9282bda9696-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9b8436ea-e4a7-4a6b-a6a2-d9282bda9696\") " pod="openstack/openstack-galera-0" Mar 20 15:42:39 crc kubenswrapper[4779]: I0320 15:42:39.192392 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lhzq\" (UniqueName: \"kubernetes.io/projected/9b8436ea-e4a7-4a6b-a6a2-d9282bda9696-kube-api-access-9lhzq\") pod \"openstack-galera-0\" (UID: \"9b8436ea-e4a7-4a6b-a6a2-d9282bda9696\") " pod="openstack/openstack-galera-0" Mar 20 15:42:39 crc kubenswrapper[4779]: I0320 15:42:39.192415 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9b8436ea-e4a7-4a6b-a6a2-d9282bda9696-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"9b8436ea-e4a7-4a6b-a6a2-d9282bda9696\") " pod="openstack/openstack-galera-0" Mar 20 15:42:39 crc kubenswrapper[4779]: I0320 15:42:39.193020 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9b8436ea-e4a7-4a6b-a6a2-d9282bda9696-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9b8436ea-e4a7-4a6b-a6a2-d9282bda9696\") " pod="openstack/openstack-galera-0" Mar 20 15:42:39 crc kubenswrapper[4779]: I0320 15:42:39.193977 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9b8436ea-e4a7-4a6b-a6a2-d9282bda9696-config-data-default\") pod \"openstack-galera-0\" (UID: \"9b8436ea-e4a7-4a6b-a6a2-d9282bda9696\") " pod="openstack/openstack-galera-0" Mar 20 15:42:39 crc kubenswrapper[4779]: I0320 15:42:39.194084 4779 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"9b8436ea-e4a7-4a6b-a6a2-d9282bda9696\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-galera-0" Mar 20 15:42:39 crc kubenswrapper[4779]: I0320 15:42:39.194977 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9b8436ea-e4a7-4a6b-a6a2-d9282bda9696-kolla-config\") pod \"openstack-galera-0\" (UID: \"9b8436ea-e4a7-4a6b-a6a2-d9282bda9696\") " pod="openstack/openstack-galera-0" Mar 20 15:42:39 crc kubenswrapper[4779]: I0320 15:42:39.195852 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b8436ea-e4a7-4a6b-a6a2-d9282bda9696-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9b8436ea-e4a7-4a6b-a6a2-d9282bda9696\") " pod="openstack/openstack-galera-0" Mar 20 15:42:39 crc kubenswrapper[4779]: I0320 15:42:39.198085 4779 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b8436ea-e4a7-4a6b-a6a2-d9282bda9696-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"9b8436ea-e4a7-4a6b-a6a2-d9282bda9696\") " pod="openstack/openstack-galera-0" Mar 20 15:42:39 crc kubenswrapper[4779]: I0320 15:42:39.207531 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b8436ea-e4a7-4a6b-a6a2-d9282bda9696-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"9b8436ea-e4a7-4a6b-a6a2-d9282bda9696\") " pod="openstack/openstack-galera-0" Mar 20 15:42:39 crc kubenswrapper[4779]: I0320 15:42:39.215953 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"9b8436ea-e4a7-4a6b-a6a2-d9282bda9696\") " pod="openstack/openstack-galera-0" Mar 20 15:42:39 crc kubenswrapper[4779]: I0320 15:42:39.216894 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lhzq\" (UniqueName: \"kubernetes.io/projected/9b8436ea-e4a7-4a6b-a6a2-d9282bda9696-kube-api-access-9lhzq\") pod \"openstack-galera-0\" (UID: \"9b8436ea-e4a7-4a6b-a6a2-d9282bda9696\") " pod="openstack/openstack-galera-0" Mar 20 15:42:39 crc kubenswrapper[4779]: I0320 15:42:39.337632 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.367743 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.374290 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.377300 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.377300 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.378484 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-wltr7" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.378793 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.382170 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.416554 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0bb35660-3a34-4c16-a943-4375cfe12246-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"0bb35660-3a34-4c16-a943-4375cfe12246\") " pod="openstack/openstack-cell1-galera-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.416618 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb35660-3a34-4c16-a943-4375cfe12246-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"0bb35660-3a34-4c16-a943-4375cfe12246\") " pod="openstack/openstack-cell1-galera-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.416640 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0bb35660-3a34-4c16-a943-4375cfe12246-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"0bb35660-3a34-4c16-a943-4375cfe12246\") " pod="openstack/openstack-cell1-galera-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.416719 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0bb35660-3a34-4c16-a943-4375cfe12246-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"0bb35660-3a34-4c16-a943-4375cfe12246\") " pod="openstack/openstack-cell1-galera-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.416747 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxq9d\" (UniqueName: \"kubernetes.io/projected/0bb35660-3a34-4c16-a943-4375cfe12246-kube-api-access-fxq9d\") pod \"openstack-cell1-galera-0\" (UID: \"0bb35660-3a34-4c16-a943-4375cfe12246\") " pod="openstack/openstack-cell1-galera-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.416825 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bb35660-3a34-4c16-a943-4375cfe12246-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"0bb35660-3a34-4c16-a943-4375cfe12246\") " pod="openstack/openstack-cell1-galera-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.416858 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"0bb35660-3a34-4c16-a943-4375cfe12246\") " pod="openstack/openstack-cell1-galera-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.416888 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" 
(UniqueName: \"kubernetes.io/configmap/0bb35660-3a34-4c16-a943-4375cfe12246-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"0bb35660-3a34-4c16-a943-4375cfe12246\") " pod="openstack/openstack-cell1-galera-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.518990 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0bb35660-3a34-4c16-a943-4375cfe12246-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"0bb35660-3a34-4c16-a943-4375cfe12246\") " pod="openstack/openstack-cell1-galera-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.519040 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb35660-3a34-4c16-a943-4375cfe12246-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"0bb35660-3a34-4c16-a943-4375cfe12246\") " pod="openstack/openstack-cell1-galera-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.519062 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bb35660-3a34-4c16-a943-4375cfe12246-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"0bb35660-3a34-4c16-a943-4375cfe12246\") " pod="openstack/openstack-cell1-galera-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.519080 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0bb35660-3a34-4c16-a943-4375cfe12246-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"0bb35660-3a34-4c16-a943-4375cfe12246\") " pod="openstack/openstack-cell1-galera-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.519098 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxq9d\" (UniqueName: 
\"kubernetes.io/projected/0bb35660-3a34-4c16-a943-4375cfe12246-kube-api-access-fxq9d\") pod \"openstack-cell1-galera-0\" (UID: \"0bb35660-3a34-4c16-a943-4375cfe12246\") " pod="openstack/openstack-cell1-galera-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.519153 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bb35660-3a34-4c16-a943-4375cfe12246-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"0bb35660-3a34-4c16-a943-4375cfe12246\") " pod="openstack/openstack-cell1-galera-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.519178 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"0bb35660-3a34-4c16-a943-4375cfe12246\") " pod="openstack/openstack-cell1-galera-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.519201 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0bb35660-3a34-4c16-a943-4375cfe12246-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"0bb35660-3a34-4c16-a943-4375cfe12246\") " pod="openstack/openstack-cell1-galera-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.519519 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0bb35660-3a34-4c16-a943-4375cfe12246-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"0bb35660-3a34-4c16-a943-4375cfe12246\") " pod="openstack/openstack-cell1-galera-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.519786 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0bb35660-3a34-4c16-a943-4375cfe12246-kolla-config\") pod \"openstack-cell1-galera-0\" 
(UID: \"0bb35660-3a34-4c16-a943-4375cfe12246\") " pod="openstack/openstack-cell1-galera-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.520272 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0bb35660-3a34-4c16-a943-4375cfe12246-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"0bb35660-3a34-4c16-a943-4375cfe12246\") " pod="openstack/openstack-cell1-galera-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.520523 4779 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"0bb35660-3a34-4c16-a943-4375cfe12246\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-cell1-galera-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.521385 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bb35660-3a34-4c16-a943-4375cfe12246-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"0bb35660-3a34-4c16-a943-4375cfe12246\") " pod="openstack/openstack-cell1-galera-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.526945 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb35660-3a34-4c16-a943-4375cfe12246-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"0bb35660-3a34-4c16-a943-4375cfe12246\") " pod="openstack/openstack-cell1-galera-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.542524 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bb35660-3a34-4c16-a943-4375cfe12246-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"0bb35660-3a34-4c16-a943-4375cfe12246\") " pod="openstack/openstack-cell1-galera-0" 
Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.543075 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxq9d\" (UniqueName: \"kubernetes.io/projected/0bb35660-3a34-4c16-a943-4375cfe12246-kube-api-access-fxq9d\") pod \"openstack-cell1-galera-0\" (UID: \"0bb35660-3a34-4c16-a943-4375cfe12246\") " pod="openstack/openstack-cell1-galera-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.547304 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"0bb35660-3a34-4c16-a943-4375cfe12246\") " pod="openstack/openstack-cell1-galera-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.694979 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.783864 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.784774 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.787670 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.787966 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-9bccr" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.796693 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.797310 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.824070 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/957258f0-7df6-4c8e-9474-8693ab779860-kolla-config\") pod \"memcached-0\" (UID: \"957258f0-7df6-4c8e-9474-8693ab779860\") " pod="openstack/memcached-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.824148 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/957258f0-7df6-4c8e-9474-8693ab779860-config-data\") pod \"memcached-0\" (UID: \"957258f0-7df6-4c8e-9474-8693ab779860\") " pod="openstack/memcached-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.824227 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqsxb\" (UniqueName: \"kubernetes.io/projected/957258f0-7df6-4c8e-9474-8693ab779860-kube-api-access-fqsxb\") pod \"memcached-0\" (UID: \"957258f0-7df6-4c8e-9474-8693ab779860\") " pod="openstack/memcached-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.824297 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/957258f0-7df6-4c8e-9474-8693ab779860-combined-ca-bundle\") pod \"memcached-0\" (UID: \"957258f0-7df6-4c8e-9474-8693ab779860\") " pod="openstack/memcached-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.824345 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/957258f0-7df6-4c8e-9474-8693ab779860-memcached-tls-certs\") pod \"memcached-0\" (UID: \"957258f0-7df6-4c8e-9474-8693ab779860\") " pod="openstack/memcached-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.925597 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/957258f0-7df6-4c8e-9474-8693ab779860-combined-ca-bundle\") pod \"memcached-0\" (UID: \"957258f0-7df6-4c8e-9474-8693ab779860\") " pod="openstack/memcached-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.925681 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/957258f0-7df6-4c8e-9474-8693ab779860-memcached-tls-certs\") pod \"memcached-0\" (UID: \"957258f0-7df6-4c8e-9474-8693ab779860\") " pod="openstack/memcached-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.925712 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/957258f0-7df6-4c8e-9474-8693ab779860-kolla-config\") pod \"memcached-0\" (UID: \"957258f0-7df6-4c8e-9474-8693ab779860\") " pod="openstack/memcached-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.925750 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/957258f0-7df6-4c8e-9474-8693ab779860-config-data\") pod \"memcached-0\" (UID: \"957258f0-7df6-4c8e-9474-8693ab779860\") " 
pod="openstack/memcached-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.925844 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqsxb\" (UniqueName: \"kubernetes.io/projected/957258f0-7df6-4c8e-9474-8693ab779860-kube-api-access-fqsxb\") pod \"memcached-0\" (UID: \"957258f0-7df6-4c8e-9474-8693ab779860\") " pod="openstack/memcached-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.927067 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/957258f0-7df6-4c8e-9474-8693ab779860-kolla-config\") pod \"memcached-0\" (UID: \"957258f0-7df6-4c8e-9474-8693ab779860\") " pod="openstack/memcached-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.927216 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/957258f0-7df6-4c8e-9474-8693ab779860-config-data\") pod \"memcached-0\" (UID: \"957258f0-7df6-4c8e-9474-8693ab779860\") " pod="openstack/memcached-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.930907 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/957258f0-7df6-4c8e-9474-8693ab779860-combined-ca-bundle\") pod \"memcached-0\" (UID: \"957258f0-7df6-4c8e-9474-8693ab779860\") " pod="openstack/memcached-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.931047 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/957258f0-7df6-4c8e-9474-8693ab779860-memcached-tls-certs\") pod \"memcached-0\" (UID: \"957258f0-7df6-4c8e-9474-8693ab779860\") " pod="openstack/memcached-0" Mar 20 15:42:40 crc kubenswrapper[4779]: I0320 15:42:40.952660 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqsxb\" (UniqueName: 
\"kubernetes.io/projected/957258f0-7df6-4c8e-9474-8693ab779860-kube-api-access-fqsxb\") pod \"memcached-0\" (UID: \"957258f0-7df6-4c8e-9474-8693ab779860\") " pod="openstack/memcached-0" Mar 20 15:42:41 crc kubenswrapper[4779]: I0320 15:42:41.099718 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 20 15:42:42 crc kubenswrapper[4779]: I0320 15:42:42.951833 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 15:42:42 crc kubenswrapper[4779]: I0320 15:42:42.952935 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 15:42:42 crc kubenswrapper[4779]: I0320 15:42:42.958337 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-2cv5v" Mar 20 15:42:43 crc kubenswrapper[4779]: I0320 15:42:43.009472 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 15:42:43 crc kubenswrapper[4779]: I0320 15:42:43.083717 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdzn2\" (UniqueName: \"kubernetes.io/projected/49a61cbd-2e62-4e6a-8201-e4d0761885a7-kube-api-access-rdzn2\") pod \"kube-state-metrics-0\" (UID: \"49a61cbd-2e62-4e6a-8201-e4d0761885a7\") " pod="openstack/kube-state-metrics-0" Mar 20 15:42:43 crc kubenswrapper[4779]: I0320 15:42:43.184660 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdzn2\" (UniqueName: \"kubernetes.io/projected/49a61cbd-2e62-4e6a-8201-e4d0761885a7-kube-api-access-rdzn2\") pod \"kube-state-metrics-0\" (UID: \"49a61cbd-2e62-4e6a-8201-e4d0761885a7\") " pod="openstack/kube-state-metrics-0" Mar 20 15:42:43 crc kubenswrapper[4779]: I0320 15:42:43.209516 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdzn2\" (UniqueName: 
\"kubernetes.io/projected/49a61cbd-2e62-4e6a-8201-e4d0761885a7-kube-api-access-rdzn2\") pod \"kube-state-metrics-0\" (UID: \"49a61cbd-2e62-4e6a-8201-e4d0761885a7\") " pod="openstack/kube-state-metrics-0" Mar 20 15:42:43 crc kubenswrapper[4779]: I0320 15:42:43.272214 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.383602 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.389823 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.396237 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.396280 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.396236 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.396546 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-cfgwn" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.396751 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.396804 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.396952 4779 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.402562 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.404444 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.513250 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e93aeb14-abaa-4cff-81c0-82d579020dc6-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.513321 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7954f494-7bea-46f8-a58f-4de62431c2b8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7954f494-7bea-46f8-a58f-4de62431c2b8\") pod \"prometheus-metric-storage-0\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.513374 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e93aeb14-abaa-4cff-81c0-82d579020dc6-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.513410 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: 
\"kubernetes.io/configmap/e93aeb14-abaa-4cff-81c0-82d579020dc6-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.513476 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e93aeb14-abaa-4cff-81c0-82d579020dc6-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.513510 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e93aeb14-abaa-4cff-81c0-82d579020dc6-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.513531 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e93aeb14-abaa-4cff-81c0-82d579020dc6-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.513554 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e93aeb14-abaa-4cff-81c0-82d579020dc6-config\") pod \"prometheus-metric-storage-0\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.516505 4779 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e93aeb14-abaa-4cff-81c0-82d579020dc6-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.516564 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cg4r\" (UniqueName: \"kubernetes.io/projected/e93aeb14-abaa-4cff-81c0-82d579020dc6-kube-api-access-2cg4r\") pod \"prometheus-metric-storage-0\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.618251 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e93aeb14-abaa-4cff-81c0-82d579020dc6-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.618309 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7954f494-7bea-46f8-a58f-4de62431c2b8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7954f494-7bea-46f8-a58f-4de62431c2b8\") pod \"prometheus-metric-storage-0\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.618359 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e93aeb14-abaa-4cff-81c0-82d579020dc6-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " pod="openstack/prometheus-metric-storage-0" Mar 20 
15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.618396 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e93aeb14-abaa-4cff-81c0-82d579020dc6-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.618464 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e93aeb14-abaa-4cff-81c0-82d579020dc6-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.618498 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e93aeb14-abaa-4cff-81c0-82d579020dc6-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.618526 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e93aeb14-abaa-4cff-81c0-82d579020dc6-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.618553 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e93aeb14-abaa-4cff-81c0-82d579020dc6-config\") pod \"prometheus-metric-storage-0\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " 
pod="openstack/prometheus-metric-storage-0" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.618576 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e93aeb14-abaa-4cff-81c0-82d579020dc6-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.618604 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cg4r\" (UniqueName: \"kubernetes.io/projected/e93aeb14-abaa-4cff-81c0-82d579020dc6-kube-api-access-2cg4r\") pod \"prometheus-metric-storage-0\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.619178 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e93aeb14-abaa-4cff-81c0-82d579020dc6-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.619189 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e93aeb14-abaa-4cff-81c0-82d579020dc6-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.619689 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e93aeb14-abaa-4cff-81c0-82d579020dc6-prometheus-metric-storage-rulefiles-2\") pod 
\"prometheus-metric-storage-0\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.621350 4779 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.621384 4779 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7954f494-7bea-46f8-a58f-4de62431c2b8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7954f494-7bea-46f8-a58f-4de62431c2b8\") pod \"prometheus-metric-storage-0\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/63343ced4decb75fd431545090989bf5e441d5fa06a6828c5f39e243f0a750bf/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.623595 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e93aeb14-abaa-4cff-81c0-82d579020dc6-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.623971 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e93aeb14-abaa-4cff-81c0-82d579020dc6-config\") pod \"prometheus-metric-storage-0\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.624258 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e93aeb14-abaa-4cff-81c0-82d579020dc6-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: 
\"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.627159 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e93aeb14-abaa-4cff-81c0-82d579020dc6-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.644029 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e93aeb14-abaa-4cff-81c0-82d579020dc6-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.647255 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cg4r\" (UniqueName: \"kubernetes.io/projected/e93aeb14-abaa-4cff-81c0-82d579020dc6-kube-api-access-2cg4r\") pod \"prometheus-metric-storage-0\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.664600 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7954f494-7bea-46f8-a58f-4de62431c2b8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7954f494-7bea-46f8-a58f-4de62431c2b8\") pod \"prometheus-metric-storage-0\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:42:44 crc kubenswrapper[4779]: I0320 15:42:44.716954 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 20 15:42:45 crc kubenswrapper[4779]: I0320 15:42:45.804053 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-s9wt4"] Mar 20 15:42:45 crc kubenswrapper[4779]: I0320 15:42:45.805399 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s9wt4" Mar 20 15:42:45 crc kubenswrapper[4779]: I0320 15:42:45.820619 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-b5sl4" Mar 20 15:42:45 crc kubenswrapper[4779]: I0320 15:42:45.820958 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 20 15:42:45 crc kubenswrapper[4779]: I0320 15:42:45.845742 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 20 15:42:45 crc kubenswrapper[4779]: I0320 15:42:45.859644 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-bvnhn"] Mar 20 15:42:45 crc kubenswrapper[4779]: I0320 15:42:45.862164 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-bvnhn" Mar 20 15:42:45 crc kubenswrapper[4779]: I0320 15:42:45.901430 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-bvnhn"] Mar 20 15:42:45 crc kubenswrapper[4779]: I0320 15:42:45.961677 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7zgc\" (UniqueName: \"kubernetes.io/projected/2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3-kube-api-access-m7zgc\") pod \"ovn-controller-ovs-bvnhn\" (UID: \"2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3\") " pod="openstack/ovn-controller-ovs-bvnhn" Mar 20 15:42:45 crc kubenswrapper[4779]: I0320 15:42:45.961758 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/73a3debf-8de1-4321-b383-5bebca909a38-var-run\") pod \"ovn-controller-s9wt4\" (UID: \"73a3debf-8de1-4321-b383-5bebca909a38\") " pod="openstack/ovn-controller-s9wt4" Mar 20 15:42:45 crc kubenswrapper[4779]: I0320 15:42:45.961802 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3-etc-ovs\") pod \"ovn-controller-ovs-bvnhn\" (UID: \"2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3\") " pod="openstack/ovn-controller-ovs-bvnhn" Mar 20 15:42:45 crc kubenswrapper[4779]: I0320 15:42:45.961827 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/73a3debf-8de1-4321-b383-5bebca909a38-var-run-ovn\") pod \"ovn-controller-s9wt4\" (UID: \"73a3debf-8de1-4321-b383-5bebca909a38\") " pod="openstack/ovn-controller-s9wt4" Mar 20 15:42:45 crc kubenswrapper[4779]: I0320 15:42:45.961856 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3-var-run\") pod \"ovn-controller-ovs-bvnhn\" (UID: \"2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3\") " pod="openstack/ovn-controller-ovs-bvnhn" Mar 20 15:42:45 crc kubenswrapper[4779]: I0320 15:42:45.961879 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3-var-lib\") pod \"ovn-controller-ovs-bvnhn\" (UID: \"2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3\") " pod="openstack/ovn-controller-ovs-bvnhn" Mar 20 15:42:45 crc kubenswrapper[4779]: I0320 15:42:45.961895 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3-scripts\") pod \"ovn-controller-ovs-bvnhn\" (UID: \"2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3\") " pod="openstack/ovn-controller-ovs-bvnhn" Mar 20 15:42:45 crc kubenswrapper[4779]: I0320 15:42:45.961917 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73a3debf-8de1-4321-b383-5bebca909a38-combined-ca-bundle\") pod \"ovn-controller-s9wt4\" (UID: \"73a3debf-8de1-4321-b383-5bebca909a38\") " pod="openstack/ovn-controller-s9wt4" Mar 20 15:42:45 crc kubenswrapper[4779]: I0320 15:42:45.962041 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3-var-log\") pod \"ovn-controller-ovs-bvnhn\" (UID: \"2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3\") " pod="openstack/ovn-controller-ovs-bvnhn" Mar 20 15:42:45 crc kubenswrapper[4779]: I0320 15:42:45.962071 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/73a3debf-8de1-4321-b383-5bebca909a38-scripts\") pod \"ovn-controller-s9wt4\" (UID: \"73a3debf-8de1-4321-b383-5bebca909a38\") " pod="openstack/ovn-controller-s9wt4" Mar 20 15:42:45 crc kubenswrapper[4779]: I0320 15:42:45.962093 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/73a3debf-8de1-4321-b383-5bebca909a38-var-log-ovn\") pod \"ovn-controller-s9wt4\" (UID: \"73a3debf-8de1-4321-b383-5bebca909a38\") " pod="openstack/ovn-controller-s9wt4" Mar 20 15:42:45 crc kubenswrapper[4779]: I0320 15:42:45.962129 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/73a3debf-8de1-4321-b383-5bebca909a38-ovn-controller-tls-certs\") pod \"ovn-controller-s9wt4\" (UID: \"73a3debf-8de1-4321-b383-5bebca909a38\") " pod="openstack/ovn-controller-s9wt4" Mar 20 15:42:45 crc kubenswrapper[4779]: I0320 15:42:45.962170 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ths59\" (UniqueName: \"kubernetes.io/projected/73a3debf-8de1-4321-b383-5bebca909a38-kube-api-access-ths59\") pod \"ovn-controller-s9wt4\" (UID: \"73a3debf-8de1-4321-b383-5bebca909a38\") " pod="openstack/ovn-controller-s9wt4" Mar 20 15:42:45 crc kubenswrapper[4779]: I0320 15:42:45.982209 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s9wt4"] Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.063249 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ths59\" (UniqueName: \"kubernetes.io/projected/73a3debf-8de1-4321-b383-5bebca909a38-kube-api-access-ths59\") pod \"ovn-controller-s9wt4\" (UID: \"73a3debf-8de1-4321-b383-5bebca909a38\") " pod="openstack/ovn-controller-s9wt4" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.063323 
4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7zgc\" (UniqueName: \"kubernetes.io/projected/2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3-kube-api-access-m7zgc\") pod \"ovn-controller-ovs-bvnhn\" (UID: \"2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3\") " pod="openstack/ovn-controller-ovs-bvnhn" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.063366 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/73a3debf-8de1-4321-b383-5bebca909a38-var-run\") pod \"ovn-controller-s9wt4\" (UID: \"73a3debf-8de1-4321-b383-5bebca909a38\") " pod="openstack/ovn-controller-s9wt4" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.063395 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3-etc-ovs\") pod \"ovn-controller-ovs-bvnhn\" (UID: \"2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3\") " pod="openstack/ovn-controller-ovs-bvnhn" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.063424 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/73a3debf-8de1-4321-b383-5bebca909a38-var-run-ovn\") pod \"ovn-controller-s9wt4\" (UID: \"73a3debf-8de1-4321-b383-5bebca909a38\") " pod="openstack/ovn-controller-s9wt4" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.063455 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3-var-run\") pod \"ovn-controller-ovs-bvnhn\" (UID: \"2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3\") " pod="openstack/ovn-controller-ovs-bvnhn" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.063481 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3-var-lib\") pod \"ovn-controller-ovs-bvnhn\" (UID: \"2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3\") " pod="openstack/ovn-controller-ovs-bvnhn" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.063500 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3-scripts\") pod \"ovn-controller-ovs-bvnhn\" (UID: \"2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3\") " pod="openstack/ovn-controller-ovs-bvnhn" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.063532 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73a3debf-8de1-4321-b383-5bebca909a38-combined-ca-bundle\") pod \"ovn-controller-s9wt4\" (UID: \"73a3debf-8de1-4321-b383-5bebca909a38\") " pod="openstack/ovn-controller-s9wt4" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.063630 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3-var-log\") pod \"ovn-controller-ovs-bvnhn\" (UID: \"2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3\") " pod="openstack/ovn-controller-ovs-bvnhn" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.063654 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73a3debf-8de1-4321-b383-5bebca909a38-scripts\") pod \"ovn-controller-s9wt4\" (UID: \"73a3debf-8de1-4321-b383-5bebca909a38\") " pod="openstack/ovn-controller-s9wt4" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.063679 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/73a3debf-8de1-4321-b383-5bebca909a38-var-log-ovn\") pod \"ovn-controller-s9wt4\" (UID: \"73a3debf-8de1-4321-b383-5bebca909a38\") " 
pod="openstack/ovn-controller-s9wt4" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.063697 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/73a3debf-8de1-4321-b383-5bebca909a38-ovn-controller-tls-certs\") pod \"ovn-controller-s9wt4\" (UID: \"73a3debf-8de1-4321-b383-5bebca909a38\") " pod="openstack/ovn-controller-s9wt4" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.065015 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3-var-run\") pod \"ovn-controller-ovs-bvnhn\" (UID: \"2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3\") " pod="openstack/ovn-controller-ovs-bvnhn" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.066019 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/73a3debf-8de1-4321-b383-5bebca909a38-var-run\") pod \"ovn-controller-s9wt4\" (UID: \"73a3debf-8de1-4321-b383-5bebca909a38\") " pod="openstack/ovn-controller-s9wt4" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.066270 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3-etc-ovs\") pod \"ovn-controller-ovs-bvnhn\" (UID: \"2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3\") " pod="openstack/ovn-controller-ovs-bvnhn" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.066318 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3-var-log\") pod \"ovn-controller-ovs-bvnhn\" (UID: \"2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3\") " pod="openstack/ovn-controller-ovs-bvnhn" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.066529 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/73a3debf-8de1-4321-b383-5bebca909a38-var-run-ovn\") pod \"ovn-controller-s9wt4\" (UID: \"73a3debf-8de1-4321-b383-5bebca909a38\") " pod="openstack/ovn-controller-s9wt4" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.066653 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/73a3debf-8de1-4321-b383-5bebca909a38-var-log-ovn\") pod \"ovn-controller-s9wt4\" (UID: \"73a3debf-8de1-4321-b383-5bebca909a38\") " pod="openstack/ovn-controller-s9wt4" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.072745 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/73a3debf-8de1-4321-b383-5bebca909a38-ovn-controller-tls-certs\") pod \"ovn-controller-s9wt4\" (UID: \"73a3debf-8de1-4321-b383-5bebca909a38\") " pod="openstack/ovn-controller-s9wt4" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.074274 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3-scripts\") pod \"ovn-controller-ovs-bvnhn\" (UID: \"2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3\") " pod="openstack/ovn-controller-ovs-bvnhn" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.077366 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3-var-lib\") pod \"ovn-controller-ovs-bvnhn\" (UID: \"2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3\") " pod="openstack/ovn-controller-ovs-bvnhn" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.078213 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73a3debf-8de1-4321-b383-5bebca909a38-scripts\") pod \"ovn-controller-s9wt4\" (UID: \"73a3debf-8de1-4321-b383-5bebca909a38\") " 
pod="openstack/ovn-controller-s9wt4" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.085526 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7zgc\" (UniqueName: \"kubernetes.io/projected/2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3-kube-api-access-m7zgc\") pod \"ovn-controller-ovs-bvnhn\" (UID: \"2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3\") " pod="openstack/ovn-controller-ovs-bvnhn" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.085680 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73a3debf-8de1-4321-b383-5bebca909a38-combined-ca-bundle\") pod \"ovn-controller-s9wt4\" (UID: \"73a3debf-8de1-4321-b383-5bebca909a38\") " pod="openstack/ovn-controller-s9wt4" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.094766 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ths59\" (UniqueName: \"kubernetes.io/projected/73a3debf-8de1-4321-b383-5bebca909a38-kube-api-access-ths59\") pod \"ovn-controller-s9wt4\" (UID: \"73a3debf-8de1-4321-b383-5bebca909a38\") " pod="openstack/ovn-controller-s9wt4" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.149933 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s9wt4" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.231238 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-bvnhn" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.240550 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.242142 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.245634 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-bx8pj" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.245850 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.245911 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.245997 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.246149 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.261395 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.368706 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/09f74271-1a65-4b70-a927-a4dd7de65360-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"09f74271-1a65-4b70-a927-a4dd7de65360\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.368784 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09f74271-1a65-4b70-a927-a4dd7de65360-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"09f74271-1a65-4b70-a927-a4dd7de65360\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.368817 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"09f74271-1a65-4b70-a927-a4dd7de65360\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.368877 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09f74271-1a65-4b70-a927-a4dd7de65360-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"09f74271-1a65-4b70-a927-a4dd7de65360\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.368978 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/09f74271-1a65-4b70-a927-a4dd7de65360-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"09f74271-1a65-4b70-a927-a4dd7de65360\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.369015 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9xnk\" (UniqueName: \"kubernetes.io/projected/09f74271-1a65-4b70-a927-a4dd7de65360-kube-api-access-d9xnk\") pod \"ovsdbserver-nb-0\" (UID: \"09f74271-1a65-4b70-a927-a4dd7de65360\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.369133 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09f74271-1a65-4b70-a927-a4dd7de65360-config\") pod \"ovsdbserver-nb-0\" (UID: \"09f74271-1a65-4b70-a927-a4dd7de65360\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.369228 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/09f74271-1a65-4b70-a927-a4dd7de65360-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"09f74271-1a65-4b70-a927-a4dd7de65360\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.471241 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/09f74271-1a65-4b70-a927-a4dd7de65360-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"09f74271-1a65-4b70-a927-a4dd7de65360\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.471314 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09f74271-1a65-4b70-a927-a4dd7de65360-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"09f74271-1a65-4b70-a927-a4dd7de65360\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.471347 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"09f74271-1a65-4b70-a927-a4dd7de65360\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.471389 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09f74271-1a65-4b70-a927-a4dd7de65360-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"09f74271-1a65-4b70-a927-a4dd7de65360\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.471414 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/09f74271-1a65-4b70-a927-a4dd7de65360-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"09f74271-1a65-4b70-a927-a4dd7de65360\") " pod="openstack/ovsdbserver-nb-0" 
Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.471435 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9xnk\" (UniqueName: \"kubernetes.io/projected/09f74271-1a65-4b70-a927-a4dd7de65360-kube-api-access-d9xnk\") pod \"ovsdbserver-nb-0\" (UID: \"09f74271-1a65-4b70-a927-a4dd7de65360\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.471848 4779 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"09f74271-1a65-4b70-a927-a4dd7de65360\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/ovsdbserver-nb-0" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.471964 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09f74271-1a65-4b70-a927-a4dd7de65360-config\") pod \"ovsdbserver-nb-0\" (UID: \"09f74271-1a65-4b70-a927-a4dd7de65360\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.472071 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/09f74271-1a65-4b70-a927-a4dd7de65360-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"09f74271-1a65-4b70-a927-a4dd7de65360\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.472793 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09f74271-1a65-4b70-a927-a4dd7de65360-config\") pod \"ovsdbserver-nb-0\" (UID: \"09f74271-1a65-4b70-a927-a4dd7de65360\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.472825 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/09f74271-1a65-4b70-a927-a4dd7de65360-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"09f74271-1a65-4b70-a927-a4dd7de65360\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.472955 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/09f74271-1a65-4b70-a927-a4dd7de65360-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"09f74271-1a65-4b70-a927-a4dd7de65360\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.475993 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/09f74271-1a65-4b70-a927-a4dd7de65360-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"09f74271-1a65-4b70-a927-a4dd7de65360\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.476269 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09f74271-1a65-4b70-a927-a4dd7de65360-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"09f74271-1a65-4b70-a927-a4dd7de65360\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.478695 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/09f74271-1a65-4b70-a927-a4dd7de65360-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"09f74271-1a65-4b70-a927-a4dd7de65360\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.496037 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9xnk\" (UniqueName: \"kubernetes.io/projected/09f74271-1a65-4b70-a927-a4dd7de65360-kube-api-access-d9xnk\") pod \"ovsdbserver-nb-0\" (UID: 
\"09f74271-1a65-4b70-a927-a4dd7de65360\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.498635 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"09f74271-1a65-4b70-a927-a4dd7de65360\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:42:46 crc kubenswrapper[4779]: I0320 15:42:46.573860 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 15:42:50 crc kubenswrapper[4779]: I0320 15:42:50.323605 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 15:42:50 crc kubenswrapper[4779]: I0320 15:42:50.325572 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 15:42:50 crc kubenswrapper[4779]: I0320 15:42:50.328925 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 20 15:42:50 crc kubenswrapper[4779]: I0320 15:42:50.329353 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-tjbgf" Mar 20 15:42:50 crc kubenswrapper[4779]: I0320 15:42:50.329487 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 20 15:42:50 crc kubenswrapper[4779]: I0320 15:42:50.335292 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 20 15:42:50 crc kubenswrapper[4779]: I0320 15:42:50.343134 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 15:42:50 crc kubenswrapper[4779]: I0320 15:42:50.446234 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/baa96376-1415-4a97-853e-cde55a1d6860-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"baa96376-1415-4a97-853e-cde55a1d6860\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:42:50 crc kubenswrapper[4779]: I0320 15:42:50.446314 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/baa96376-1415-4a97-853e-cde55a1d6860-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"baa96376-1415-4a97-853e-cde55a1d6860\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:42:50 crc kubenswrapper[4779]: I0320 15:42:50.446337 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/baa96376-1415-4a97-853e-cde55a1d6860-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"baa96376-1415-4a97-853e-cde55a1d6860\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:42:50 crc kubenswrapper[4779]: I0320 15:42:50.446350 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baa96376-1415-4a97-853e-cde55a1d6860-config\") pod \"ovsdbserver-sb-0\" (UID: \"baa96376-1415-4a97-853e-cde55a1d6860\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:42:50 crc kubenswrapper[4779]: I0320 15:42:50.446379 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baa96376-1415-4a97-853e-cde55a1d6860-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"baa96376-1415-4a97-853e-cde55a1d6860\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:42:50 crc kubenswrapper[4779]: I0320 15:42:50.446401 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/baa96376-1415-4a97-853e-cde55a1d6860-ovsdbserver-sb-tls-certs\") pod 
\"ovsdbserver-sb-0\" (UID: \"baa96376-1415-4a97-853e-cde55a1d6860\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:42:50 crc kubenswrapper[4779]: I0320 15:42:50.446440 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"baa96376-1415-4a97-853e-cde55a1d6860\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:42:50 crc kubenswrapper[4779]: I0320 15:42:50.446485 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfcbf\" (UniqueName: \"kubernetes.io/projected/baa96376-1415-4a97-853e-cde55a1d6860-kube-api-access-sfcbf\") pod \"ovsdbserver-sb-0\" (UID: \"baa96376-1415-4a97-853e-cde55a1d6860\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:42:50 crc kubenswrapper[4779]: I0320 15:42:50.547847 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/baa96376-1415-4a97-853e-cde55a1d6860-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"baa96376-1415-4a97-853e-cde55a1d6860\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:42:50 crc kubenswrapper[4779]: I0320 15:42:50.547932 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/baa96376-1415-4a97-853e-cde55a1d6860-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"baa96376-1415-4a97-853e-cde55a1d6860\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:42:50 crc kubenswrapper[4779]: I0320 15:42:50.547953 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/baa96376-1415-4a97-853e-cde55a1d6860-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"baa96376-1415-4a97-853e-cde55a1d6860\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:42:50 crc kubenswrapper[4779]: 
I0320 15:42:50.547968 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baa96376-1415-4a97-853e-cde55a1d6860-config\") pod \"ovsdbserver-sb-0\" (UID: \"baa96376-1415-4a97-853e-cde55a1d6860\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:42:50 crc kubenswrapper[4779]: I0320 15:42:50.547994 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baa96376-1415-4a97-853e-cde55a1d6860-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"baa96376-1415-4a97-853e-cde55a1d6860\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:42:50 crc kubenswrapper[4779]: I0320 15:42:50.548018 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/baa96376-1415-4a97-853e-cde55a1d6860-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"baa96376-1415-4a97-853e-cde55a1d6860\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:42:50 crc kubenswrapper[4779]: I0320 15:42:50.548061 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"baa96376-1415-4a97-853e-cde55a1d6860\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:42:50 crc kubenswrapper[4779]: I0320 15:42:50.548130 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfcbf\" (UniqueName: \"kubernetes.io/projected/baa96376-1415-4a97-853e-cde55a1d6860-kube-api-access-sfcbf\") pod \"ovsdbserver-sb-0\" (UID: \"baa96376-1415-4a97-853e-cde55a1d6860\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:42:50 crc kubenswrapper[4779]: I0320 15:42:50.550895 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/baa96376-1415-4a97-853e-cde55a1d6860-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"baa96376-1415-4a97-853e-cde55a1d6860\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:42:50 crc kubenswrapper[4779]: I0320 15:42:50.551405 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baa96376-1415-4a97-853e-cde55a1d6860-config\") pod \"ovsdbserver-sb-0\" (UID: \"baa96376-1415-4a97-853e-cde55a1d6860\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:42:50 crc kubenswrapper[4779]: I0320 15:42:50.551677 4779 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"baa96376-1415-4a97-853e-cde55a1d6860\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-sb-0" Mar 20 15:42:50 crc kubenswrapper[4779]: I0320 15:42:50.560242 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/baa96376-1415-4a97-853e-cde55a1d6860-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"baa96376-1415-4a97-853e-cde55a1d6860\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:42:50 crc kubenswrapper[4779]: I0320 15:42:50.564994 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/baa96376-1415-4a97-853e-cde55a1d6860-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"baa96376-1415-4a97-853e-cde55a1d6860\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:42:50 crc kubenswrapper[4779]: I0320 15:42:50.565127 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baa96376-1415-4a97-853e-cde55a1d6860-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"baa96376-1415-4a97-853e-cde55a1d6860\") " pod="openstack/ovsdbserver-sb-0" Mar 20 
15:42:50 crc kubenswrapper[4779]: I0320 15:42:50.579205 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/baa96376-1415-4a97-853e-cde55a1d6860-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"baa96376-1415-4a97-853e-cde55a1d6860\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:42:50 crc kubenswrapper[4779]: I0320 15:42:50.587444 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"baa96376-1415-4a97-853e-cde55a1d6860\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:42:50 crc kubenswrapper[4779]: I0320 15:42:50.599443 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfcbf\" (UniqueName: \"kubernetes.io/projected/baa96376-1415-4a97-853e-cde55a1d6860-kube-api-access-sfcbf\") pod \"ovsdbserver-sb-0\" (UID: \"baa96376-1415-4a97-853e-cde55a1d6860\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:42:50 crc kubenswrapper[4779]: I0320 15:42:50.648076 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 15:42:57 crc kubenswrapper[4779]: I0320 15:42:55.150233 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:42:57 crc kubenswrapper[4779]: I0320 15:42:55.150666 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:43:05 crc kubenswrapper[4779]: E0320 15:43:05.558913 4779 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 15:43:05 crc kubenswrapper[4779]: E0320 15:43:05.559620 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hwh26,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-m68jj_openstack(7c73163f-7930-48aa-991a-e7e8910c069a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 15:43:05 crc kubenswrapper[4779]: E0320 15:43:05.560872 4779 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-m68jj" podUID="7c73163f-7930-48aa-991a-e7e8910c069a" Mar 20 15:43:05 crc kubenswrapper[4779]: E0320 15:43:05.884638 4779 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 15:43:05 crc kubenswrapper[4779]: E0320 15:43:05.885082 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bczzl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-dp74h_openstack(b6e253fe-fec4-49b3-a9a6-22a7da20f3bd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 15:43:05 crc kubenswrapper[4779]: E0320 15:43:05.887047 4779 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-dp74h" podUID="b6e253fe-fec4-49b3-a9a6-22a7da20f3bd" Mar 20 15:43:06 crc kubenswrapper[4779]: E0320 15:43:06.165911 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5ccc8479f9-m68jj" podUID="7c73163f-7930-48aa-991a-e7e8910c069a" Mar 20 15:43:06 crc kubenswrapper[4779]: E0320 15:43:06.166018 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-dp74h" podUID="b6e253fe-fec4-49b3-a9a6-22a7da20f3bd" Mar 20 15:43:06 crc kubenswrapper[4779]: I0320 15:43:06.459282 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 15:43:06 crc kubenswrapper[4779]: I0320 15:43:06.459329 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 20 15:43:06 crc kubenswrapper[4779]: I0320 15:43:06.463765 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 15:43:06 crc kubenswrapper[4779]: I0320 15:43:06.565184 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s9wt4"] Mar 20 15:43:06 crc kubenswrapper[4779]: I0320 15:43:06.569725 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 15:43:06 crc kubenswrapper[4779]: I0320 15:43:06.575522 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 15:43:06 crc kubenswrapper[4779]: 
W0320 15:43:06.577798 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49a61cbd_2e62_4e6a_8201_e4d0761885a7.slice/crio-57cfcf5ea6ef366b8211f4a7daf6bbffa9cffd716637322653e9459fed943e02 WatchSource:0}: Error finding container 57cfcf5ea6ef366b8211f4a7daf6bbffa9cffd716637322653e9459fed943e02: Status 404 returned error can't find the container with id 57cfcf5ea6ef366b8211f4a7daf6bbffa9cffd716637322653e9459fed943e02 Mar 20 15:43:06 crc kubenswrapper[4779]: I0320 15:43:06.584743 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 20 15:43:06 crc kubenswrapper[4779]: E0320 15:43:06.600678 4779 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 15:43:06 crc kubenswrapper[4779]: E0320 15:43:06.600838 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s5zwp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-csvht_openstack(0beff5b4-9b70-48b2-85df-85a3800232c8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 15:43:06 crc kubenswrapper[4779]: E0320 15:43:06.602030 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-csvht" podUID="0beff5b4-9b70-48b2-85df-85a3800232c8" Mar 20 15:43:06 crc kubenswrapper[4779]: I0320 15:43:06.705740 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 15:43:06 crc kubenswrapper[4779]: E0320 15:43:06.857395 4779 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 15:43:06 crc kubenswrapper[4779]: E0320 15:43:06.857583 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xnpwm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-j5jxp_openstack(3bb71724-d533-4ba4-97f5-2092d5b9bf37): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 15:43:06 crc kubenswrapper[4779]: E0320 15:43:06.859006 4779 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-j5jxp" podUID="3bb71724-d533-4ba4-97f5-2092d5b9bf37" Mar 20 15:43:06 crc kubenswrapper[4779]: I0320 15:43:06.872664 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 15:43:06 crc kubenswrapper[4779]: W0320 15:43:06.881307 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3016debc_9603_417f_8ff1_6fd3934cd17e.slice/crio-4e62fbafb643ea8b3a0486879797d57107a30191f6fbbcbcf6812570aaf83e42 WatchSource:0}: Error finding container 4e62fbafb643ea8b3a0486879797d57107a30191f6fbbcbcf6812570aaf83e42: Status 404 returned error can't find the container with id 4e62fbafb643ea8b3a0486879797d57107a30191f6fbbcbcf6812570aaf83e42 Mar 20 15:43:06 crc kubenswrapper[4779]: I0320 15:43:06.895632 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-bvnhn"] Mar 20 15:43:06 crc kubenswrapper[4779]: W0320 15:43:06.901094 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2eeb0ff9_1bf6_4e5b_a11f_c3d85df38cb3.slice/crio-056139c22914cda3b95d8491aa8cf2325d01d5e5e5e3a4d5af50eebbb9b65082 WatchSource:0}: Error finding container 056139c22914cda3b95d8491aa8cf2325d01d5e5e5e3a4d5af50eebbb9b65082: Status 404 returned error can't find the container with id 056139c22914cda3b95d8491aa8cf2325d01d5e5e5e3a4d5af50eebbb9b65082 Mar 20 15:43:06 crc kubenswrapper[4779]: I0320 15:43:06.951262 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 15:43:06 crc kubenswrapper[4779]: W0320 15:43:06.955946 4779 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbaa96376_1415_4a97_853e_cde55a1d6860.slice/crio-5b24cfee570d0a22d94b201eb997d247b05a10ed32c577808abd053bf14fbc80 WatchSource:0}: Error finding container 5b24cfee570d0a22d94b201eb997d247b05a10ed32c577808abd053bf14fbc80: Status 404 returned error can't find the container with id 5b24cfee570d0a22d94b201eb997d247b05a10ed32c577808abd053bf14fbc80 Mar 20 15:43:07 crc kubenswrapper[4779]: I0320 15:43:07.178724 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s9wt4" event={"ID":"73a3debf-8de1-4321-b383-5bebca909a38","Type":"ContainerStarted","Data":"6216f55ff858cb81247a4ada2de960a17e00ee3624b7083ca3c853d4a00e525a"} Mar 20 15:43:07 crc kubenswrapper[4779]: I0320 15:43:07.180711 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"0bb35660-3a34-4c16-a943-4375cfe12246","Type":"ContainerStarted","Data":"30948daef2373e2d2a7b1f4622e44fdfa819385cff8305c8e84158ae0b84f014"} Mar 20 15:43:07 crc kubenswrapper[4779]: I0320 15:43:07.183552 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"09f74271-1a65-4b70-a927-a4dd7de65360","Type":"ContainerStarted","Data":"128b707abbfa3710740c623fac7d6eaf69f9bd2801061f501834497150650cfb"} Mar 20 15:43:07 crc kubenswrapper[4779]: I0320 15:43:07.184995 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"49a61cbd-2e62-4e6a-8201-e4d0761885a7","Type":"ContainerStarted","Data":"57cfcf5ea6ef366b8211f4a7daf6bbffa9cffd716637322653e9459fed943e02"} Mar 20 15:43:07 crc kubenswrapper[4779]: I0320 15:43:07.188102 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"957258f0-7df6-4c8e-9474-8693ab779860","Type":"ContainerStarted","Data":"6923b2c0b4ecef57d4d49f382dab0545d2edf662dba6fa744fd31d1ed5fd0de9"} Mar 20 15:43:07 crc kubenswrapper[4779]: I0320 
15:43:07.188889 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bvnhn" event={"ID":"2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3","Type":"ContainerStarted","Data":"056139c22914cda3b95d8491aa8cf2325d01d5e5e5e3a4d5af50eebbb9b65082"} Mar 20 15:43:07 crc kubenswrapper[4779]: I0320 15:43:07.190027 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9b8436ea-e4a7-4a6b-a6a2-d9282bda9696","Type":"ContainerStarted","Data":"f34e84adde4b8992177bee3f64690b7e6d201855f03f0e48231984ebc70f14f6"} Mar 20 15:43:07 crc kubenswrapper[4779]: I0320 15:43:07.191286 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f74af0a5-e3c3-4569-bece-db2e25e9b79d","Type":"ContainerStarted","Data":"27cd622489a45c7dd259e27831a13dbd7bf4ffa9c19ccda9387d040940bd939c"} Mar 20 15:43:07 crc kubenswrapper[4779]: I0320 15:43:07.192554 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3016debc-9603-417f-8ff1-6fd3934cd17e","Type":"ContainerStarted","Data":"4e62fbafb643ea8b3a0486879797d57107a30191f6fbbcbcf6812570aaf83e42"} Mar 20 15:43:07 crc kubenswrapper[4779]: I0320 15:43:07.193814 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e93aeb14-abaa-4cff-81c0-82d579020dc6","Type":"ContainerStarted","Data":"999146a747f98d6e3aa4905c0625210b77738d902368284ef54fc90d1a0c2a6a"} Mar 20 15:43:07 crc kubenswrapper[4779]: I0320 15:43:07.195464 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"baa96376-1415-4a97-853e-cde55a1d6860","Type":"ContainerStarted","Data":"5b24cfee570d0a22d94b201eb997d247b05a10ed32c577808abd053bf14fbc80"} Mar 20 15:43:08 crc kubenswrapper[4779]: I0320 15:43:08.121825 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-csvht" Mar 20 15:43:08 crc kubenswrapper[4779]: I0320 15:43:08.140034 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-j5jxp" Mar 20 15:43:08 crc kubenswrapper[4779]: I0320 15:43:08.170144 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bb71724-d533-4ba4-97f5-2092d5b9bf37-dns-svc\") pod \"3bb71724-d533-4ba4-97f5-2092d5b9bf37\" (UID: \"3bb71724-d533-4ba4-97f5-2092d5b9bf37\") " Mar 20 15:43:08 crc kubenswrapper[4779]: I0320 15:43:08.170258 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5zwp\" (UniqueName: \"kubernetes.io/projected/0beff5b4-9b70-48b2-85df-85a3800232c8-kube-api-access-s5zwp\") pod \"0beff5b4-9b70-48b2-85df-85a3800232c8\" (UID: \"0beff5b4-9b70-48b2-85df-85a3800232c8\") " Mar 20 15:43:08 crc kubenswrapper[4779]: I0320 15:43:08.170370 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0beff5b4-9b70-48b2-85df-85a3800232c8-config\") pod \"0beff5b4-9b70-48b2-85df-85a3800232c8\" (UID: \"0beff5b4-9b70-48b2-85df-85a3800232c8\") " Mar 20 15:43:08 crc kubenswrapper[4779]: I0320 15:43:08.170403 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bb71724-d533-4ba4-97f5-2092d5b9bf37-config\") pod \"3bb71724-d533-4ba4-97f5-2092d5b9bf37\" (UID: \"3bb71724-d533-4ba4-97f5-2092d5b9bf37\") " Mar 20 15:43:08 crc kubenswrapper[4779]: I0320 15:43:08.170450 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnpwm\" (UniqueName: \"kubernetes.io/projected/3bb71724-d533-4ba4-97f5-2092d5b9bf37-kube-api-access-xnpwm\") pod \"3bb71724-d533-4ba4-97f5-2092d5b9bf37\" (UID: 
\"3bb71724-d533-4ba4-97f5-2092d5b9bf37\") " Mar 20 15:43:08 crc kubenswrapper[4779]: I0320 15:43:08.170888 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bb71724-d533-4ba4-97f5-2092d5b9bf37-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3bb71724-d533-4ba4-97f5-2092d5b9bf37" (UID: "3bb71724-d533-4ba4-97f5-2092d5b9bf37"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:43:08 crc kubenswrapper[4779]: I0320 15:43:08.170960 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0beff5b4-9b70-48b2-85df-85a3800232c8-config" (OuterVolumeSpecName: "config") pod "0beff5b4-9b70-48b2-85df-85a3800232c8" (UID: "0beff5b4-9b70-48b2-85df-85a3800232c8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:43:08 crc kubenswrapper[4779]: I0320 15:43:08.170988 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bb71724-d533-4ba4-97f5-2092d5b9bf37-config" (OuterVolumeSpecName: "config") pod "3bb71724-d533-4ba4-97f5-2092d5b9bf37" (UID: "3bb71724-d533-4ba4-97f5-2092d5b9bf37"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:43:08 crc kubenswrapper[4779]: I0320 15:43:08.176774 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bb71724-d533-4ba4-97f5-2092d5b9bf37-kube-api-access-xnpwm" (OuterVolumeSpecName: "kube-api-access-xnpwm") pod "3bb71724-d533-4ba4-97f5-2092d5b9bf37" (UID: "3bb71724-d533-4ba4-97f5-2092d5b9bf37"). InnerVolumeSpecName "kube-api-access-xnpwm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:43:08 crc kubenswrapper[4779]: I0320 15:43:08.177199 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0beff5b4-9b70-48b2-85df-85a3800232c8-kube-api-access-s5zwp" (OuterVolumeSpecName: "kube-api-access-s5zwp") pod "0beff5b4-9b70-48b2-85df-85a3800232c8" (UID: "0beff5b4-9b70-48b2-85df-85a3800232c8"). InnerVolumeSpecName "kube-api-access-s5zwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:43:08 crc kubenswrapper[4779]: I0320 15:43:08.207660 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-csvht" event={"ID":"0beff5b4-9b70-48b2-85df-85a3800232c8","Type":"ContainerDied","Data":"f7a4ef2f0fed1cc6a98b188303cb5f82309b911ce7b10264082c82e1cb1271fb"} Mar 20 15:43:08 crc kubenswrapper[4779]: I0320 15:43:08.207768 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-csvht" Mar 20 15:43:08 crc kubenswrapper[4779]: I0320 15:43:08.214649 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-j5jxp" event={"ID":"3bb71724-d533-4ba4-97f5-2092d5b9bf37","Type":"ContainerDied","Data":"7800a1339a1d945f1e29f90bb243df5b46c080320f7a4cb1c8b1a64d572b38d5"} Mar 20 15:43:08 crc kubenswrapper[4779]: I0320 15:43:08.214734 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-j5jxp" Mar 20 15:43:08 crc kubenswrapper[4779]: I0320 15:43:08.272131 4779 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bb71724-d533-4ba4-97f5-2092d5b9bf37-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 15:43:08 crc kubenswrapper[4779]: I0320 15:43:08.272160 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5zwp\" (UniqueName: \"kubernetes.io/projected/0beff5b4-9b70-48b2-85df-85a3800232c8-kube-api-access-s5zwp\") on node \"crc\" DevicePath \"\"" Mar 20 15:43:08 crc kubenswrapper[4779]: I0320 15:43:08.272171 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0beff5b4-9b70-48b2-85df-85a3800232c8-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:43:08 crc kubenswrapper[4779]: I0320 15:43:08.272179 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bb71724-d533-4ba4-97f5-2092d5b9bf37-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:43:08 crc kubenswrapper[4779]: I0320 15:43:08.272187 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnpwm\" (UniqueName: \"kubernetes.io/projected/3bb71724-d533-4ba4-97f5-2092d5b9bf37-kube-api-access-xnpwm\") on node \"crc\" DevicePath \"\"" Mar 20 15:43:08 crc kubenswrapper[4779]: I0320 15:43:08.279636 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-csvht"] Mar 20 15:43:08 crc kubenswrapper[4779]: I0320 15:43:08.291396 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-csvht"] Mar 20 15:43:08 crc kubenswrapper[4779]: I0320 15:43:08.314675 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-j5jxp"] Mar 20 15:43:08 crc kubenswrapper[4779]: I0320 15:43:08.320830 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-78dd6ddcc-j5jxp"] Mar 20 15:43:09 crc kubenswrapper[4779]: I0320 15:43:09.819423 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0beff5b4-9b70-48b2-85df-85a3800232c8" path="/var/lib/kubelet/pods/0beff5b4-9b70-48b2-85df-85a3800232c8/volumes" Mar 20 15:43:09 crc kubenswrapper[4779]: I0320 15:43:09.819850 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bb71724-d533-4ba4-97f5-2092d5b9bf37" path="/var/lib/kubelet/pods/3bb71724-d533-4ba4-97f5-2092d5b9bf37/volumes" Mar 20 15:43:23 crc kubenswrapper[4779]: E0320 15:43:23.895563 4779 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Mar 20 15:43:23 crc kubenswrapper[4779]: E0320 15:43:23.896416 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9lhzq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(9b8436ea-e4a7-4a6b-a6a2-d9282bda9696): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 15:43:23 crc kubenswrapper[4779]: E0320 15:43:23.897626 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="9b8436ea-e4a7-4a6b-a6a2-d9282bda9696" Mar 20 15:43:24 crc kubenswrapper[4779]: E0320 15:43:24.228247 4779 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified" Mar 20 15:43:24 crc kubenswrapper[4779]: E0320 15:43:24.228420 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovn-controller,Image:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,Command:[ovn-controller --pidfile unix:/run/openvswitch/db.sock --certificate=/etc/pki/tls/certs/ovndb.crt --private-key=/etc/pki/tls/private/ovndb.key 
--ca-cert=/etc/pki/tls/certs/ovndbca.crt],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n65ch5bbh96h56bh78h58fh64fh564h96h74h5bfh6dh6dhd4h56fhddh677h5bfh67h69h56fh65ch588h4h6h648h646h566h584h57h5c7h648q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run-ovn,ReadOnly:false,MountPath:/var/run/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log-ovn,ReadOnly:false,MountPath:/var/log/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ths59,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_liveness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil
,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_readiness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/share/ovn/scripts/ovn-ctl stop_controller],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-s9wt4_openstack(73a3debf-8de1-4321-b383-5bebca909a38): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 15:43:24 crc kubenswrapper[4779]: E0320 15:43:24.230522 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-s9wt4" podUID="73a3debf-8de1-4321-b383-5bebca909a38" Mar 20 15:43:24 crc kubenswrapper[4779]: E0320 15:43:24.332749 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="9b8436ea-e4a7-4a6b-a6a2-d9282bda9696" Mar 20 15:43:24 crc kubenswrapper[4779]: E0320 15:43:24.333062 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified\\\"\"" pod="openstack/ovn-controller-s9wt4" podUID="73a3debf-8de1-4321-b383-5bebca909a38" Mar 20 15:43:24 crc kubenswrapper[4779]: E0320 15:43:24.551778 4779 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:e4412f5688c9725f36d2f566f624d82a1a2a5b957686245fd2defcc39604bdc2" Mar 20 15:43:24 crc kubenswrapper[4779]: E0320 15:43:24.551963 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init-config-reloader,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:e4412f5688c9725f36d2f566f624d82a1a2a5b957686245fd2defcc39604bdc2,Command:[/bin/prometheus-config-reloader],Args:[--watch-interval=0 --listen-address=:8081 --config-file=/etc/prometheus/config/prometheus.yaml.gz --config-envsubst-file=/etc/prometheus/config_out/prometheus.env.yaml --watched-dir=/etc/prometheus/rules/prometheus-metric-storage-rulefiles-0 --watched-dir=/etc/prometheus/rules/prometheus-metric-storage-rulefiles-1 
--watched-dir=/etc/prometheus/rules/prometheus-metric-storage-rulefiles-2],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:reloader-init,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:SHARD,Value:0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/prometheus/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-out,ReadOnly:false,MountPath:/etc/prometheus/config_out,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-rulefiles-0,ReadOnly:false,MountPath:/etc/prometheus/rules/prometheus-metric-storage-rulefiles-0,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-rulefiles-1,ReadOnly:false,MountPath:/etc/prometheus/rules/prometheus-metric-storage-rulefiles-1,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-rulefiles-2,ReadOnly:false,MountPath:/etc/prometheus/rules/prometheus-metric-storage-rulefiles-2,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2cg4r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],
},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod prometheus-metric-storage-0_openstack(e93aeb14-abaa-4cff-81c0-82d579020dc6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 15:43:24 crc kubenswrapper[4779]: E0320 15:43:24.553675 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init-config-reloader\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/prometheus-metric-storage-0" podUID="e93aeb14-abaa-4cff-81c0-82d579020dc6" Mar 20 15:43:24 crc kubenswrapper[4779]: E0320 15:43:24.557656 4779 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified" Mar 20 15:43:24 crc kubenswrapper[4779]: E0320 15:43:24.557908 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:ovsdb-server-init,Image:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,Command:[/usr/local/bin/container-scripts/init-ovsdb-server.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n65ch5bbh96h56bh78h58fh64fh564h96h74h5bfh6dh6dhd4h56fhddh677h5bfh67h69h56fh65ch588h4h6h648h646h566h584h57h5c7h648q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-ovs,ReadOnly:false,MountPath:/etc/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log,ReadOnly:false,MountPath:/var/log/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-lib,ReadOnly:false,MountPath:/var/lib/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m7zgc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN 
SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-ovs-bvnhn_openstack(2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 15:43:24 crc kubenswrapper[4779]: E0320 15:43:24.559215 4779 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Mar 20 15:43:24 crc kubenswrapper[4779]: E0320 15:43:24.559237 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdb-server-init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-ovs-bvnhn" podUID="2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3" Mar 20 15:43:24 crc kubenswrapper[4779]: E0320 15:43:24.559375 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 
30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-flbbj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerR
esizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(3016debc-9603-417f-8ff1-6fd3934cd17e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 15:43:24 crc kubenswrapper[4779]: E0320 15:43:24.562250 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="3016debc-9603-417f-8ff1-6fd3934cd17e" Mar 20 15:43:25 crc kubenswrapper[4779]: I0320 15:43:25.149777 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:43:25 crc kubenswrapper[4779]: I0320 15:43:25.149835 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:43:25 crc kubenswrapper[4779]: E0320 15:43:25.340554 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdb-server-init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified\\\"\"" pod="openstack/ovn-controller-ovs-bvnhn" podUID="2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3" Mar 20 15:43:25 crc kubenswrapper[4779]: E0320 15:43:25.340643 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init-config-reloader\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:e4412f5688c9725f36d2f566f624d82a1a2a5b957686245fd2defcc39604bdc2\\\"\"" pod="openstack/prometheus-metric-storage-0" podUID="e93aeb14-abaa-4cff-81c0-82d579020dc6" Mar 20 15:43:25 crc kubenswrapper[4779]: E0320 15:43:25.341013 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="3016debc-9603-417f-8ff1-6fd3934cd17e" Mar 20 15:43:25 crc kubenswrapper[4779]: E0320 15:43:25.350082 4779 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified" Mar 20 15:43:25 crc kubenswrapper[4779]: E0320 15:43:25.350367 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:ovsdbserver-nb,Image:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5f7h57dh5f7h5bh65bh65bhbch55bhddh97hd5h6ch9dh6h5d4h56ch57h584h75h54fh5bdhdfh644h74h56ch666h598h559h559h56h59bh5cdq,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-nb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d9xnk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction
{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-nb-0_openstack(09f74271-1a65-4b70-a927-a4dd7de65360): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 15:43:25 crc kubenswrapper[4779]: E0320 15:43:25.705123 4779 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Mar 20 15:43:25 crc kubenswrapper[4779]: 
E0320 15:43:25.705277 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fxq9d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},Termination
MessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-cell1-galera-0_openstack(0bb35660-3a34-4c16-a943-4375cfe12246): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 15:43:25 crc kubenswrapper[4779]: E0320 15:43:25.706446 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="0bb35660-3a34-4c16-a943-4375cfe12246" Mar 20 15:43:26 crc kubenswrapper[4779]: E0320 15:43:25.999889 4779 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified" Mar 20 15:43:26 crc kubenswrapper[4779]: E0320 15:43:26.000407 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:ovsdbserver-sb,Image:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n55bh58ch5f6hd6hc9h5f9h5fdhbbh7fh584h588h5ch5ch58bh68h5cchf4h8fh5f8hf5h54h587h5d5h69h67dhfch66h669hcch646h547h65fq,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-sb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sfcbf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{
Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-sb-0_openstack(baa96376-1415-4a97-853e-cde55a1d6860): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 15:43:26 crc kubenswrapper[4779]: E0320 15:43:26.345354 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" 
pod="openstack/openstack-cell1-galera-0" podUID="0bb35660-3a34-4c16-a943-4375cfe12246" Mar 20 15:43:27 crc kubenswrapper[4779]: E0320 15:43:27.507401 4779 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Mar 20 15:43:27 crc kubenswrapper[4779]: E0320 15:43:27.507764 4779 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Mar 20 15:43:27 crc kubenswrapper[4779]: E0320 15:43:27.507914 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rdzn2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(49a61cbd-2e62-4e6a-8201-e4d0761885a7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 15:43:27 crc kubenswrapper[4779]: E0320 15:43:27.509082 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="49a61cbd-2e62-4e6a-8201-e4d0761885a7" Mar 20 15:43:28 crc kubenswrapper[4779]: E0320 15:43:28.358081 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="49a61cbd-2e62-4e6a-8201-e4d0761885a7" Mar 20 15:43:29 crc kubenswrapper[4779]: E0320 15:43:29.986739 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"ovsdbserver-nb\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovsdbserver-nb-0" podUID="09f74271-1a65-4b70-a927-a4dd7de65360" Mar 20 15:43:29 crc kubenswrapper[4779]: E0320 15:43:29.987638 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovsdbserver-sb-0" podUID="baa96376-1415-4a97-853e-cde55a1d6860" Mar 20 15:43:30 crc kubenswrapper[4779]: I0320 15:43:30.378170 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"957258f0-7df6-4c8e-9474-8693ab779860","Type":"ContainerStarted","Data":"209033acd75a3df57cbe04b28db634731c764b4645bfccf61aa56cdcf3eed71d"} Mar 20 15:43:30 crc kubenswrapper[4779]: I0320 15:43:30.378649 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 20 15:43:30 crc kubenswrapper[4779]: I0320 15:43:30.383023 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"baa96376-1415-4a97-853e-cde55a1d6860","Type":"ContainerStarted","Data":"d4476c65650a8f0543f4029d4c7d0a9dd0fa1fa916142c513f57221a75618c23"} Mar 20 15:43:30 crc kubenswrapper[4779]: E0320 15:43:30.386698 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-sb-0" podUID="baa96376-1415-4a97-853e-cde55a1d6860" Mar 20 15:43:30 crc kubenswrapper[4779]: I0320 15:43:30.387057 4779 generic.go:334] "Generic (PLEG): container finished" podID="b6e253fe-fec4-49b3-a9a6-22a7da20f3bd" containerID="fed20c0f0597299f0423fd2aae2f41d96a06471814a8ecdcdf2df637e5d427f5" exitCode=0 Mar 20 15:43:30 crc 
kubenswrapper[4779]: I0320 15:43:30.387192 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-dp74h" event={"ID":"b6e253fe-fec4-49b3-a9a6-22a7da20f3bd","Type":"ContainerDied","Data":"fed20c0f0597299f0423fd2aae2f41d96a06471814a8ecdcdf2df637e5d427f5"} Mar 20 15:43:30 crc kubenswrapper[4779]: I0320 15:43:30.392434 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"09f74271-1a65-4b70-a927-a4dd7de65360","Type":"ContainerStarted","Data":"da4bb12e4fa16ad87ff1b66f1c8afdb7aa8eb45cc9650d2b8dfe3a9ec564e6b2"} Mar 20 15:43:30 crc kubenswrapper[4779]: E0320 15:43:30.394089 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-nb-0" podUID="09f74271-1a65-4b70-a927-a4dd7de65360" Mar 20 15:43:30 crc kubenswrapper[4779]: I0320 15:43:30.394601 4779 generic.go:334] "Generic (PLEG): container finished" podID="7c73163f-7930-48aa-991a-e7e8910c069a" containerID="07005395d72f51c5edf56fa4bc257c7d98e5fddb1e60330dcc032bc177b8b97d" exitCode=0 Mar 20 15:43:30 crc kubenswrapper[4779]: I0320 15:43:30.394649 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-m68jj" event={"ID":"7c73163f-7930-48aa-991a-e7e8910c069a","Type":"ContainerDied","Data":"07005395d72f51c5edf56fa4bc257c7d98e5fddb1e60330dcc032bc177b8b97d"} Mar 20 15:43:30 crc kubenswrapper[4779]: I0320 15:43:30.407704 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=31.207520436 podStartE2EDuration="50.407680552s" podCreationTimestamp="2026-03-20 15:42:40 +0000 UTC" firstStartedPulling="2026-03-20 15:43:06.508870165 +0000 UTC m=+1203.471385965" lastFinishedPulling="2026-03-20 15:43:25.709030281 +0000 UTC 
m=+1222.671546081" observedRunningTime="2026-03-20 15:43:30.39637767 +0000 UTC m=+1227.358893510" watchObservedRunningTime="2026-03-20 15:43:30.407680552 +0000 UTC m=+1227.370196382" Mar 20 15:43:31 crc kubenswrapper[4779]: I0320 15:43:31.402440 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f74af0a5-e3c3-4569-bece-db2e25e9b79d","Type":"ContainerStarted","Data":"3c579d0a88bc2e1597d50784bc3a129d5a5fa3b225de05abb1902d00d8e6ecfc"} Mar 20 15:43:31 crc kubenswrapper[4779]: I0320 15:43:31.405402 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-dp74h" event={"ID":"b6e253fe-fec4-49b3-a9a6-22a7da20f3bd","Type":"ContainerStarted","Data":"2e34e237c316ed1ce8b188a02c2057939f604f87626d200c86bed0b0f8f6cbb2"} Mar 20 15:43:31 crc kubenswrapper[4779]: I0320 15:43:31.405812 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-dp74h" Mar 20 15:43:31 crc kubenswrapper[4779]: I0320 15:43:31.408797 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-m68jj" event={"ID":"7c73163f-7930-48aa-991a-e7e8910c069a","Type":"ContainerStarted","Data":"98f00b54cfbeff82cef36ec657ceeb13bbf8699437f9a94cd8c37b4cef16f87c"} Mar 20 15:43:31 crc kubenswrapper[4779]: I0320 15:43:31.409219 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-m68jj" Mar 20 15:43:31 crc kubenswrapper[4779]: E0320 15:43:31.410604 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-sb-0" podUID="baa96376-1415-4a97-853e-cde55a1d6860" Mar 20 15:43:31 crc kubenswrapper[4779]: E0320 15:43:31.411048 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-nb-0" podUID="09f74271-1a65-4b70-a927-a4dd7de65360" Mar 20 15:43:31 crc kubenswrapper[4779]: I0320 15:43:31.511480 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-m68jj" podStartSLOduration=3.716942914 podStartE2EDuration="55.511459953s" podCreationTimestamp="2026-03-20 15:42:36 +0000 UTC" firstStartedPulling="2026-03-20 15:42:37.432735469 +0000 UTC m=+1174.395251269" lastFinishedPulling="2026-03-20 15:43:29.227252508 +0000 UTC m=+1226.189768308" observedRunningTime="2026-03-20 15:43:31.508926251 +0000 UTC m=+1228.471442061" watchObservedRunningTime="2026-03-20 15:43:31.511459953 +0000 UTC m=+1228.473975743" Mar 20 15:43:31 crc kubenswrapper[4779]: I0320 15:43:31.536750 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-dp74h" podStartSLOduration=3.569469533 podStartE2EDuration="55.536732494s" podCreationTimestamp="2026-03-20 15:42:36 +0000 UTC" firstStartedPulling="2026-03-20 15:42:37.497664648 +0000 UTC m=+1174.460180448" lastFinishedPulling="2026-03-20 15:43:29.464927609 +0000 UTC m=+1226.427443409" observedRunningTime="2026-03-20 15:43:31.532399686 +0000 UTC m=+1228.494915486" watchObservedRunningTime="2026-03-20 15:43:31.536732494 +0000 UTC m=+1228.499248294" Mar 20 15:43:36 crc kubenswrapper[4779]: I0320 15:43:36.101079 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 20 15:43:36 crc kubenswrapper[4779]: I0320 15:43:36.830261 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ccc8479f9-m68jj" Mar 20 15:43:37 crc kubenswrapper[4779]: I0320 15:43:37.141374 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-57d769cc4f-dp74h" Mar 20 15:43:37 crc kubenswrapper[4779]: I0320 15:43:37.191883 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-m68jj"] Mar 20 15:43:37 crc kubenswrapper[4779]: I0320 15:43:37.452651 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-m68jj" podUID="7c73163f-7930-48aa-991a-e7e8910c069a" containerName="dnsmasq-dns" containerID="cri-o://98f00b54cfbeff82cef36ec657ceeb13bbf8699437f9a94cd8c37b4cef16f87c" gracePeriod=10 Mar 20 15:43:37 crc kubenswrapper[4779]: I0320 15:43:37.890590 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-m68jj" Mar 20 15:43:37 crc kubenswrapper[4779]: I0320 15:43:37.945980 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c73163f-7930-48aa-991a-e7e8910c069a-config\") pod \"7c73163f-7930-48aa-991a-e7e8910c069a\" (UID: \"7c73163f-7930-48aa-991a-e7e8910c069a\") " Mar 20 15:43:37 crc kubenswrapper[4779]: I0320 15:43:37.946156 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c73163f-7930-48aa-991a-e7e8910c069a-dns-svc\") pod \"7c73163f-7930-48aa-991a-e7e8910c069a\" (UID: \"7c73163f-7930-48aa-991a-e7e8910c069a\") " Mar 20 15:43:37 crc kubenswrapper[4779]: I0320 15:43:37.946288 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwh26\" (UniqueName: \"kubernetes.io/projected/7c73163f-7930-48aa-991a-e7e8910c069a-kube-api-access-hwh26\") pod \"7c73163f-7930-48aa-991a-e7e8910c069a\" (UID: \"7c73163f-7930-48aa-991a-e7e8910c069a\") " Mar 20 15:43:37 crc kubenswrapper[4779]: I0320 15:43:37.950876 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7c73163f-7930-48aa-991a-e7e8910c069a-kube-api-access-hwh26" (OuterVolumeSpecName: "kube-api-access-hwh26") pod "7c73163f-7930-48aa-991a-e7e8910c069a" (UID: "7c73163f-7930-48aa-991a-e7e8910c069a"). InnerVolumeSpecName "kube-api-access-hwh26". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:43:37 crc kubenswrapper[4779]: I0320 15:43:37.981341 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c73163f-7930-48aa-991a-e7e8910c069a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7c73163f-7930-48aa-991a-e7e8910c069a" (UID: "7c73163f-7930-48aa-991a-e7e8910c069a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:43:37 crc kubenswrapper[4779]: I0320 15:43:37.981953 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c73163f-7930-48aa-991a-e7e8910c069a-config" (OuterVolumeSpecName: "config") pod "7c73163f-7930-48aa-991a-e7e8910c069a" (UID: "7c73163f-7930-48aa-991a-e7e8910c069a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:43:38 crc kubenswrapper[4779]: I0320 15:43:38.048619 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwh26\" (UniqueName: \"kubernetes.io/projected/7c73163f-7930-48aa-991a-e7e8910c069a-kube-api-access-hwh26\") on node \"crc\" DevicePath \"\"" Mar 20 15:43:38 crc kubenswrapper[4779]: I0320 15:43:38.048873 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c73163f-7930-48aa-991a-e7e8910c069a-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:43:38 crc kubenswrapper[4779]: I0320 15:43:38.048984 4779 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c73163f-7930-48aa-991a-e7e8910c069a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 15:43:38 crc kubenswrapper[4779]: I0320 15:43:38.460644 4779 generic.go:334] "Generic (PLEG): container finished" podID="7c73163f-7930-48aa-991a-e7e8910c069a" containerID="98f00b54cfbeff82cef36ec657ceeb13bbf8699437f9a94cd8c37b4cef16f87c" exitCode=0 Mar 20 15:43:38 crc kubenswrapper[4779]: I0320 15:43:38.460960 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-m68jj" event={"ID":"7c73163f-7930-48aa-991a-e7e8910c069a","Type":"ContainerDied","Data":"98f00b54cfbeff82cef36ec657ceeb13bbf8699437f9a94cd8c37b4cef16f87c"} Mar 20 15:43:38 crc kubenswrapper[4779]: I0320 15:43:38.461024 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-m68jj" Mar 20 15:43:38 crc kubenswrapper[4779]: I0320 15:43:38.461137 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-m68jj" event={"ID":"7c73163f-7930-48aa-991a-e7e8910c069a","Type":"ContainerDied","Data":"356dd8fd11cb9951f200e2a036b91d86c99f146c266dd3590a143ab689c72e70"} Mar 20 15:43:38 crc kubenswrapper[4779]: I0320 15:43:38.461187 4779 scope.go:117] "RemoveContainer" containerID="98f00b54cfbeff82cef36ec657ceeb13bbf8699437f9a94cd8c37b4cef16f87c" Mar 20 15:43:38 crc kubenswrapper[4779]: I0320 15:43:38.465650 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s9wt4" event={"ID":"73a3debf-8de1-4321-b383-5bebca909a38","Type":"ContainerStarted","Data":"c7606c7a629e46e5c87a01f5ad0c11532dd7b21f025d8e4810fd0b5beee87735"} Mar 20 15:43:38 crc kubenswrapper[4779]: I0320 15:43:38.465875 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-s9wt4" Mar 20 15:43:38 crc kubenswrapper[4779]: I0320 15:43:38.484094 4779 scope.go:117] "RemoveContainer" containerID="07005395d72f51c5edf56fa4bc257c7d98e5fddb1e60330dcc032bc177b8b97d" Mar 20 15:43:38 crc kubenswrapper[4779]: I0320 15:43:38.501948 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-s9wt4" podStartSLOduration=22.747227394 podStartE2EDuration="53.501927128s" podCreationTimestamp="2026-03-20 15:42:45 +0000 UTC" firstStartedPulling="2026-03-20 15:43:06.560753899 +0000 UTC m=+1203.523269699" lastFinishedPulling="2026-03-20 15:43:37.315453633 +0000 UTC m=+1234.277969433" observedRunningTime="2026-03-20 15:43:38.486670999 +0000 UTC m=+1235.449186799" watchObservedRunningTime="2026-03-20 15:43:38.501927128 +0000 UTC m=+1235.464442928" Mar 20 15:43:38 crc kubenswrapper[4779]: I0320 15:43:38.504122 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-m68jj"] 
Mar 20 15:43:38 crc kubenswrapper[4779]: I0320 15:43:38.509549 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-m68jj"] Mar 20 15:43:38 crc kubenswrapper[4779]: I0320 15:43:38.621904 4779 scope.go:117] "RemoveContainer" containerID="98f00b54cfbeff82cef36ec657ceeb13bbf8699437f9a94cd8c37b4cef16f87c" Mar 20 15:43:38 crc kubenswrapper[4779]: E0320 15:43:38.622406 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98f00b54cfbeff82cef36ec657ceeb13bbf8699437f9a94cd8c37b4cef16f87c\": container with ID starting with 98f00b54cfbeff82cef36ec657ceeb13bbf8699437f9a94cd8c37b4cef16f87c not found: ID does not exist" containerID="98f00b54cfbeff82cef36ec657ceeb13bbf8699437f9a94cd8c37b4cef16f87c" Mar 20 15:43:38 crc kubenswrapper[4779]: I0320 15:43:38.622445 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98f00b54cfbeff82cef36ec657ceeb13bbf8699437f9a94cd8c37b4cef16f87c"} err="failed to get container status \"98f00b54cfbeff82cef36ec657ceeb13bbf8699437f9a94cd8c37b4cef16f87c\": rpc error: code = NotFound desc = could not find container \"98f00b54cfbeff82cef36ec657ceeb13bbf8699437f9a94cd8c37b4cef16f87c\": container with ID starting with 98f00b54cfbeff82cef36ec657ceeb13bbf8699437f9a94cd8c37b4cef16f87c not found: ID does not exist" Mar 20 15:43:38 crc kubenswrapper[4779]: I0320 15:43:38.622468 4779 scope.go:117] "RemoveContainer" containerID="07005395d72f51c5edf56fa4bc257c7d98e5fddb1e60330dcc032bc177b8b97d" Mar 20 15:43:38 crc kubenswrapper[4779]: E0320 15:43:38.622772 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07005395d72f51c5edf56fa4bc257c7d98e5fddb1e60330dcc032bc177b8b97d\": container with ID starting with 07005395d72f51c5edf56fa4bc257c7d98e5fddb1e60330dcc032bc177b8b97d not found: ID does not exist" 
containerID="07005395d72f51c5edf56fa4bc257c7d98e5fddb1e60330dcc032bc177b8b97d" Mar 20 15:43:38 crc kubenswrapper[4779]: I0320 15:43:38.622796 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07005395d72f51c5edf56fa4bc257c7d98e5fddb1e60330dcc032bc177b8b97d"} err="failed to get container status \"07005395d72f51c5edf56fa4bc257c7d98e5fddb1e60330dcc032bc177b8b97d\": rpc error: code = NotFound desc = could not find container \"07005395d72f51c5edf56fa4bc257c7d98e5fddb1e60330dcc032bc177b8b97d\": container with ID starting with 07005395d72f51c5edf56fa4bc257c7d98e5fddb1e60330dcc032bc177b8b97d not found: ID does not exist" Mar 20 15:43:39 crc kubenswrapper[4779]: I0320 15:43:39.477287 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3016debc-9603-417f-8ff1-6fd3934cd17e","Type":"ContainerStarted","Data":"bc07a7145cf026d7ad7cfc471274cba8f63e2f29f37bd01537575ed3f1d605c5"} Mar 20 15:43:39 crc kubenswrapper[4779]: I0320 15:43:39.819452 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c73163f-7930-48aa-991a-e7e8910c069a" path="/var/lib/kubelet/pods/7c73163f-7930-48aa-991a-e7e8910c069a/volumes" Mar 20 15:43:40 crc kubenswrapper[4779]: I0320 15:43:40.486311 4779 generic.go:334] "Generic (PLEG): container finished" podID="2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3" containerID="9f8736124685a80b4858efcf657aa150fe1e8ae5cd8c9e420f5fe2f6e17c5145" exitCode=0 Mar 20 15:43:40 crc kubenswrapper[4779]: I0320 15:43:40.486459 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bvnhn" event={"ID":"2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3","Type":"ContainerDied","Data":"9f8736124685a80b4858efcf657aa150fe1e8ae5cd8c9e420f5fe2f6e17c5145"} Mar 20 15:43:40 crc kubenswrapper[4779]: I0320 15:43:40.489128 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"9b8436ea-e4a7-4a6b-a6a2-d9282bda9696","Type":"ContainerStarted","Data":"9e620b8c712727f6c18f5ff7ff349913b50a4b37b4837d44089d7aafc9ee35a9"} Mar 20 15:43:40 crc kubenswrapper[4779]: I0320 15:43:40.493034 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e93aeb14-abaa-4cff-81c0-82d579020dc6","Type":"ContainerStarted","Data":"8e4730bb63686853f3e300c75d90cb2b290843398e8244bebc62b19ff8802e60"} Mar 20 15:43:41 crc kubenswrapper[4779]: I0320 15:43:41.501685 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bvnhn" event={"ID":"2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3","Type":"ContainerStarted","Data":"769261bcdf70d95a285c3579d87a20289e261c78d32600c82c8756790ef77cea"} Mar 20 15:43:41 crc kubenswrapper[4779]: I0320 15:43:41.501956 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bvnhn" event={"ID":"2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3","Type":"ContainerStarted","Data":"62eadaae7e66f3ffd60f48b610166afb3d903c6405eb4d9cd3ef1d694391ccb9"} Mar 20 15:43:41 crc kubenswrapper[4779]: I0320 15:43:41.502504 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-bvnhn" Mar 20 15:43:41 crc kubenswrapper[4779]: I0320 15:43:41.502531 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-bvnhn" Mar 20 15:43:41 crc kubenswrapper[4779]: I0320 15:43:41.524173 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-bvnhn" podStartSLOduration=24.189982284 podStartE2EDuration="56.524155498s" podCreationTimestamp="2026-03-20 15:42:45 +0000 UTC" firstStartedPulling="2026-03-20 15:43:06.903758398 +0000 UTC m=+1203.866274198" lastFinishedPulling="2026-03-20 15:43:39.237931612 +0000 UTC m=+1236.200447412" observedRunningTime="2026-03-20 15:43:41.518969239 +0000 UTC m=+1238.481485039" 
watchObservedRunningTime="2026-03-20 15:43:41.524155498 +0000 UTC m=+1238.486671298" Mar 20 15:43:43 crc kubenswrapper[4779]: I0320 15:43:43.399158 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-9tpp7"] Mar 20 15:43:43 crc kubenswrapper[4779]: E0320 15:43:43.401370 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c73163f-7930-48aa-991a-e7e8910c069a" containerName="init" Mar 20 15:43:43 crc kubenswrapper[4779]: I0320 15:43:43.401384 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c73163f-7930-48aa-991a-e7e8910c069a" containerName="init" Mar 20 15:43:43 crc kubenswrapper[4779]: E0320 15:43:43.401422 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c73163f-7930-48aa-991a-e7e8910c069a" containerName="dnsmasq-dns" Mar 20 15:43:43 crc kubenswrapper[4779]: I0320 15:43:43.401428 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c73163f-7930-48aa-991a-e7e8910c069a" containerName="dnsmasq-dns" Mar 20 15:43:43 crc kubenswrapper[4779]: I0320 15:43:43.401604 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c73163f-7930-48aa-991a-e7e8910c069a" containerName="dnsmasq-dns" Mar 20 15:43:43 crc kubenswrapper[4779]: I0320 15:43:43.402428 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-9tpp7" Mar 20 15:43:43 crc kubenswrapper[4779]: I0320 15:43:43.412022 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-9tpp7"] Mar 20 15:43:43 crc kubenswrapper[4779]: I0320 15:43:43.531068 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"49a61cbd-2e62-4e6a-8201-e4d0761885a7","Type":"ContainerStarted","Data":"49ae739bd70117adbae916a5cab791713c3f3b366468b63a172e726189c84e44"} Mar 20 15:43:43 crc kubenswrapper[4779]: I0320 15:43:43.531490 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 20 15:43:43 crc kubenswrapper[4779]: I0320 15:43:43.532825 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"0bb35660-3a34-4c16-a943-4375cfe12246","Type":"ContainerStarted","Data":"47b023ea2ffa97e2bc514c0f24943137e162043d6de3e0de4e1f4176dae5aafb"} Mar 20 15:43:43 crc kubenswrapper[4779]: I0320 15:43:43.539598 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2295b246-976d-4a92-8d85-ee69b7b056ee-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-9tpp7\" (UID: \"2295b246-976d-4a92-8d85-ee69b7b056ee\") " pod="openstack/dnsmasq-dns-7cb5889db5-9tpp7" Mar 20 15:43:43 crc kubenswrapper[4779]: I0320 15:43:43.539652 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z5nf\" (UniqueName: \"kubernetes.io/projected/2295b246-976d-4a92-8d85-ee69b7b056ee-kube-api-access-7z5nf\") pod \"dnsmasq-dns-7cb5889db5-9tpp7\" (UID: \"2295b246-976d-4a92-8d85-ee69b7b056ee\") " pod="openstack/dnsmasq-dns-7cb5889db5-9tpp7" Mar 20 15:43:43 crc kubenswrapper[4779]: I0320 15:43:43.539694 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/2295b246-976d-4a92-8d85-ee69b7b056ee-config\") pod \"dnsmasq-dns-7cb5889db5-9tpp7\" (UID: \"2295b246-976d-4a92-8d85-ee69b7b056ee\") " pod="openstack/dnsmasq-dns-7cb5889db5-9tpp7" Mar 20 15:43:43 crc kubenswrapper[4779]: I0320 15:43:43.553334 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=24.936917256 podStartE2EDuration="1m1.553318397s" podCreationTimestamp="2026-03-20 15:42:42 +0000 UTC" firstStartedPulling="2026-03-20 15:43:06.581097007 +0000 UTC m=+1203.543612807" lastFinishedPulling="2026-03-20 15:43:43.197498148 +0000 UTC m=+1240.160013948" observedRunningTime="2026-03-20 15:43:43.548696532 +0000 UTC m=+1240.511212332" watchObservedRunningTime="2026-03-20 15:43:43.553318397 +0000 UTC m=+1240.515834197" Mar 20 15:43:43 crc kubenswrapper[4779]: I0320 15:43:43.641213 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2295b246-976d-4a92-8d85-ee69b7b056ee-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-9tpp7\" (UID: \"2295b246-976d-4a92-8d85-ee69b7b056ee\") " pod="openstack/dnsmasq-dns-7cb5889db5-9tpp7" Mar 20 15:43:43 crc kubenswrapper[4779]: I0320 15:43:43.641276 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z5nf\" (UniqueName: \"kubernetes.io/projected/2295b246-976d-4a92-8d85-ee69b7b056ee-kube-api-access-7z5nf\") pod \"dnsmasq-dns-7cb5889db5-9tpp7\" (UID: \"2295b246-976d-4a92-8d85-ee69b7b056ee\") " pod="openstack/dnsmasq-dns-7cb5889db5-9tpp7" Mar 20 15:43:43 crc kubenswrapper[4779]: I0320 15:43:43.641333 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2295b246-976d-4a92-8d85-ee69b7b056ee-config\") pod \"dnsmasq-dns-7cb5889db5-9tpp7\" (UID: \"2295b246-976d-4a92-8d85-ee69b7b056ee\") " 
pod="openstack/dnsmasq-dns-7cb5889db5-9tpp7" Mar 20 15:43:43 crc kubenswrapper[4779]: I0320 15:43:43.642534 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2295b246-976d-4a92-8d85-ee69b7b056ee-config\") pod \"dnsmasq-dns-7cb5889db5-9tpp7\" (UID: \"2295b246-976d-4a92-8d85-ee69b7b056ee\") " pod="openstack/dnsmasq-dns-7cb5889db5-9tpp7" Mar 20 15:43:43 crc kubenswrapper[4779]: I0320 15:43:43.642690 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2295b246-976d-4a92-8d85-ee69b7b056ee-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-9tpp7\" (UID: \"2295b246-976d-4a92-8d85-ee69b7b056ee\") " pod="openstack/dnsmasq-dns-7cb5889db5-9tpp7" Mar 20 15:43:43 crc kubenswrapper[4779]: I0320 15:43:43.682922 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z5nf\" (UniqueName: \"kubernetes.io/projected/2295b246-976d-4a92-8d85-ee69b7b056ee-kube-api-access-7z5nf\") pod \"dnsmasq-dns-7cb5889db5-9tpp7\" (UID: \"2295b246-976d-4a92-8d85-ee69b7b056ee\") " pod="openstack/dnsmasq-dns-7cb5889db5-9tpp7" Mar 20 15:43:43 crc kubenswrapper[4779]: I0320 15:43:43.804746 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-9tpp7" Mar 20 15:43:44 crc kubenswrapper[4779]: I0320 15:43:44.353827 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-9tpp7"] Mar 20 15:43:44 crc kubenswrapper[4779]: I0320 15:43:44.542327 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-9tpp7" event={"ID":"2295b246-976d-4a92-8d85-ee69b7b056ee","Type":"ContainerStarted","Data":"1575b615123f3c7a85629e1da4161319d0858ec1734ed5c03bdfa5c546393cc5"} Mar 20 15:43:44 crc kubenswrapper[4779]: I0320 15:43:44.544049 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"baa96376-1415-4a97-853e-cde55a1d6860","Type":"ContainerStarted","Data":"68257526a63f1e2546c399863c9d0b45cdc150f7140f094ce6e72cbf415b0756"} Mar 20 15:43:44 crc kubenswrapper[4779]: I0320 15:43:44.545860 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"09f74271-1a65-4b70-a927-a4dd7de65360","Type":"ContainerStarted","Data":"22b5663b386478e788929f7fc226fb0bdfdf7490590a33c460f19590c7059c77"} Mar 20 15:43:44 crc kubenswrapper[4779]: I0320 15:43:44.547692 4779 generic.go:334] "Generic (PLEG): container finished" podID="9b8436ea-e4a7-4a6b-a6a2-d9282bda9696" containerID="9e620b8c712727f6c18f5ff7ff349913b50a4b37b4837d44089d7aafc9ee35a9" exitCode=0 Mar 20 15:43:44 crc kubenswrapper[4779]: I0320 15:43:44.547811 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9b8436ea-e4a7-4a6b-a6a2-d9282bda9696","Type":"ContainerDied","Data":"9e620b8c712727f6c18f5ff7ff349913b50a4b37b4837d44089d7aafc9ee35a9"} Mar 20 15:43:44 crc kubenswrapper[4779]: I0320 15:43:44.566627 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=18.999060334 podStartE2EDuration="55.56660588s" podCreationTimestamp="2026-03-20 15:42:49 +0000 
UTC" firstStartedPulling="2026-03-20 15:43:06.958614356 +0000 UTC m=+1203.921130156" lastFinishedPulling="2026-03-20 15:43:43.526159902 +0000 UTC m=+1240.488675702" observedRunningTime="2026-03-20 15:43:44.56582006 +0000 UTC m=+1241.528335860" watchObservedRunningTime="2026-03-20 15:43:44.56660588 +0000 UTC m=+1241.529121680" Mar 20 15:43:44 crc kubenswrapper[4779]: I0320 15:43:44.592943 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=22.98902237 podStartE2EDuration="59.592920612s" podCreationTimestamp="2026-03-20 15:42:45 +0000 UTC" firstStartedPulling="2026-03-20 15:43:06.747447158 +0000 UTC m=+1203.709962958" lastFinishedPulling="2026-03-20 15:43:43.3513454 +0000 UTC m=+1240.313861200" observedRunningTime="2026-03-20 15:43:44.58393861 +0000 UTC m=+1241.546454420" watchObservedRunningTime="2026-03-20 15:43:44.592920612 +0000 UTC m=+1241.555436412" Mar 20 15:43:44 crc kubenswrapper[4779]: I0320 15:43:44.610635 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 20 15:43:44 crc kubenswrapper[4779]: I0320 15:43:44.626295 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 20 15:43:44 crc kubenswrapper[4779]: I0320 15:43:44.628429 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 20 15:43:44 crc kubenswrapper[4779]: I0320 15:43:44.629933 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 20 15:43:44 crc kubenswrapper[4779]: I0320 15:43:44.632707 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 20 15:43:44 crc kubenswrapper[4779]: I0320 15:43:44.633021 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 20 15:43:44 crc kubenswrapper[4779]: I0320 15:43:44.633257 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-2468r" Mar 20 15:43:44 crc kubenswrapper[4779]: I0320 15:43:44.650467 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 20 15:43:44 crc kubenswrapper[4779]: I0320 15:43:44.759327 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6559\" (UniqueName: \"kubernetes.io/projected/c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b-kube-api-access-c6559\") pod \"swift-storage-0\" (UID: \"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b\") " pod="openstack/swift-storage-0" Mar 20 15:43:44 crc kubenswrapper[4779]: I0320 15:43:44.759803 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b\") " pod="openstack/swift-storage-0" Mar 20 15:43:44 crc kubenswrapper[4779]: I0320 15:43:44.759878 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b\") " pod="openstack/swift-storage-0" Mar 20 15:43:44 crc kubenswrapper[4779]: I0320 15:43:44.760037 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b-etc-swift\") pod \"swift-storage-0\" (UID: \"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b\") " pod="openstack/swift-storage-0" Mar 20 15:43:44 crc kubenswrapper[4779]: I0320 15:43:44.760121 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b-lock\") pod \"swift-storage-0\" (UID: \"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b\") " pod="openstack/swift-storage-0" Mar 20 15:43:44 crc kubenswrapper[4779]: I0320 15:43:44.760161 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b-cache\") pod \"swift-storage-0\" (UID: \"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b\") " pod="openstack/swift-storage-0" Mar 20 15:43:44 crc kubenswrapper[4779]: I0320 15:43:44.862639 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b\") " pod="openstack/swift-storage-0" Mar 20 15:43:44 crc kubenswrapper[4779]: I0320 15:43:44.862774 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b-etc-swift\") pod \"swift-storage-0\" (UID: \"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b\") " 
pod="openstack/swift-storage-0" Mar 20 15:43:44 crc kubenswrapper[4779]: I0320 15:43:44.862832 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b-lock\") pod \"swift-storage-0\" (UID: \"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b\") " pod="openstack/swift-storage-0" Mar 20 15:43:44 crc kubenswrapper[4779]: I0320 15:43:44.862868 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b-cache\") pod \"swift-storage-0\" (UID: \"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b\") " pod="openstack/swift-storage-0" Mar 20 15:43:44 crc kubenswrapper[4779]: I0320 15:43:44.862983 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6559\" (UniqueName: \"kubernetes.io/projected/c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b-kube-api-access-c6559\") pod \"swift-storage-0\" (UID: \"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b\") " pod="openstack/swift-storage-0" Mar 20 15:43:44 crc kubenswrapper[4779]: I0320 15:43:44.863075 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b\") " pod="openstack/swift-storage-0" Mar 20 15:43:44 crc kubenswrapper[4779]: E0320 15:43:44.863384 4779 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 15:43:44 crc kubenswrapper[4779]: E0320 15:43:44.863410 4779 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 15:43:44 crc kubenswrapper[4779]: E0320 15:43:44.863463 4779 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b-etc-swift podName:c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b nodeName:}" failed. No retries permitted until 2026-03-20 15:43:45.363441444 +0000 UTC m=+1242.325957244 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b-etc-swift") pod "swift-storage-0" (UID: "c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b") : configmap "swift-ring-files" not found Mar 20 15:43:44 crc kubenswrapper[4779]: I0320 15:43:44.863641 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b-cache\") pod \"swift-storage-0\" (UID: \"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b\") " pod="openstack/swift-storage-0" Mar 20 15:43:44 crc kubenswrapper[4779]: I0320 15:43:44.863869 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b-lock\") pod \"swift-storage-0\" (UID: \"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b\") " pod="openstack/swift-storage-0" Mar 20 15:43:44 crc kubenswrapper[4779]: I0320 15:43:44.864438 4779 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/swift-storage-0" Mar 20 15:43:44 crc kubenswrapper[4779]: I0320 15:43:44.869977 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b\") " pod="openstack/swift-storage-0" Mar 20 15:43:44 crc kubenswrapper[4779]: I0320 15:43:44.879060 4779 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6559\" (UniqueName: \"kubernetes.io/projected/c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b-kube-api-access-c6559\") pod \"swift-storage-0\" (UID: \"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b\") " pod="openstack/swift-storage-0" Mar 20 15:43:44 crc kubenswrapper[4779]: I0320 15:43:44.883817 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b\") " pod="openstack/swift-storage-0" Mar 20 15:43:45 crc kubenswrapper[4779]: I0320 15:43:45.273849 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-s9ls7"] Mar 20 15:43:45 crc kubenswrapper[4779]: I0320 15:43:45.276240 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-s9ls7" Mar 20 15:43:45 crc kubenswrapper[4779]: I0320 15:43:45.280464 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 20 15:43:45 crc kubenswrapper[4779]: I0320 15:43:45.280551 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 20 15:43:45 crc kubenswrapper[4779]: I0320 15:43:45.280641 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 20 15:43:45 crc kubenswrapper[4779]: I0320 15:43:45.310800 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-s9ls7"] Mar 20 15:43:45 crc kubenswrapper[4779]: I0320 15:43:45.369795 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b-etc-swift\") pod \"swift-storage-0\" (UID: \"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b\") " pod="openstack/swift-storage-0" Mar 20 
15:43:45 crc kubenswrapper[4779]: I0320 15:43:45.369874 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8a65e85e-076d-42e4-88fb-5bb905893173-etc-swift\") pod \"swift-ring-rebalance-s9ls7\" (UID: \"8a65e85e-076d-42e4-88fb-5bb905893173\") " pod="openstack/swift-ring-rebalance-s9ls7" Mar 20 15:43:45 crc kubenswrapper[4779]: I0320 15:43:45.369896 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8a65e85e-076d-42e4-88fb-5bb905893173-ring-data-devices\") pod \"swift-ring-rebalance-s9ls7\" (UID: \"8a65e85e-076d-42e4-88fb-5bb905893173\") " pod="openstack/swift-ring-rebalance-s9ls7" Mar 20 15:43:45 crc kubenswrapper[4779]: I0320 15:43:45.369915 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnpgs\" (UniqueName: \"kubernetes.io/projected/8a65e85e-076d-42e4-88fb-5bb905893173-kube-api-access-tnpgs\") pod \"swift-ring-rebalance-s9ls7\" (UID: \"8a65e85e-076d-42e4-88fb-5bb905893173\") " pod="openstack/swift-ring-rebalance-s9ls7" Mar 20 15:43:45 crc kubenswrapper[4779]: I0320 15:43:45.369946 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a65e85e-076d-42e4-88fb-5bb905893173-scripts\") pod \"swift-ring-rebalance-s9ls7\" (UID: \"8a65e85e-076d-42e4-88fb-5bb905893173\") " pod="openstack/swift-ring-rebalance-s9ls7" Mar 20 15:43:45 crc kubenswrapper[4779]: I0320 15:43:45.369993 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a65e85e-076d-42e4-88fb-5bb905893173-combined-ca-bundle\") pod \"swift-ring-rebalance-s9ls7\" (UID: \"8a65e85e-076d-42e4-88fb-5bb905893173\") " 
pod="openstack/swift-ring-rebalance-s9ls7" Mar 20 15:43:45 crc kubenswrapper[4779]: I0320 15:43:45.370010 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8a65e85e-076d-42e4-88fb-5bb905893173-swiftconf\") pod \"swift-ring-rebalance-s9ls7\" (UID: \"8a65e85e-076d-42e4-88fb-5bb905893173\") " pod="openstack/swift-ring-rebalance-s9ls7" Mar 20 15:43:45 crc kubenswrapper[4779]: I0320 15:43:45.370051 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8a65e85e-076d-42e4-88fb-5bb905893173-dispersionconf\") pod \"swift-ring-rebalance-s9ls7\" (UID: \"8a65e85e-076d-42e4-88fb-5bb905893173\") " pod="openstack/swift-ring-rebalance-s9ls7" Mar 20 15:43:45 crc kubenswrapper[4779]: E0320 15:43:45.370245 4779 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 15:43:45 crc kubenswrapper[4779]: E0320 15:43:45.370263 4779 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 15:43:45 crc kubenswrapper[4779]: E0320 15:43:45.370307 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b-etc-swift podName:c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b nodeName:}" failed. No retries permitted until 2026-03-20 15:43:46.370290995 +0000 UTC m=+1243.332806795 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b-etc-swift") pod "swift-storage-0" (UID: "c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b") : configmap "swift-ring-files" not found Mar 20 15:43:45 crc kubenswrapper[4779]: I0320 15:43:45.471410 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a65e85e-076d-42e4-88fb-5bb905893173-combined-ca-bundle\") pod \"swift-ring-rebalance-s9ls7\" (UID: \"8a65e85e-076d-42e4-88fb-5bb905893173\") " pod="openstack/swift-ring-rebalance-s9ls7" Mar 20 15:43:45 crc kubenswrapper[4779]: I0320 15:43:45.471460 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8a65e85e-076d-42e4-88fb-5bb905893173-swiftconf\") pod \"swift-ring-rebalance-s9ls7\" (UID: \"8a65e85e-076d-42e4-88fb-5bb905893173\") " pod="openstack/swift-ring-rebalance-s9ls7" Mar 20 15:43:45 crc kubenswrapper[4779]: I0320 15:43:45.471510 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8a65e85e-076d-42e4-88fb-5bb905893173-dispersionconf\") pod \"swift-ring-rebalance-s9ls7\" (UID: \"8a65e85e-076d-42e4-88fb-5bb905893173\") " pod="openstack/swift-ring-rebalance-s9ls7" Mar 20 15:43:45 crc kubenswrapper[4779]: I0320 15:43:45.472084 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8a65e85e-076d-42e4-88fb-5bb905893173-etc-swift\") pod \"swift-ring-rebalance-s9ls7\" (UID: \"8a65e85e-076d-42e4-88fb-5bb905893173\") " pod="openstack/swift-ring-rebalance-s9ls7" Mar 20 15:43:45 crc kubenswrapper[4779]: I0320 15:43:45.471605 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/8a65e85e-076d-42e4-88fb-5bb905893173-etc-swift\") pod \"swift-ring-rebalance-s9ls7\" (UID: \"8a65e85e-076d-42e4-88fb-5bb905893173\") " pod="openstack/swift-ring-rebalance-s9ls7" Mar 20 15:43:45 crc kubenswrapper[4779]: I0320 15:43:45.472390 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8a65e85e-076d-42e4-88fb-5bb905893173-ring-data-devices\") pod \"swift-ring-rebalance-s9ls7\" (UID: \"8a65e85e-076d-42e4-88fb-5bb905893173\") " pod="openstack/swift-ring-rebalance-s9ls7" Mar 20 15:43:45 crc kubenswrapper[4779]: I0320 15:43:45.472424 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnpgs\" (UniqueName: \"kubernetes.io/projected/8a65e85e-076d-42e4-88fb-5bb905893173-kube-api-access-tnpgs\") pod \"swift-ring-rebalance-s9ls7\" (UID: \"8a65e85e-076d-42e4-88fb-5bb905893173\") " pod="openstack/swift-ring-rebalance-s9ls7" Mar 20 15:43:45 crc kubenswrapper[4779]: I0320 15:43:45.472457 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a65e85e-076d-42e4-88fb-5bb905893173-scripts\") pod \"swift-ring-rebalance-s9ls7\" (UID: \"8a65e85e-076d-42e4-88fb-5bb905893173\") " pod="openstack/swift-ring-rebalance-s9ls7" Mar 20 15:43:45 crc kubenswrapper[4779]: I0320 15:43:45.473103 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a65e85e-076d-42e4-88fb-5bb905893173-scripts\") pod \"swift-ring-rebalance-s9ls7\" (UID: \"8a65e85e-076d-42e4-88fb-5bb905893173\") " pod="openstack/swift-ring-rebalance-s9ls7" Mar 20 15:43:45 crc kubenswrapper[4779]: I0320 15:43:45.473169 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8a65e85e-076d-42e4-88fb-5bb905893173-ring-data-devices\") pod 
\"swift-ring-rebalance-s9ls7\" (UID: \"8a65e85e-076d-42e4-88fb-5bb905893173\") " pod="openstack/swift-ring-rebalance-s9ls7" Mar 20 15:43:45 crc kubenswrapper[4779]: I0320 15:43:45.476994 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8a65e85e-076d-42e4-88fb-5bb905893173-dispersionconf\") pod \"swift-ring-rebalance-s9ls7\" (UID: \"8a65e85e-076d-42e4-88fb-5bb905893173\") " pod="openstack/swift-ring-rebalance-s9ls7" Mar 20 15:43:45 crc kubenswrapper[4779]: I0320 15:43:45.477141 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a65e85e-076d-42e4-88fb-5bb905893173-combined-ca-bundle\") pod \"swift-ring-rebalance-s9ls7\" (UID: \"8a65e85e-076d-42e4-88fb-5bb905893173\") " pod="openstack/swift-ring-rebalance-s9ls7" Mar 20 15:43:45 crc kubenswrapper[4779]: I0320 15:43:45.489427 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnpgs\" (UniqueName: \"kubernetes.io/projected/8a65e85e-076d-42e4-88fb-5bb905893173-kube-api-access-tnpgs\") pod \"swift-ring-rebalance-s9ls7\" (UID: \"8a65e85e-076d-42e4-88fb-5bb905893173\") " pod="openstack/swift-ring-rebalance-s9ls7" Mar 20 15:43:45 crc kubenswrapper[4779]: I0320 15:43:45.489877 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8a65e85e-076d-42e4-88fb-5bb905893173-swiftconf\") pod \"swift-ring-rebalance-s9ls7\" (UID: \"8a65e85e-076d-42e4-88fb-5bb905893173\") " pod="openstack/swift-ring-rebalance-s9ls7" Mar 20 15:43:45 crc kubenswrapper[4779]: I0320 15:43:45.561021 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9b8436ea-e4a7-4a6b-a6a2-d9282bda9696","Type":"ContainerStarted","Data":"3f64242ba66f289d215f0071bd971f877a76e44dda4bfe61e36af84f6fdc3641"} Mar 20 15:43:45 crc kubenswrapper[4779]: I0320 
15:43:45.562880 4779 generic.go:334] "Generic (PLEG): container finished" podID="2295b246-976d-4a92-8d85-ee69b7b056ee" containerID="39807b39f5af4bc7c6b3366e08985a5c54e4cde6de7ce4a5aa21120941259782" exitCode=0 Mar 20 15:43:45 crc kubenswrapper[4779]: I0320 15:43:45.562919 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-9tpp7" event={"ID":"2295b246-976d-4a92-8d85-ee69b7b056ee","Type":"ContainerDied","Data":"39807b39f5af4bc7c6b3366e08985a5c54e4cde6de7ce4a5aa21120941259782"} Mar 20 15:43:45 crc kubenswrapper[4779]: I0320 15:43:45.593858 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=35.59825299 podStartE2EDuration="1m8.593837878s" podCreationTimestamp="2026-03-20 15:42:37 +0000 UTC" firstStartedPulling="2026-03-20 15:43:06.508376143 +0000 UTC m=+1203.470891943" lastFinishedPulling="2026-03-20 15:43:39.503961031 +0000 UTC m=+1236.466476831" observedRunningTime="2026-03-20 15:43:45.585533201 +0000 UTC m=+1242.548049011" watchObservedRunningTime="2026-03-20 15:43:45.593837878 +0000 UTC m=+1242.556353678" Mar 20 15:43:45 crc kubenswrapper[4779]: I0320 15:43:45.606431 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-s9ls7" Mar 20 15:43:45 crc kubenswrapper[4779]: I0320 15:43:45.649348 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 20 15:43:46 crc kubenswrapper[4779]: I0320 15:43:46.152584 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-s9ls7"] Mar 20 15:43:46 crc kubenswrapper[4779]: I0320 15:43:46.387915 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b-etc-swift\") pod \"swift-storage-0\" (UID: \"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b\") " pod="openstack/swift-storage-0" Mar 20 15:43:46 crc kubenswrapper[4779]: E0320 15:43:46.388222 4779 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 15:43:46 crc kubenswrapper[4779]: E0320 15:43:46.388256 4779 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 15:43:46 crc kubenswrapper[4779]: E0320 15:43:46.388312 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b-etc-swift podName:c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b nodeName:}" failed. No retries permitted until 2026-03-20 15:43:48.388294264 +0000 UTC m=+1245.350810064 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b-etc-swift") pod "swift-storage-0" (UID: "c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b") : configmap "swift-ring-files" not found Mar 20 15:43:46 crc kubenswrapper[4779]: I0320 15:43:46.573541 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-s9ls7" event={"ID":"8a65e85e-076d-42e4-88fb-5bb905893173","Type":"ContainerStarted","Data":"a280d64c532934498633023b26219fe3e33d969b26c26a39f1b148c5cac6f209"} Mar 20 15:43:46 crc kubenswrapper[4779]: I0320 15:43:46.573993 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 20 15:43:46 crc kubenswrapper[4779]: I0320 15:43:46.575145 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 20 15:43:46 crc kubenswrapper[4779]: I0320 15:43:46.578397 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-9tpp7" event={"ID":"2295b246-976d-4a92-8d85-ee69b7b056ee","Type":"ContainerStarted","Data":"459d30523f9e7b2d2559442965aa2b70f3b5ca5c2bce95daf926ddfd5ff94ce2"} Mar 20 15:43:46 crc kubenswrapper[4779]: I0320 15:43:46.578566 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7cb5889db5-9tpp7" Mar 20 15:43:46 crc kubenswrapper[4779]: I0320 15:43:46.580649 4779 generic.go:334] "Generic (PLEG): container finished" podID="e93aeb14-abaa-4cff-81c0-82d579020dc6" containerID="8e4730bb63686853f3e300c75d90cb2b290843398e8244bebc62b19ff8802e60" exitCode=0 Mar 20 15:43:46 crc kubenswrapper[4779]: I0320 15:43:46.580729 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e93aeb14-abaa-4cff-81c0-82d579020dc6","Type":"ContainerDied","Data":"8e4730bb63686853f3e300c75d90cb2b290843398e8244bebc62b19ff8802e60"} Mar 20 15:43:46 crc 
kubenswrapper[4779]: I0320 15:43:46.585272 4779 generic.go:334] "Generic (PLEG): container finished" podID="0bb35660-3a34-4c16-a943-4375cfe12246" containerID="47b023ea2ffa97e2bc514c0f24943137e162043d6de3e0de4e1f4176dae5aafb" exitCode=0 Mar 20 15:43:46 crc kubenswrapper[4779]: I0320 15:43:46.585517 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"0bb35660-3a34-4c16-a943-4375cfe12246","Type":"ContainerDied","Data":"47b023ea2ffa97e2bc514c0f24943137e162043d6de3e0de4e1f4176dae5aafb"} Mar 20 15:43:46 crc kubenswrapper[4779]: I0320 15:43:46.597402 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7cb5889db5-9tpp7" podStartSLOduration=3.597378138 podStartE2EDuration="3.597378138s" podCreationTimestamp="2026-03-20 15:43:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:43:46.593600014 +0000 UTC m=+1243.556115814" watchObservedRunningTime="2026-03-20 15:43:46.597378138 +0000 UTC m=+1243.559893938" Mar 20 15:43:46 crc kubenswrapper[4779]: I0320 15:43:46.631645 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 20 15:43:47 crc kubenswrapper[4779]: I0320 15:43:47.596014 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"0bb35660-3a34-4c16-a943-4375cfe12246","Type":"ContainerStarted","Data":"0cf2ec2b820427f5b7d958cc86c7827b404d97ac3c491c0107ede4038a9d0434"} Mar 20 15:43:47 crc kubenswrapper[4779]: I0320 15:43:47.693479 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 20 15:43:47 crc kubenswrapper[4779]: I0320 15:43:47.722286 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=-9223371968.13251 
podStartE2EDuration="1m8.722266263s" podCreationTimestamp="2026-03-20 15:42:39 +0000 UTC" firstStartedPulling="2026-03-20 15:43:06.573400025 +0000 UTC m=+1203.535915825" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:43:47.624788072 +0000 UTC m=+1244.587303872" watchObservedRunningTime="2026-03-20 15:43:47.722266263 +0000 UTC m=+1244.684782063" Mar 20 15:43:48 crc kubenswrapper[4779]: I0320 15:43:48.443034 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b-etc-swift\") pod \"swift-storage-0\" (UID: \"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b\") " pod="openstack/swift-storage-0" Mar 20 15:43:48 crc kubenswrapper[4779]: E0320 15:43:48.443240 4779 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 15:43:48 crc kubenswrapper[4779]: E0320 15:43:48.443256 4779 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 15:43:48 crc kubenswrapper[4779]: E0320 15:43:48.443302 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b-etc-swift podName:c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b nodeName:}" failed. No retries permitted until 2026-03-20 15:43:52.443289305 +0000 UTC m=+1249.405805105 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b-etc-swift") pod "swift-storage-0" (UID: "c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b") : configmap "swift-ring-files" not found Mar 20 15:43:48 crc kubenswrapper[4779]: I0320 15:43:48.653822 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 20 15:43:48 crc kubenswrapper[4779]: I0320 15:43:48.897022 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-9tpp7"] Mar 20 15:43:48 crc kubenswrapper[4779]: I0320 15:43:48.897250 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7cb5889db5-9tpp7" podUID="2295b246-976d-4a92-8d85-ee69b7b056ee" containerName="dnsmasq-dns" containerID="cri-o://459d30523f9e7b2d2559442965aa2b70f3b5ca5c2bce95daf926ddfd5ff94ce2" gracePeriod=10 Mar 20 15:43:48 crc kubenswrapper[4779]: I0320 15:43:48.924069 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-ms68f"] Mar 20 15:43:48 crc kubenswrapper[4779]: I0320 15:43:48.934454 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-ms68f" Mar 20 15:43:48 crc kubenswrapper[4779]: I0320 15:43:48.936088 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-ms68f"] Mar 20 15:43:48 crc kubenswrapper[4779]: I0320 15:43:48.941355 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.014086 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-q6sf7"] Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.017432 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-q6sf7" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.020102 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.049512 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-q6sf7"] Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.054504 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80bdbc60-ed55-43ce-85bc-9b5605395573-dns-svc\") pod \"dnsmasq-dns-74f6f696b9-ms68f\" (UID: \"80bdbc60-ed55-43ce-85bc-9b5605395573\") " pod="openstack/dnsmasq-dns-74f6f696b9-ms68f" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.054590 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9jj9\" (UniqueName: \"kubernetes.io/projected/80bdbc60-ed55-43ce-85bc-9b5605395573-kube-api-access-q9jj9\") pod \"dnsmasq-dns-74f6f696b9-ms68f\" (UID: \"80bdbc60-ed55-43ce-85bc-9b5605395573\") " pod="openstack/dnsmasq-dns-74f6f696b9-ms68f" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.054623 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/80bdbc60-ed55-43ce-85bc-9b5605395573-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6f696b9-ms68f\" (UID: \"80bdbc60-ed55-43ce-85bc-9b5605395573\") " pod="openstack/dnsmasq-dns-74f6f696b9-ms68f" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.054659 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80bdbc60-ed55-43ce-85bc-9b5605395573-config\") pod \"dnsmasq-dns-74f6f696b9-ms68f\" (UID: \"80bdbc60-ed55-43ce-85bc-9b5605395573\") " 
pod="openstack/dnsmasq-dns-74f6f696b9-ms68f" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.155890 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e-ovs-rundir\") pod \"ovn-controller-metrics-q6sf7\" (UID: \"f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e\") " pod="openstack/ovn-controller-metrics-q6sf7" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.155955 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llwvp\" (UniqueName: \"kubernetes.io/projected/f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e-kube-api-access-llwvp\") pod \"ovn-controller-metrics-q6sf7\" (UID: \"f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e\") " pod="openstack/ovn-controller-metrics-q6sf7" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.156007 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e-config\") pod \"ovn-controller-metrics-q6sf7\" (UID: \"f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e\") " pod="openstack/ovn-controller-metrics-q6sf7" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.156074 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80bdbc60-ed55-43ce-85bc-9b5605395573-dns-svc\") pod \"dnsmasq-dns-74f6f696b9-ms68f\" (UID: \"80bdbc60-ed55-43ce-85bc-9b5605395573\") " pod="openstack/dnsmasq-dns-74f6f696b9-ms68f" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.156125 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e-combined-ca-bundle\") pod \"ovn-controller-metrics-q6sf7\" (UID: \"f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e\") 
" pod="openstack/ovn-controller-metrics-q6sf7" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.156180 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9jj9\" (UniqueName: \"kubernetes.io/projected/80bdbc60-ed55-43ce-85bc-9b5605395573-kube-api-access-q9jj9\") pod \"dnsmasq-dns-74f6f696b9-ms68f\" (UID: \"80bdbc60-ed55-43ce-85bc-9b5605395573\") " pod="openstack/dnsmasq-dns-74f6f696b9-ms68f" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.156354 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/80bdbc60-ed55-43ce-85bc-9b5605395573-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6f696b9-ms68f\" (UID: \"80bdbc60-ed55-43ce-85bc-9b5605395573\") " pod="openstack/dnsmasq-dns-74f6f696b9-ms68f" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.156379 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e-ovn-rundir\") pod \"ovn-controller-metrics-q6sf7\" (UID: \"f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e\") " pod="openstack/ovn-controller-metrics-q6sf7" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.156410 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-q6sf7\" (UID: \"f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e\") " pod="openstack/ovn-controller-metrics-q6sf7" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.156434 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80bdbc60-ed55-43ce-85bc-9b5605395573-config\") pod \"dnsmasq-dns-74f6f696b9-ms68f\" (UID: \"80bdbc60-ed55-43ce-85bc-9b5605395573\") " 
pod="openstack/dnsmasq-dns-74f6f696b9-ms68f" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.158907 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80bdbc60-ed55-43ce-85bc-9b5605395573-dns-svc\") pod \"dnsmasq-dns-74f6f696b9-ms68f\" (UID: \"80bdbc60-ed55-43ce-85bc-9b5605395573\") " pod="openstack/dnsmasq-dns-74f6f696b9-ms68f" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.158939 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/80bdbc60-ed55-43ce-85bc-9b5605395573-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6f696b9-ms68f\" (UID: \"80bdbc60-ed55-43ce-85bc-9b5605395573\") " pod="openstack/dnsmasq-dns-74f6f696b9-ms68f" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.159311 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80bdbc60-ed55-43ce-85bc-9b5605395573-config\") pod \"dnsmasq-dns-74f6f696b9-ms68f\" (UID: \"80bdbc60-ed55-43ce-85bc-9b5605395573\") " pod="openstack/dnsmasq-dns-74f6f696b9-ms68f" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.186942 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9jj9\" (UniqueName: \"kubernetes.io/projected/80bdbc60-ed55-43ce-85bc-9b5605395573-kube-api-access-q9jj9\") pod \"dnsmasq-dns-74f6f696b9-ms68f\" (UID: \"80bdbc60-ed55-43ce-85bc-9b5605395573\") " pod="openstack/dnsmasq-dns-74f6f696b9-ms68f" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.255501 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-ms68f"] Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.256389 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-ms68f" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.257948 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e-ovs-rundir\") pod \"ovn-controller-metrics-q6sf7\" (UID: \"f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e\") " pod="openstack/ovn-controller-metrics-q6sf7" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.257999 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llwvp\" (UniqueName: \"kubernetes.io/projected/f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e-kube-api-access-llwvp\") pod \"ovn-controller-metrics-q6sf7\" (UID: \"f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e\") " pod="openstack/ovn-controller-metrics-q6sf7" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.258043 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e-config\") pod \"ovn-controller-metrics-q6sf7\" (UID: \"f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e\") " pod="openstack/ovn-controller-metrics-q6sf7" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.258098 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e-combined-ca-bundle\") pod \"ovn-controller-metrics-q6sf7\" (UID: \"f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e\") " pod="openstack/ovn-controller-metrics-q6sf7" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.258156 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e-ovn-rundir\") pod \"ovn-controller-metrics-q6sf7\" (UID: \"f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e\") " 
pod="openstack/ovn-controller-metrics-q6sf7" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.258179 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-q6sf7\" (UID: \"f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e\") " pod="openstack/ovn-controller-metrics-q6sf7" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.259780 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e-config\") pod \"ovn-controller-metrics-q6sf7\" (UID: \"f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e\") " pod="openstack/ovn-controller-metrics-q6sf7" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.260008 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e-ovs-rundir\") pod \"ovn-controller-metrics-q6sf7\" (UID: \"f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e\") " pod="openstack/ovn-controller-metrics-q6sf7" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.260383 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e-ovn-rundir\") pod \"ovn-controller-metrics-q6sf7\" (UID: \"f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e\") " pod="openstack/ovn-controller-metrics-q6sf7" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.265012 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-q6sf7\" (UID: \"f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e\") " pod="openstack/ovn-controller-metrics-q6sf7" Mar 20 15:43:49 crc kubenswrapper[4779]: 
I0320 15:43:49.273999 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e-combined-ca-bundle\") pod \"ovn-controller-metrics-q6sf7\" (UID: \"f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e\") " pod="openstack/ovn-controller-metrics-q6sf7" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.286509 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llwvp\" (UniqueName: \"kubernetes.io/projected/f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e-kube-api-access-llwvp\") pod \"ovn-controller-metrics-q6sf7\" (UID: \"f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e\") " pod="openstack/ovn-controller-metrics-q6sf7" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.292552 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-hmz4x"] Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.293934 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-hmz4x" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.305645 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.307521 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-hmz4x"] Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.337946 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.337981 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.346312 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-q6sf7" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.460753 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c14b0cd-bac7-4ca6-b176-1c49747e9132-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-hmz4x\" (UID: \"1c14b0cd-bac7-4ca6-b176-1c49747e9132\") " pod="openstack/dnsmasq-dns-698758b865-hmz4x" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.461204 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c14b0cd-bac7-4ca6-b176-1c49747e9132-config\") pod \"dnsmasq-dns-698758b865-hmz4x\" (UID: \"1c14b0cd-bac7-4ca6-b176-1c49747e9132\") " pod="openstack/dnsmasq-dns-698758b865-hmz4x" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.461227 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c14b0cd-bac7-4ca6-b176-1c49747e9132-dns-svc\") pod \"dnsmasq-dns-698758b865-hmz4x\" (UID: \"1c14b0cd-bac7-4ca6-b176-1c49747e9132\") " pod="openstack/dnsmasq-dns-698758b865-hmz4x" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.461244 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99j4g\" (UniqueName: \"kubernetes.io/projected/1c14b0cd-bac7-4ca6-b176-1c49747e9132-kube-api-access-99j4g\") pod \"dnsmasq-dns-698758b865-hmz4x\" (UID: \"1c14b0cd-bac7-4ca6-b176-1c49747e9132\") " pod="openstack/dnsmasq-dns-698758b865-hmz4x" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.461325 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c14b0cd-bac7-4ca6-b176-1c49747e9132-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-hmz4x\" 
(UID: \"1c14b0cd-bac7-4ca6-b176-1c49747e9132\") " pod="openstack/dnsmasq-dns-698758b865-hmz4x" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.563517 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c14b0cd-bac7-4ca6-b176-1c49747e9132-config\") pod \"dnsmasq-dns-698758b865-hmz4x\" (UID: \"1c14b0cd-bac7-4ca6-b176-1c49747e9132\") " pod="openstack/dnsmasq-dns-698758b865-hmz4x" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.563572 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c14b0cd-bac7-4ca6-b176-1c49747e9132-dns-svc\") pod \"dnsmasq-dns-698758b865-hmz4x\" (UID: \"1c14b0cd-bac7-4ca6-b176-1c49747e9132\") " pod="openstack/dnsmasq-dns-698758b865-hmz4x" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.563602 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99j4g\" (UniqueName: \"kubernetes.io/projected/1c14b0cd-bac7-4ca6-b176-1c49747e9132-kube-api-access-99j4g\") pod \"dnsmasq-dns-698758b865-hmz4x\" (UID: \"1c14b0cd-bac7-4ca6-b176-1c49747e9132\") " pod="openstack/dnsmasq-dns-698758b865-hmz4x" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.563656 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c14b0cd-bac7-4ca6-b176-1c49747e9132-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-hmz4x\" (UID: \"1c14b0cd-bac7-4ca6-b176-1c49747e9132\") " pod="openstack/dnsmasq-dns-698758b865-hmz4x" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.563736 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c14b0cd-bac7-4ca6-b176-1c49747e9132-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-hmz4x\" (UID: \"1c14b0cd-bac7-4ca6-b176-1c49747e9132\") " 
pod="openstack/dnsmasq-dns-698758b865-hmz4x" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.564909 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c14b0cd-bac7-4ca6-b176-1c49747e9132-dns-svc\") pod \"dnsmasq-dns-698758b865-hmz4x\" (UID: \"1c14b0cd-bac7-4ca6-b176-1c49747e9132\") " pod="openstack/dnsmasq-dns-698758b865-hmz4x" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.564916 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c14b0cd-bac7-4ca6-b176-1c49747e9132-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-hmz4x\" (UID: \"1c14b0cd-bac7-4ca6-b176-1c49747e9132\") " pod="openstack/dnsmasq-dns-698758b865-hmz4x" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.565066 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c14b0cd-bac7-4ca6-b176-1c49747e9132-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-hmz4x\" (UID: \"1c14b0cd-bac7-4ca6-b176-1c49747e9132\") " pod="openstack/dnsmasq-dns-698758b865-hmz4x" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.565222 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c14b0cd-bac7-4ca6-b176-1c49747e9132-config\") pod \"dnsmasq-dns-698758b865-hmz4x\" (UID: \"1c14b0cd-bac7-4ca6-b176-1c49747e9132\") " pod="openstack/dnsmasq-dns-698758b865-hmz4x" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.581922 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99j4g\" (UniqueName: \"kubernetes.io/projected/1c14b0cd-bac7-4ca6-b176-1c49747e9132-kube-api-access-99j4g\") pod \"dnsmasq-dns-698758b865-hmz4x\" (UID: \"1c14b0cd-bac7-4ca6-b176-1c49747e9132\") " pod="openstack/dnsmasq-dns-698758b865-hmz4x" Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.618720 
4779 generic.go:334] "Generic (PLEG): container finished" podID="2295b246-976d-4a92-8d85-ee69b7b056ee" containerID="459d30523f9e7b2d2559442965aa2b70f3b5ca5c2bce95daf926ddfd5ff94ce2" exitCode=0 Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.619699 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-9tpp7" event={"ID":"2295b246-976d-4a92-8d85-ee69b7b056ee","Type":"ContainerDied","Data":"459d30523f9e7b2d2559442965aa2b70f3b5ca5c2bce95daf926ddfd5ff94ce2"} Mar 20 15:43:49 crc kubenswrapper[4779]: I0320 15:43:49.654626 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-hmz4x" Mar 20 15:43:50 crc kubenswrapper[4779]: I0320 15:43:50.575884 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-9tpp7" Mar 20 15:43:50 crc kubenswrapper[4779]: I0320 15:43:50.651438 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-9tpp7" event={"ID":"2295b246-976d-4a92-8d85-ee69b7b056ee","Type":"ContainerDied","Data":"1575b615123f3c7a85629e1da4161319d0858ec1734ed5c03bdfa5c546393cc5"} Mar 20 15:43:50 crc kubenswrapper[4779]: I0320 15:43:50.651482 4779 scope.go:117] "RemoveContainer" containerID="459d30523f9e7b2d2559442965aa2b70f3b5ca5c2bce95daf926ddfd5ff94ce2" Mar 20 15:43:50 crc kubenswrapper[4779]: I0320 15:43:50.651560 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-9tpp7" Mar 20 15:43:50 crc kubenswrapper[4779]: I0320 15:43:50.695236 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 20 15:43:50 crc kubenswrapper[4779]: I0320 15:43:50.696550 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 20 15:43:50 crc kubenswrapper[4779]: I0320 15:43:50.699966 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7z5nf\" (UniqueName: \"kubernetes.io/projected/2295b246-976d-4a92-8d85-ee69b7b056ee-kube-api-access-7z5nf\") pod \"2295b246-976d-4a92-8d85-ee69b7b056ee\" (UID: \"2295b246-976d-4a92-8d85-ee69b7b056ee\") " Mar 20 15:43:50 crc kubenswrapper[4779]: I0320 15:43:50.700058 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2295b246-976d-4a92-8d85-ee69b7b056ee-dns-svc\") pod \"2295b246-976d-4a92-8d85-ee69b7b056ee\" (UID: \"2295b246-976d-4a92-8d85-ee69b7b056ee\") " Mar 20 15:43:50 crc kubenswrapper[4779]: I0320 15:43:50.700274 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2295b246-976d-4a92-8d85-ee69b7b056ee-config\") pod \"2295b246-976d-4a92-8d85-ee69b7b056ee\" (UID: \"2295b246-976d-4a92-8d85-ee69b7b056ee\") " Mar 20 15:43:50 crc kubenswrapper[4779]: I0320 15:43:50.707429 4779 scope.go:117] "RemoveContainer" containerID="39807b39f5af4bc7c6b3366e08985a5c54e4cde6de7ce4a5aa21120941259782" Mar 20 15:43:50 crc kubenswrapper[4779]: I0320 15:43:50.715061 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2295b246-976d-4a92-8d85-ee69b7b056ee-kube-api-access-7z5nf" (OuterVolumeSpecName: "kube-api-access-7z5nf") pod "2295b246-976d-4a92-8d85-ee69b7b056ee" (UID: 
"2295b246-976d-4a92-8d85-ee69b7b056ee"). InnerVolumeSpecName "kube-api-access-7z5nf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:43:50 crc kubenswrapper[4779]: I0320 15:43:50.724475 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 20 15:43:50 crc kubenswrapper[4779]: I0320 15:43:50.773495 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2295b246-976d-4a92-8d85-ee69b7b056ee-config" (OuterVolumeSpecName: "config") pod "2295b246-976d-4a92-8d85-ee69b7b056ee" (UID: "2295b246-976d-4a92-8d85-ee69b7b056ee"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:43:50 crc kubenswrapper[4779]: I0320 15:43:50.806433 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2295b246-976d-4a92-8d85-ee69b7b056ee-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:43:50 crc kubenswrapper[4779]: I0320 15:43:50.806769 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7z5nf\" (UniqueName: \"kubernetes.io/projected/2295b246-976d-4a92-8d85-ee69b7b056ee-kube-api-access-7z5nf\") on node \"crc\" DevicePath \"\"" Mar 20 15:43:50 crc kubenswrapper[4779]: I0320 15:43:50.830485 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2295b246-976d-4a92-8d85-ee69b7b056ee-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2295b246-976d-4a92-8d85-ee69b7b056ee" (UID: "2295b246-976d-4a92-8d85-ee69b7b056ee"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:43:50 crc kubenswrapper[4779]: I0320 15:43:50.891383 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 20 15:43:50 crc kubenswrapper[4779]: E0320 15:43:50.891865 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2295b246-976d-4a92-8d85-ee69b7b056ee" containerName="init" Mar 20 15:43:50 crc kubenswrapper[4779]: I0320 15:43:50.891888 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="2295b246-976d-4a92-8d85-ee69b7b056ee" containerName="init" Mar 20 15:43:50 crc kubenswrapper[4779]: E0320 15:43:50.891910 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2295b246-976d-4a92-8d85-ee69b7b056ee" containerName="dnsmasq-dns" Mar 20 15:43:50 crc kubenswrapper[4779]: I0320 15:43:50.891918 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="2295b246-976d-4a92-8d85-ee69b7b056ee" containerName="dnsmasq-dns" Mar 20 15:43:50 crc kubenswrapper[4779]: I0320 15:43:50.892161 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="2295b246-976d-4a92-8d85-ee69b7b056ee" containerName="dnsmasq-dns" Mar 20 15:43:50 crc kubenswrapper[4779]: I0320 15:43:50.894825 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 15:43:50 crc kubenswrapper[4779]: I0320 15:43:50.900061 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 20 15:43:50 crc kubenswrapper[4779]: I0320 15:43:50.900307 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 20 15:43:50 crc kubenswrapper[4779]: I0320 15:43:50.900341 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 20 15:43:50 crc kubenswrapper[4779]: I0320 15:43:50.900546 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-cgklz" Mar 20 15:43:50 crc kubenswrapper[4779]: I0320 15:43:50.908264 4779 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2295b246-976d-4a92-8d85-ee69b7b056ee-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 15:43:50 crc kubenswrapper[4779]: I0320 15:43:50.939303 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-q6sf7"] Mar 20 15:43:50 crc kubenswrapper[4779]: I0320 15:43:50.962180 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 15:43:51 crc kubenswrapper[4779]: I0320 15:43:51.008198 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-9tpp7"] Mar 20 15:43:51 crc kubenswrapper[4779]: I0320 15:43:51.009226 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4c00c5fd-27bb-4e67-bbfa-374e073d15df-scripts\") pod \"ovn-northd-0\" (UID: \"4c00c5fd-27bb-4e67-bbfa-374e073d15df\") " pod="openstack/ovn-northd-0" Mar 20 15:43:51 crc kubenswrapper[4779]: I0320 15:43:51.009329 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" 
(UniqueName: \"kubernetes.io/empty-dir/4c00c5fd-27bb-4e67-bbfa-374e073d15df-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"4c00c5fd-27bb-4e67-bbfa-374e073d15df\") " pod="openstack/ovn-northd-0" Mar 20 15:43:51 crc kubenswrapper[4779]: I0320 15:43:51.009382 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c00c5fd-27bb-4e67-bbfa-374e073d15df-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"4c00c5fd-27bb-4e67-bbfa-374e073d15df\") " pod="openstack/ovn-northd-0" Mar 20 15:43:51 crc kubenswrapper[4779]: I0320 15:43:51.012052 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c00c5fd-27bb-4e67-bbfa-374e073d15df-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"4c00c5fd-27bb-4e67-bbfa-374e073d15df\") " pod="openstack/ovn-northd-0" Mar 20 15:43:51 crc kubenswrapper[4779]: I0320 15:43:51.012100 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c00c5fd-27bb-4e67-bbfa-374e073d15df-config\") pod \"ovn-northd-0\" (UID: \"4c00c5fd-27bb-4e67-bbfa-374e073d15df\") " pod="openstack/ovn-northd-0" Mar 20 15:43:51 crc kubenswrapper[4779]: I0320 15:43:51.012141 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-465dl\" (UniqueName: \"kubernetes.io/projected/4c00c5fd-27bb-4e67-bbfa-374e073d15df-kube-api-access-465dl\") pod \"ovn-northd-0\" (UID: \"4c00c5fd-27bb-4e67-bbfa-374e073d15df\") " pod="openstack/ovn-northd-0" Mar 20 15:43:51 crc kubenswrapper[4779]: I0320 15:43:51.012214 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c00c5fd-27bb-4e67-bbfa-374e073d15df-combined-ca-bundle\") 
pod \"ovn-northd-0\" (UID: \"4c00c5fd-27bb-4e67-bbfa-374e073d15df\") " pod="openstack/ovn-northd-0" Mar 20 15:43:51 crc kubenswrapper[4779]: I0320 15:43:51.025201 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-9tpp7"] Mar 20 15:43:51 crc kubenswrapper[4779]: I0320 15:43:51.032203 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-ms68f"] Mar 20 15:43:51 crc kubenswrapper[4779]: I0320 15:43:51.040552 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-hmz4x"] Mar 20 15:43:51 crc kubenswrapper[4779]: I0320 15:43:51.114092 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c00c5fd-27bb-4e67-bbfa-374e073d15df-config\") pod \"ovn-northd-0\" (UID: \"4c00c5fd-27bb-4e67-bbfa-374e073d15df\") " pod="openstack/ovn-northd-0" Mar 20 15:43:51 crc kubenswrapper[4779]: I0320 15:43:51.114177 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-465dl\" (UniqueName: \"kubernetes.io/projected/4c00c5fd-27bb-4e67-bbfa-374e073d15df-kube-api-access-465dl\") pod \"ovn-northd-0\" (UID: \"4c00c5fd-27bb-4e67-bbfa-374e073d15df\") " pod="openstack/ovn-northd-0" Mar 20 15:43:51 crc kubenswrapper[4779]: I0320 15:43:51.114215 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c00c5fd-27bb-4e67-bbfa-374e073d15df-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"4c00c5fd-27bb-4e67-bbfa-374e073d15df\") " pod="openstack/ovn-northd-0" Mar 20 15:43:51 crc kubenswrapper[4779]: I0320 15:43:51.114268 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4c00c5fd-27bb-4e67-bbfa-374e073d15df-scripts\") pod \"ovn-northd-0\" (UID: \"4c00c5fd-27bb-4e67-bbfa-374e073d15df\") " 
pod="openstack/ovn-northd-0" Mar 20 15:43:51 crc kubenswrapper[4779]: I0320 15:43:51.114328 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4c00c5fd-27bb-4e67-bbfa-374e073d15df-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"4c00c5fd-27bb-4e67-bbfa-374e073d15df\") " pod="openstack/ovn-northd-0" Mar 20 15:43:51 crc kubenswrapper[4779]: I0320 15:43:51.114363 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c00c5fd-27bb-4e67-bbfa-374e073d15df-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"4c00c5fd-27bb-4e67-bbfa-374e073d15df\") " pod="openstack/ovn-northd-0" Mar 20 15:43:51 crc kubenswrapper[4779]: I0320 15:43:51.114403 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c00c5fd-27bb-4e67-bbfa-374e073d15df-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"4c00c5fd-27bb-4e67-bbfa-374e073d15df\") " pod="openstack/ovn-northd-0" Mar 20 15:43:51 crc kubenswrapper[4779]: I0320 15:43:51.114862 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c00c5fd-27bb-4e67-bbfa-374e073d15df-config\") pod \"ovn-northd-0\" (UID: \"4c00c5fd-27bb-4e67-bbfa-374e073d15df\") " pod="openstack/ovn-northd-0" Mar 20 15:43:51 crc kubenswrapper[4779]: I0320 15:43:51.115690 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4c00c5fd-27bb-4e67-bbfa-374e073d15df-scripts\") pod \"ovn-northd-0\" (UID: \"4c00c5fd-27bb-4e67-bbfa-374e073d15df\") " pod="openstack/ovn-northd-0" Mar 20 15:43:51 crc kubenswrapper[4779]: I0320 15:43:51.119120 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4c00c5fd-27bb-4e67-bbfa-374e073d15df-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"4c00c5fd-27bb-4e67-bbfa-374e073d15df\") " pod="openstack/ovn-northd-0" Mar 20 15:43:51 crc kubenswrapper[4779]: I0320 15:43:51.119794 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4c00c5fd-27bb-4e67-bbfa-374e073d15df-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"4c00c5fd-27bb-4e67-bbfa-374e073d15df\") " pod="openstack/ovn-northd-0" Mar 20 15:43:51 crc kubenswrapper[4779]: I0320 15:43:51.120081 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c00c5fd-27bb-4e67-bbfa-374e073d15df-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"4c00c5fd-27bb-4e67-bbfa-374e073d15df\") " pod="openstack/ovn-northd-0" Mar 20 15:43:51 crc kubenswrapper[4779]: I0320 15:43:51.125181 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c00c5fd-27bb-4e67-bbfa-374e073d15df-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"4c00c5fd-27bb-4e67-bbfa-374e073d15df\") " pod="openstack/ovn-northd-0" Mar 20 15:43:51 crc kubenswrapper[4779]: I0320 15:43:51.134252 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-465dl\" (UniqueName: \"kubernetes.io/projected/4c00c5fd-27bb-4e67-bbfa-374e073d15df-kube-api-access-465dl\") pod \"ovn-northd-0\" (UID: \"4c00c5fd-27bb-4e67-bbfa-374e073d15df\") " pod="openstack/ovn-northd-0" Mar 20 15:43:51 crc kubenswrapper[4779]: I0320 15:43:51.227517 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 15:43:51 crc kubenswrapper[4779]: I0320 15:43:51.665824 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-q6sf7" event={"ID":"f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e","Type":"ContainerStarted","Data":"e755a649d53161beb304ddd24ec0dd2be7a3e8b90f632df1aaed195cf62c03f5"} Mar 20 15:43:51 crc kubenswrapper[4779]: W0320 15:43:51.667723 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80bdbc60_ed55_43ce_85bc_9b5605395573.slice/crio-c87f1306dcb8a07b061f58fa6c6234887cf0431542f8527eae8ba0360189d9aa WatchSource:0}: Error finding container c87f1306dcb8a07b061f58fa6c6234887cf0431542f8527eae8ba0360189d9aa: Status 404 returned error can't find the container with id c87f1306dcb8a07b061f58fa6c6234887cf0431542f8527eae8ba0360189d9aa Mar 20 15:43:51 crc kubenswrapper[4779]: I0320 15:43:51.668726 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-s9ls7" event={"ID":"8a65e85e-076d-42e4-88fb-5bb905893173","Type":"ContainerStarted","Data":"3abd11f8ae4085c96b02d576062a414569140a9e57036afebd45eac6ea28a9cb"} Mar 20 15:43:51 crc kubenswrapper[4779]: W0320 15:43:51.669973 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c14b0cd_bac7_4ca6_b176_1c49747e9132.slice/crio-78b6da3d5d846d4524106f88568cd004ccfe9ca8434b3c6e0510699132664a43 WatchSource:0}: Error finding container 78b6da3d5d846d4524106f88568cd004ccfe9ca8434b3c6e0510699132664a43: Status 404 returned error can't find the container with id 78b6da3d5d846d4524106f88568cd004ccfe9ca8434b3c6e0510699132664a43 Mar 20 15:43:51 crc kubenswrapper[4779]: I0320 15:43:51.699038 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-s9ls7" podStartSLOduration=2.371734155 
podStartE2EDuration="6.699016245s" podCreationTimestamp="2026-03-20 15:43:45 +0000 UTC" firstStartedPulling="2026-03-20 15:43:46.158046434 +0000 UTC m=+1243.120562234" lastFinishedPulling="2026-03-20 15:43:50.485328524 +0000 UTC m=+1247.447844324" observedRunningTime="2026-03-20 15:43:51.695326533 +0000 UTC m=+1248.657842333" watchObservedRunningTime="2026-03-20 15:43:51.699016245 +0000 UTC m=+1248.661532045" Mar 20 15:43:51 crc kubenswrapper[4779]: I0320 15:43:51.758452 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 20 15:43:51 crc kubenswrapper[4779]: I0320 15:43:51.840079 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2295b246-976d-4a92-8d85-ee69b7b056ee" path="/var/lib/kubelet/pods/2295b246-976d-4a92-8d85-ee69b7b056ee/volumes" Mar 20 15:43:51 crc kubenswrapper[4779]: I0320 15:43:51.929856 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 20 15:43:52 crc kubenswrapper[4779]: I0320 15:43:52.330481 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-87f8-account-create-update-lbzbd"] Mar 20 15:43:52 crc kubenswrapper[4779]: I0320 15:43:52.333974 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-87f8-account-create-update-lbzbd" Mar 20 15:43:52 crc kubenswrapper[4779]: I0320 15:43:52.341396 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 20 15:43:52 crc kubenswrapper[4779]: I0320 15:43:52.360791 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-87f8-account-create-update-lbzbd"] Mar 20 15:43:52 crc kubenswrapper[4779]: I0320 15:43:52.400357 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 15:43:52 crc kubenswrapper[4779]: I0320 15:43:52.448481 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-g8sgs"] Mar 20 15:43:52 crc kubenswrapper[4779]: I0320 15:43:52.450421 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-g8sgs" Mar 20 15:43:52 crc kubenswrapper[4779]: I0320 15:43:52.488223 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-g8sgs"] Mar 20 15:43:52 crc kubenswrapper[4779]: I0320 15:43:52.489230 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxvpj\" (UniqueName: \"kubernetes.io/projected/91031879-f65a-4ff0-a28a-94856f3db0aa-kube-api-access-gxvpj\") pod \"placement-87f8-account-create-update-lbzbd\" (UID: \"91031879-f65a-4ff0-a28a-94856f3db0aa\") " pod="openstack/placement-87f8-account-create-update-lbzbd" Mar 20 15:43:52 crc kubenswrapper[4779]: I0320 15:43:52.489311 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b-etc-swift\") pod \"swift-storage-0\" (UID: \"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b\") " pod="openstack/swift-storage-0" Mar 20 15:43:52 crc kubenswrapper[4779]: I0320 15:43:52.489359 4779 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91031879-f65a-4ff0-a28a-94856f3db0aa-operator-scripts\") pod \"placement-87f8-account-create-update-lbzbd\" (UID: \"91031879-f65a-4ff0-a28a-94856f3db0aa\") " pod="openstack/placement-87f8-account-create-update-lbzbd" Mar 20 15:43:52 crc kubenswrapper[4779]: E0320 15:43:52.489583 4779 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 15:43:52 crc kubenswrapper[4779]: E0320 15:43:52.489600 4779 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 15:43:52 crc kubenswrapper[4779]: E0320 15:43:52.489671 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b-etc-swift podName:c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b nodeName:}" failed. No retries permitted until 2026-03-20 15:44:00.489655016 +0000 UTC m=+1257.452170816 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b-etc-swift") pod "swift-storage-0" (UID: "c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b") : configmap "swift-ring-files" not found Mar 20 15:43:52 crc kubenswrapper[4779]: I0320 15:43:52.592097 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa24957f-e52d-40d0-9ed2-801617ddb2d7-operator-scripts\") pod \"placement-db-create-g8sgs\" (UID: \"aa24957f-e52d-40d0-9ed2-801617ddb2d7\") " pod="openstack/placement-db-create-g8sgs" Mar 20 15:43:52 crc kubenswrapper[4779]: I0320 15:43:52.592382 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxvpj\" (UniqueName: \"kubernetes.io/projected/91031879-f65a-4ff0-a28a-94856f3db0aa-kube-api-access-gxvpj\") pod \"placement-87f8-account-create-update-lbzbd\" (UID: \"91031879-f65a-4ff0-a28a-94856f3db0aa\") " pod="openstack/placement-87f8-account-create-update-lbzbd" Mar 20 15:43:52 crc kubenswrapper[4779]: I0320 15:43:52.592450 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91031879-f65a-4ff0-a28a-94856f3db0aa-operator-scripts\") pod \"placement-87f8-account-create-update-lbzbd\" (UID: \"91031879-f65a-4ff0-a28a-94856f3db0aa\") " pod="openstack/placement-87f8-account-create-update-lbzbd" Mar 20 15:43:52 crc kubenswrapper[4779]: I0320 15:43:52.592498 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4zrb\" (UniqueName: \"kubernetes.io/projected/aa24957f-e52d-40d0-9ed2-801617ddb2d7-kube-api-access-v4zrb\") pod \"placement-db-create-g8sgs\" (UID: \"aa24957f-e52d-40d0-9ed2-801617ddb2d7\") " pod="openstack/placement-db-create-g8sgs" Mar 20 15:43:52 crc kubenswrapper[4779]: I0320 15:43:52.594999 4779 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91031879-f65a-4ff0-a28a-94856f3db0aa-operator-scripts\") pod \"placement-87f8-account-create-update-lbzbd\" (UID: \"91031879-f65a-4ff0-a28a-94856f3db0aa\") " pod="openstack/placement-87f8-account-create-update-lbzbd" Mar 20 15:43:52 crc kubenswrapper[4779]: I0320 15:43:52.643961 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxvpj\" (UniqueName: \"kubernetes.io/projected/91031879-f65a-4ff0-a28a-94856f3db0aa-kube-api-access-gxvpj\") pod \"placement-87f8-account-create-update-lbzbd\" (UID: \"91031879-f65a-4ff0-a28a-94856f3db0aa\") " pod="openstack/placement-87f8-account-create-update-lbzbd" Mar 20 15:43:52 crc kubenswrapper[4779]: I0320 15:43:52.679895 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"4c00c5fd-27bb-4e67-bbfa-374e073d15df","Type":"ContainerStarted","Data":"12bfd66e5b19b061c0f1617b04e246be8e4d779ef3b7abf011d107c476757f70"} Mar 20 15:43:52 crc kubenswrapper[4779]: I0320 15:43:52.689609 4779 generic.go:334] "Generic (PLEG): container finished" podID="80bdbc60-ed55-43ce-85bc-9b5605395573" containerID="8b9645b4c8aa19fc9b9c2779a7ffb410298607fefe5cd042f5c03deed8b18e2c" exitCode=0 Mar 20 15:43:52 crc kubenswrapper[4779]: I0320 15:43:52.689730 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6f696b9-ms68f" event={"ID":"80bdbc60-ed55-43ce-85bc-9b5605395573","Type":"ContainerDied","Data":"8b9645b4c8aa19fc9b9c2779a7ffb410298607fefe5cd042f5c03deed8b18e2c"} Mar 20 15:43:52 crc kubenswrapper[4779]: I0320 15:43:52.690593 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6f696b9-ms68f" event={"ID":"80bdbc60-ed55-43ce-85bc-9b5605395573","Type":"ContainerStarted","Data":"c87f1306dcb8a07b061f58fa6c6234887cf0431542f8527eae8ba0360189d9aa"} Mar 20 15:43:52 crc kubenswrapper[4779]: I0320 
15:43:52.694378 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa24957f-e52d-40d0-9ed2-801617ddb2d7-operator-scripts\") pod \"placement-db-create-g8sgs\" (UID: \"aa24957f-e52d-40d0-9ed2-801617ddb2d7\") " pod="openstack/placement-db-create-g8sgs" Mar 20 15:43:52 crc kubenswrapper[4779]: I0320 15:43:52.694498 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4zrb\" (UniqueName: \"kubernetes.io/projected/aa24957f-e52d-40d0-9ed2-801617ddb2d7-kube-api-access-v4zrb\") pod \"placement-db-create-g8sgs\" (UID: \"aa24957f-e52d-40d0-9ed2-801617ddb2d7\") " pod="openstack/placement-db-create-g8sgs" Mar 20 15:43:52 crc kubenswrapper[4779]: I0320 15:43:52.695434 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa24957f-e52d-40d0-9ed2-801617ddb2d7-operator-scripts\") pod \"placement-db-create-g8sgs\" (UID: \"aa24957f-e52d-40d0-9ed2-801617ddb2d7\") " pod="openstack/placement-db-create-g8sgs" Mar 20 15:43:52 crc kubenswrapper[4779]: I0320 15:43:52.707006 4779 generic.go:334] "Generic (PLEG): container finished" podID="1c14b0cd-bac7-4ca6-b176-1c49747e9132" containerID="c93c405ab70db47f0521cdb98a2a3492fe28b54768e6357098cd25942e865203" exitCode=0 Mar 20 15:43:52 crc kubenswrapper[4779]: I0320 15:43:52.707076 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-hmz4x" event={"ID":"1c14b0cd-bac7-4ca6-b176-1c49747e9132","Type":"ContainerDied","Data":"c93c405ab70db47f0521cdb98a2a3492fe28b54768e6357098cd25942e865203"} Mar 20 15:43:52 crc kubenswrapper[4779]: I0320 15:43:52.707181 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-hmz4x" event={"ID":"1c14b0cd-bac7-4ca6-b176-1c49747e9132","Type":"ContainerStarted","Data":"78b6da3d5d846d4524106f88568cd004ccfe9ca8434b3c6e0510699132664a43"} 
Mar 20 15:43:52 crc kubenswrapper[4779]: I0320 15:43:52.713220 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-87f8-account-create-update-lbzbd" Mar 20 15:43:52 crc kubenswrapper[4779]: I0320 15:43:52.720881 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-q6sf7" event={"ID":"f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e","Type":"ContainerStarted","Data":"e40237723d8f00ae878c938d6186c5bfac7ec107c7658969ae622744cb8b5bab"} Mar 20 15:43:52 crc kubenswrapper[4779]: I0320 15:43:52.725819 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4zrb\" (UniqueName: \"kubernetes.io/projected/aa24957f-e52d-40d0-9ed2-801617ddb2d7-kube-api-access-v4zrb\") pod \"placement-db-create-g8sgs\" (UID: \"aa24957f-e52d-40d0-9ed2-801617ddb2d7\") " pod="openstack/placement-db-create-g8sgs" Mar 20 15:43:52 crc kubenswrapper[4779]: I0320 15:43:52.793902 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-q6sf7" podStartSLOduration=4.793881974 podStartE2EDuration="4.793881974s" podCreationTimestamp="2026-03-20 15:43:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:43:52.78123071 +0000 UTC m=+1249.743746520" watchObservedRunningTime="2026-03-20 15:43:52.793881974 +0000 UTC m=+1249.756397784" Mar 20 15:43:52 crc kubenswrapper[4779]: I0320 15:43:52.836338 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-g8sgs" Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.130166 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-ms68f" Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.210571 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80bdbc60-ed55-43ce-85bc-9b5605395573-config\") pod \"80bdbc60-ed55-43ce-85bc-9b5605395573\" (UID: \"80bdbc60-ed55-43ce-85bc-9b5605395573\") " Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.210766 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9jj9\" (UniqueName: \"kubernetes.io/projected/80bdbc60-ed55-43ce-85bc-9b5605395573-kube-api-access-q9jj9\") pod \"80bdbc60-ed55-43ce-85bc-9b5605395573\" (UID: \"80bdbc60-ed55-43ce-85bc-9b5605395573\") " Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.210799 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80bdbc60-ed55-43ce-85bc-9b5605395573-dns-svc\") pod \"80bdbc60-ed55-43ce-85bc-9b5605395573\" (UID: \"80bdbc60-ed55-43ce-85bc-9b5605395573\") " Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.210817 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/80bdbc60-ed55-43ce-85bc-9b5605395573-ovsdbserver-nb\") pod \"80bdbc60-ed55-43ce-85bc-9b5605395573\" (UID: \"80bdbc60-ed55-43ce-85bc-9b5605395573\") " Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.220103 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80bdbc60-ed55-43ce-85bc-9b5605395573-kube-api-access-q9jj9" (OuterVolumeSpecName: "kube-api-access-q9jj9") pod "80bdbc60-ed55-43ce-85bc-9b5605395573" (UID: "80bdbc60-ed55-43ce-85bc-9b5605395573"). InnerVolumeSpecName "kube-api-access-q9jj9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.239859 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80bdbc60-ed55-43ce-85bc-9b5605395573-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "80bdbc60-ed55-43ce-85bc-9b5605395573" (UID: "80bdbc60-ed55-43ce-85bc-9b5605395573"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.239952 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80bdbc60-ed55-43ce-85bc-9b5605395573-config" (OuterVolumeSpecName: "config") pod "80bdbc60-ed55-43ce-85bc-9b5605395573" (UID: "80bdbc60-ed55-43ce-85bc-9b5605395573"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.280463 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80bdbc60-ed55-43ce-85bc-9b5605395573-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "80bdbc60-ed55-43ce-85bc-9b5605395573" (UID: "80bdbc60-ed55-43ce-85bc-9b5605395573"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.292152 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.313631 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80bdbc60-ed55-43ce-85bc-9b5605395573-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.313699 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9jj9\" (UniqueName: \"kubernetes.io/projected/80bdbc60-ed55-43ce-85bc-9b5605395573-kube-api-access-q9jj9\") on node \"crc\" DevicePath \"\"" Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.313715 4779 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80bdbc60-ed55-43ce-85bc-9b5605395573-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.313726 4779 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/80bdbc60-ed55-43ce-85bc-9b5605395573-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.365411 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-thctt"] Mar 20 15:43:53 crc kubenswrapper[4779]: E0320 15:43:53.365905 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80bdbc60-ed55-43ce-85bc-9b5605395573" containerName="init" Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.365924 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="80bdbc60-ed55-43ce-85bc-9b5605395573" containerName="init" Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.366142 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="80bdbc60-ed55-43ce-85bc-9b5605395573" 
containerName="init" Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.366897 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-thctt" Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.377278 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-87f8-account-create-update-lbzbd"] Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.395354 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-thctt"] Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.469751 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-1f58-account-create-update-mgl6d"] Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.471510 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-1f58-account-create-update-mgl6d" Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.474040 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret" Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.482251 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-1f58-account-create-update-mgl6d"] Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.501912 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-g8sgs"] Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.516071 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c9cff9d-9412-4738-84a7-df2abae9a5c6-operator-scripts\") pod \"watcher-db-create-thctt\" (UID: \"1c9cff9d-9412-4738-84a7-df2abae9a5c6\") " pod="openstack/watcher-db-create-thctt" Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.516643 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96bf2\" 
(UniqueName: \"kubernetes.io/projected/1c9cff9d-9412-4738-84a7-df2abae9a5c6-kube-api-access-96bf2\") pod \"watcher-db-create-thctt\" (UID: \"1c9cff9d-9412-4738-84a7-df2abae9a5c6\") " pod="openstack/watcher-db-create-thctt" Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.619152 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96bf2\" (UniqueName: \"kubernetes.io/projected/1c9cff9d-9412-4738-84a7-df2abae9a5c6-kube-api-access-96bf2\") pod \"watcher-db-create-thctt\" (UID: \"1c9cff9d-9412-4738-84a7-df2abae9a5c6\") " pod="openstack/watcher-db-create-thctt" Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.619285 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/569982f0-2f9f-4e5c-8cc5-ea78916cda2c-operator-scripts\") pod \"watcher-1f58-account-create-update-mgl6d\" (UID: \"569982f0-2f9f-4e5c-8cc5-ea78916cda2c\") " pod="openstack/watcher-1f58-account-create-update-mgl6d" Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.619322 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56wmv\" (UniqueName: \"kubernetes.io/projected/569982f0-2f9f-4e5c-8cc5-ea78916cda2c-kube-api-access-56wmv\") pod \"watcher-1f58-account-create-update-mgl6d\" (UID: \"569982f0-2f9f-4e5c-8cc5-ea78916cda2c\") " pod="openstack/watcher-1f58-account-create-update-mgl6d" Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.619354 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c9cff9d-9412-4738-84a7-df2abae9a5c6-operator-scripts\") pod \"watcher-db-create-thctt\" (UID: \"1c9cff9d-9412-4738-84a7-df2abae9a5c6\") " pod="openstack/watcher-db-create-thctt" Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.620040 4779 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c9cff9d-9412-4738-84a7-df2abae9a5c6-operator-scripts\") pod \"watcher-db-create-thctt\" (UID: \"1c9cff9d-9412-4738-84a7-df2abae9a5c6\") " pod="openstack/watcher-db-create-thctt" Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.639879 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96bf2\" (UniqueName: \"kubernetes.io/projected/1c9cff9d-9412-4738-84a7-df2abae9a5c6-kube-api-access-96bf2\") pod \"watcher-db-create-thctt\" (UID: \"1c9cff9d-9412-4738-84a7-df2abae9a5c6\") " pod="openstack/watcher-db-create-thctt" Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.719739 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-thctt" Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.721233 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56wmv\" (UniqueName: \"kubernetes.io/projected/569982f0-2f9f-4e5c-8cc5-ea78916cda2c-kube-api-access-56wmv\") pod \"watcher-1f58-account-create-update-mgl6d\" (UID: \"569982f0-2f9f-4e5c-8cc5-ea78916cda2c\") " pod="openstack/watcher-1f58-account-create-update-mgl6d" Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.721466 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/569982f0-2f9f-4e5c-8cc5-ea78916cda2c-operator-scripts\") pod \"watcher-1f58-account-create-update-mgl6d\" (UID: \"569982f0-2f9f-4e5c-8cc5-ea78916cda2c\") " pod="openstack/watcher-1f58-account-create-update-mgl6d" Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.722483 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/569982f0-2f9f-4e5c-8cc5-ea78916cda2c-operator-scripts\") pod \"watcher-1f58-account-create-update-mgl6d\" (UID: 
\"569982f0-2f9f-4e5c-8cc5-ea78916cda2c\") " pod="openstack/watcher-1f58-account-create-update-mgl6d" Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.733875 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-hmz4x" event={"ID":"1c14b0cd-bac7-4ca6-b176-1c49747e9132","Type":"ContainerStarted","Data":"91c11bb54a8057aea0a9ab54904ee35bce3c21141f649fc9f72262c2d26d65b7"} Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.734737 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-hmz4x" Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.738497 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-g8sgs" event={"ID":"aa24957f-e52d-40d0-9ed2-801617ddb2d7","Type":"ContainerStarted","Data":"a9abf58b9455116eda55bcfdcb0c0c284453bc04a81fd2d13b7f3e02de112a41"} Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.745483 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-87f8-account-create-update-lbzbd" event={"ID":"91031879-f65a-4ff0-a28a-94856f3db0aa","Type":"ContainerStarted","Data":"cac13aed11abd8b93611559eb0adcd3b3c0eeb8027c45a771a7596810263d283"} Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.745530 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-87f8-account-create-update-lbzbd" event={"ID":"91031879-f65a-4ff0-a28a-94856f3db0aa","Type":"ContainerStarted","Data":"c8affd1af923d65763f2743d50a26db3e22553fcfe49a52f9b07364ce1e1fe35"} Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.746674 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56wmv\" (UniqueName: \"kubernetes.io/projected/569982f0-2f9f-4e5c-8cc5-ea78916cda2c-kube-api-access-56wmv\") pod \"watcher-1f58-account-create-update-mgl6d\" (UID: \"569982f0-2f9f-4e5c-8cc5-ea78916cda2c\") " pod="openstack/watcher-1f58-account-create-update-mgl6d" Mar 20 15:43:53 
crc kubenswrapper[4779]: I0320 15:43:53.753700 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-hmz4x" podStartSLOduration=4.753684098 podStartE2EDuration="4.753684098s" podCreationTimestamp="2026-03-20 15:43:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:43:53.752902289 +0000 UTC m=+1250.715418089" watchObservedRunningTime="2026-03-20 15:43:53.753684098 +0000 UTC m=+1250.716199898" Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.762996 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-ms68f" Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.763193 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6f696b9-ms68f" event={"ID":"80bdbc60-ed55-43ce-85bc-9b5605395573","Type":"ContainerDied","Data":"c87f1306dcb8a07b061f58fa6c6234887cf0431542f8527eae8ba0360189d9aa"} Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.763260 4779 scope.go:117] "RemoveContainer" containerID="8b9645b4c8aa19fc9b9c2779a7ffb410298607fefe5cd042f5c03deed8b18e2c" Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.774285 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-87f8-account-create-update-lbzbd" podStartSLOduration=1.774264959 podStartE2EDuration="1.774264959s" podCreationTimestamp="2026-03-20 15:43:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:43:53.772062934 +0000 UTC m=+1250.734578744" watchObservedRunningTime="2026-03-20 15:43:53.774264959 +0000 UTC m=+1250.736780759" Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.862210 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-ms68f"] Mar 20 15:43:53 
crc kubenswrapper[4779]: I0320 15:43:53.874995 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-1f58-account-create-update-mgl6d" Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.876458 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-ms68f"] Mar 20 15:43:53 crc kubenswrapper[4779]: I0320 15:43:53.938844 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 20 15:43:54 crc kubenswrapper[4779]: I0320 15:43:54.044325 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 20 15:43:54 crc kubenswrapper[4779]: I0320 15:43:54.771923 4779 generic.go:334] "Generic (PLEG): container finished" podID="aa24957f-e52d-40d0-9ed2-801617ddb2d7" containerID="f35a9d9d10c366308d30779b5b2fecbb7218a713e4d946d4670f6eb98a512a62" exitCode=0 Mar 20 15:43:54 crc kubenswrapper[4779]: I0320 15:43:54.772038 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-g8sgs" event={"ID":"aa24957f-e52d-40d0-9ed2-801617ddb2d7","Type":"ContainerDied","Data":"f35a9d9d10c366308d30779b5b2fecbb7218a713e4d946d4670f6eb98a512a62"} Mar 20 15:43:54 crc kubenswrapper[4779]: I0320 15:43:54.792695 4779 generic.go:334] "Generic (PLEG): container finished" podID="91031879-f65a-4ff0-a28a-94856f3db0aa" containerID="cac13aed11abd8b93611559eb0adcd3b3c0eeb8027c45a771a7596810263d283" exitCode=0 Mar 20 15:43:54 crc kubenswrapper[4779]: I0320 15:43:54.793752 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-87f8-account-create-update-lbzbd" event={"ID":"91031879-f65a-4ff0-a28a-94856f3db0aa","Type":"ContainerDied","Data":"cac13aed11abd8b93611559eb0adcd3b3c0eeb8027c45a771a7596810263d283"} Mar 20 15:43:55 crc kubenswrapper[4779]: I0320 15:43:55.150132 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:43:55 crc kubenswrapper[4779]: I0320 15:43:55.150201 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:43:55 crc kubenswrapper[4779]: I0320 15:43:55.150244 4779 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" Mar 20 15:43:55 crc kubenswrapper[4779]: I0320 15:43:55.150897 4779 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fee94c09103d4b2b4bf88d7de3451b803a74dae16bc0136f3fa09b23c09cfe64"} pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 15:43:55 crc kubenswrapper[4779]: I0320 15:43:55.150966 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" containerID="cri-o://fee94c09103d4b2b4bf88d7de3451b803a74dae16bc0136f3fa09b23c09cfe64" gracePeriod=600 Mar 20 15:43:55 crc kubenswrapper[4779]: I0320 15:43:55.801340 4779 generic.go:334] "Generic (PLEG): container finished" podID="451fc579-db57-4b36-a775-6d2986de3efc" containerID="fee94c09103d4b2b4bf88d7de3451b803a74dae16bc0136f3fa09b23c09cfe64" exitCode=0 Mar 20 15:43:55 crc kubenswrapper[4779]: I0320 15:43:55.801524 4779 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" event={"ID":"451fc579-db57-4b36-a775-6d2986de3efc","Type":"ContainerDied","Data":"fee94c09103d4b2b4bf88d7de3451b803a74dae16bc0136f3fa09b23c09cfe64"} Mar 20 15:43:55 crc kubenswrapper[4779]: I0320 15:43:55.838254 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80bdbc60-ed55-43ce-85bc-9b5605395573" path="/var/lib/kubelet/pods/80bdbc60-ed55-43ce-85bc-9b5605395573/volumes" Mar 20 15:43:56 crc kubenswrapper[4779]: I0320 15:43:56.269157 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-b2xx8"] Mar 20 15:43:56 crc kubenswrapper[4779]: I0320 15:43:56.270800 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-b2xx8" Mar 20 15:43:56 crc kubenswrapper[4779]: I0320 15:43:56.277137 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-b2xx8"] Mar 20 15:43:56 crc kubenswrapper[4779]: I0320 15:43:56.393753 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skdpf\" (UniqueName: \"kubernetes.io/projected/9e05bbbd-ea13-4bd8-bf92-930e44dd8e91-kube-api-access-skdpf\") pod \"glance-db-create-b2xx8\" (UID: \"9e05bbbd-ea13-4bd8-bf92-930e44dd8e91\") " pod="openstack/glance-db-create-b2xx8" Mar 20 15:43:56 crc kubenswrapper[4779]: I0320 15:43:56.393867 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e05bbbd-ea13-4bd8-bf92-930e44dd8e91-operator-scripts\") pod \"glance-db-create-b2xx8\" (UID: \"9e05bbbd-ea13-4bd8-bf92-930e44dd8e91\") " pod="openstack/glance-db-create-b2xx8" Mar 20 15:43:56 crc kubenswrapper[4779]: I0320 15:43:56.422618 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-6fc0-account-create-update-5x6h6"] Mar 20 15:43:56 crc 
kubenswrapper[4779]: I0320 15:43:56.423877 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-6fc0-account-create-update-5x6h6" Mar 20 15:43:56 crc kubenswrapper[4779]: I0320 15:43:56.425823 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 20 15:43:56 crc kubenswrapper[4779]: I0320 15:43:56.449285 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-6fc0-account-create-update-5x6h6"] Mar 20 15:43:56 crc kubenswrapper[4779]: I0320 15:43:56.495591 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skdpf\" (UniqueName: \"kubernetes.io/projected/9e05bbbd-ea13-4bd8-bf92-930e44dd8e91-kube-api-access-skdpf\") pod \"glance-db-create-b2xx8\" (UID: \"9e05bbbd-ea13-4bd8-bf92-930e44dd8e91\") " pod="openstack/glance-db-create-b2xx8" Mar 20 15:43:56 crc kubenswrapper[4779]: I0320 15:43:56.495693 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e05bbbd-ea13-4bd8-bf92-930e44dd8e91-operator-scripts\") pod \"glance-db-create-b2xx8\" (UID: \"9e05bbbd-ea13-4bd8-bf92-930e44dd8e91\") " pod="openstack/glance-db-create-b2xx8" Mar 20 15:43:56 crc kubenswrapper[4779]: I0320 15:43:56.495751 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44lzs\" (UniqueName: \"kubernetes.io/projected/e06123d4-03bd-44b8-bb15-eb3baccb8c90-kube-api-access-44lzs\") pod \"glance-6fc0-account-create-update-5x6h6\" (UID: \"e06123d4-03bd-44b8-bb15-eb3baccb8c90\") " pod="openstack/glance-6fc0-account-create-update-5x6h6" Mar 20 15:43:56 crc kubenswrapper[4779]: I0320 15:43:56.495790 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e06123d4-03bd-44b8-bb15-eb3baccb8c90-operator-scripts\") pod \"glance-6fc0-account-create-update-5x6h6\" (UID: \"e06123d4-03bd-44b8-bb15-eb3baccb8c90\") " pod="openstack/glance-6fc0-account-create-update-5x6h6" Mar 20 15:43:56 crc kubenswrapper[4779]: I0320 15:43:56.496528 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e05bbbd-ea13-4bd8-bf92-930e44dd8e91-operator-scripts\") pod \"glance-db-create-b2xx8\" (UID: \"9e05bbbd-ea13-4bd8-bf92-930e44dd8e91\") " pod="openstack/glance-db-create-b2xx8" Mar 20 15:43:56 crc kubenswrapper[4779]: I0320 15:43:56.539031 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skdpf\" (UniqueName: \"kubernetes.io/projected/9e05bbbd-ea13-4bd8-bf92-930e44dd8e91-kube-api-access-skdpf\") pod \"glance-db-create-b2xx8\" (UID: \"9e05bbbd-ea13-4bd8-bf92-930e44dd8e91\") " pod="openstack/glance-db-create-b2xx8" Mar 20 15:43:56 crc kubenswrapper[4779]: I0320 15:43:56.592556 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-b2xx8" Mar 20 15:43:56 crc kubenswrapper[4779]: I0320 15:43:56.597671 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44lzs\" (UniqueName: \"kubernetes.io/projected/e06123d4-03bd-44b8-bb15-eb3baccb8c90-kube-api-access-44lzs\") pod \"glance-6fc0-account-create-update-5x6h6\" (UID: \"e06123d4-03bd-44b8-bb15-eb3baccb8c90\") " pod="openstack/glance-6fc0-account-create-update-5x6h6" Mar 20 15:43:56 crc kubenswrapper[4779]: I0320 15:43:56.597728 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e06123d4-03bd-44b8-bb15-eb3baccb8c90-operator-scripts\") pod \"glance-6fc0-account-create-update-5x6h6\" (UID: \"e06123d4-03bd-44b8-bb15-eb3baccb8c90\") " pod="openstack/glance-6fc0-account-create-update-5x6h6" Mar 20 15:43:56 crc kubenswrapper[4779]: I0320 15:43:56.598402 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e06123d4-03bd-44b8-bb15-eb3baccb8c90-operator-scripts\") pod \"glance-6fc0-account-create-update-5x6h6\" (UID: \"e06123d4-03bd-44b8-bb15-eb3baccb8c90\") " pod="openstack/glance-6fc0-account-create-update-5x6h6" Mar 20 15:43:56 crc kubenswrapper[4779]: I0320 15:43:56.621888 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44lzs\" (UniqueName: \"kubernetes.io/projected/e06123d4-03bd-44b8-bb15-eb3baccb8c90-kube-api-access-44lzs\") pod \"glance-6fc0-account-create-update-5x6h6\" (UID: \"e06123d4-03bd-44b8-bb15-eb3baccb8c90\") " pod="openstack/glance-6fc0-account-create-update-5x6h6" Mar 20 15:43:56 crc kubenswrapper[4779]: I0320 15:43:56.748581 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-6fc0-account-create-update-5x6h6" Mar 20 15:43:57 crc kubenswrapper[4779]: I0320 15:43:57.266601 4779 scope.go:117] "RemoveContainer" containerID="d329625e139412c780d56ee2166e7046f52ce6cce2fdaf917fadbaf979acbe6a" Mar 20 15:43:57 crc kubenswrapper[4779]: I0320 15:43:57.452766 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-g8sgs" Mar 20 15:43:57 crc kubenswrapper[4779]: I0320 15:43:57.484593 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-87f8-account-create-update-lbzbd" Mar 20 15:43:57 crc kubenswrapper[4779]: I0320 15:43:57.509928 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa24957f-e52d-40d0-9ed2-801617ddb2d7-operator-scripts\") pod \"aa24957f-e52d-40d0-9ed2-801617ddb2d7\" (UID: \"aa24957f-e52d-40d0-9ed2-801617ddb2d7\") " Mar 20 15:43:57 crc kubenswrapper[4779]: I0320 15:43:57.510040 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4zrb\" (UniqueName: \"kubernetes.io/projected/aa24957f-e52d-40d0-9ed2-801617ddb2d7-kube-api-access-v4zrb\") pod \"aa24957f-e52d-40d0-9ed2-801617ddb2d7\" (UID: \"aa24957f-e52d-40d0-9ed2-801617ddb2d7\") " Mar 20 15:43:57 crc kubenswrapper[4779]: I0320 15:43:57.515859 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa24957f-e52d-40d0-9ed2-801617ddb2d7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aa24957f-e52d-40d0-9ed2-801617ddb2d7" (UID: "aa24957f-e52d-40d0-9ed2-801617ddb2d7"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:43:57 crc kubenswrapper[4779]: I0320 15:43:57.529303 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa24957f-e52d-40d0-9ed2-801617ddb2d7-kube-api-access-v4zrb" (OuterVolumeSpecName: "kube-api-access-v4zrb") pod "aa24957f-e52d-40d0-9ed2-801617ddb2d7" (UID: "aa24957f-e52d-40d0-9ed2-801617ddb2d7"). InnerVolumeSpecName "kube-api-access-v4zrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:43:57 crc kubenswrapper[4779]: I0320 15:43:57.617821 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxvpj\" (UniqueName: \"kubernetes.io/projected/91031879-f65a-4ff0-a28a-94856f3db0aa-kube-api-access-gxvpj\") pod \"91031879-f65a-4ff0-a28a-94856f3db0aa\" (UID: \"91031879-f65a-4ff0-a28a-94856f3db0aa\") " Mar 20 15:43:57 crc kubenswrapper[4779]: I0320 15:43:57.617988 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91031879-f65a-4ff0-a28a-94856f3db0aa-operator-scripts\") pod \"91031879-f65a-4ff0-a28a-94856f3db0aa\" (UID: \"91031879-f65a-4ff0-a28a-94856f3db0aa\") " Mar 20 15:43:57 crc kubenswrapper[4779]: I0320 15:43:57.618380 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4zrb\" (UniqueName: \"kubernetes.io/projected/aa24957f-e52d-40d0-9ed2-801617ddb2d7-kube-api-access-v4zrb\") on node \"crc\" DevicePath \"\"" Mar 20 15:43:57 crc kubenswrapper[4779]: I0320 15:43:57.618407 4779 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa24957f-e52d-40d0-9ed2-801617ddb2d7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:43:57 crc kubenswrapper[4779]: I0320 15:43:57.618427 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/91031879-f65a-4ff0-a28a-94856f3db0aa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "91031879-f65a-4ff0-a28a-94856f3db0aa" (UID: "91031879-f65a-4ff0-a28a-94856f3db0aa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:43:57 crc kubenswrapper[4779]: I0320 15:43:57.622016 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91031879-f65a-4ff0-a28a-94856f3db0aa-kube-api-access-gxvpj" (OuterVolumeSpecName: "kube-api-access-gxvpj") pod "91031879-f65a-4ff0-a28a-94856f3db0aa" (UID: "91031879-f65a-4ff0-a28a-94856f3db0aa"). InnerVolumeSpecName "kube-api-access-gxvpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:43:57 crc kubenswrapper[4779]: I0320 15:43:57.719615 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxvpj\" (UniqueName: \"kubernetes.io/projected/91031879-f65a-4ff0-a28a-94856f3db0aa-kube-api-access-gxvpj\") on node \"crc\" DevicePath \"\"" Mar 20 15:43:57 crc kubenswrapper[4779]: I0320 15:43:57.719651 4779 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91031879-f65a-4ff0-a28a-94856f3db0aa-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:43:57 crc kubenswrapper[4779]: I0320 15:43:57.906913 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e93aeb14-abaa-4cff-81c0-82d579020dc6","Type":"ContainerStarted","Data":"856c3e84fd751f08c78702db57717d11dc490e7ce8e32e54d378540c7eefb0b7"} Mar 20 15:43:57 crc kubenswrapper[4779]: I0320 15:43:57.974285 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-thctt"] Mar 20 15:43:57 crc kubenswrapper[4779]: I0320 15:43:57.998181 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-1f58-account-create-update-mgl6d"] Mar 20 15:43:58 crc 
kubenswrapper[4779]: I0320 15:43:58.039631 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-g8sgs" Mar 20 15:43:58 crc kubenswrapper[4779]: I0320 15:43:58.049709 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-b2xx8"] Mar 20 15:43:58 crc kubenswrapper[4779]: I0320 15:43:58.049774 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-g8sgs" event={"ID":"aa24957f-e52d-40d0-9ed2-801617ddb2d7","Type":"ContainerDied","Data":"a9abf58b9455116eda55bcfdcb0c0c284453bc04a81fd2d13b7f3e02de112a41"} Mar 20 15:43:58 crc kubenswrapper[4779]: I0320 15:43:58.049801 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9abf58b9455116eda55bcfdcb0c0c284453bc04a81fd2d13b7f3e02de112a41" Mar 20 15:43:58 crc kubenswrapper[4779]: I0320 15:43:58.063129 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-6fc0-account-create-update-5x6h6"] Mar 20 15:43:58 crc kubenswrapper[4779]: I0320 15:43:58.063175 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-vlnhf"] Mar 20 15:43:58 crc kubenswrapper[4779]: E0320 15:43:58.063484 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa24957f-e52d-40d0-9ed2-801617ddb2d7" containerName="mariadb-database-create" Mar 20 15:43:58 crc kubenswrapper[4779]: I0320 15:43:58.063497 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa24957f-e52d-40d0-9ed2-801617ddb2d7" containerName="mariadb-database-create" Mar 20 15:43:58 crc kubenswrapper[4779]: E0320 15:43:58.063509 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91031879-f65a-4ff0-a28a-94856f3db0aa" containerName="mariadb-account-create-update" Mar 20 15:43:58 crc kubenswrapper[4779]: I0320 15:43:58.063514 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="91031879-f65a-4ff0-a28a-94856f3db0aa" 
containerName="mariadb-account-create-update" Mar 20 15:43:58 crc kubenswrapper[4779]: I0320 15:43:58.065333 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="91031879-f65a-4ff0-a28a-94856f3db0aa" containerName="mariadb-account-create-update" Mar 20 15:43:58 crc kubenswrapper[4779]: I0320 15:43:58.065357 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa24957f-e52d-40d0-9ed2-801617ddb2d7" containerName="mariadb-database-create" Mar 20 15:43:58 crc kubenswrapper[4779]: I0320 15:43:58.066730 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vlnhf" Mar 20 15:43:58 crc kubenswrapper[4779]: I0320 15:43:58.072507 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 20 15:43:58 crc kubenswrapper[4779]: I0320 15:43:58.073009 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-vlnhf"] Mar 20 15:43:58 crc kubenswrapper[4779]: I0320 15:43:58.097675 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-87f8-account-create-update-lbzbd" event={"ID":"91031879-f65a-4ff0-a28a-94856f3db0aa","Type":"ContainerDied","Data":"c8affd1af923d65763f2743d50a26db3e22553fcfe49a52f9b07364ce1e1fe35"} Mar 20 15:43:58 crc kubenswrapper[4779]: I0320 15:43:58.097719 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8affd1af923d65763f2743d50a26db3e22553fcfe49a52f9b07364ce1e1fe35" Mar 20 15:43:58 crc kubenswrapper[4779]: I0320 15:43:58.097801 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-87f8-account-create-update-lbzbd" Mar 20 15:43:58 crc kubenswrapper[4779]: I0320 15:43:58.112420 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"4c00c5fd-27bb-4e67-bbfa-374e073d15df","Type":"ContainerStarted","Data":"0e879bceecbae615907fc0d01e3541d4500b61932945ac7f4279c85dcab111ef"} Mar 20 15:43:58 crc kubenswrapper[4779]: I0320 15:43:58.126908 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" event={"ID":"451fc579-db57-4b36-a775-6d2986de3efc","Type":"ContainerStarted","Data":"7b4abfdbc7440efebe366f165047da802f1778ef342a7861f64638e51f96112d"} Mar 20 15:43:58 crc kubenswrapper[4779]: I0320 15:43:58.157041 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncgxv\" (UniqueName: \"kubernetes.io/projected/807b267f-236d-4816-ac99-b0c6137d3bda-kube-api-access-ncgxv\") pod \"root-account-create-update-vlnhf\" (UID: \"807b267f-236d-4816-ac99-b0c6137d3bda\") " pod="openstack/root-account-create-update-vlnhf" Mar 20 15:43:58 crc kubenswrapper[4779]: I0320 15:43:58.157149 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/807b267f-236d-4816-ac99-b0c6137d3bda-operator-scripts\") pod \"root-account-create-update-vlnhf\" (UID: \"807b267f-236d-4816-ac99-b0c6137d3bda\") " pod="openstack/root-account-create-update-vlnhf" Mar 20 15:43:58 crc kubenswrapper[4779]: W0320 15:43:58.205597 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c9cff9d_9412_4738_84a7_df2abae9a5c6.slice/crio-9028bd3adc5905ea993af5cd49b1a76ed7e1273c199e376718f2c81c98b38899 WatchSource:0}: Error finding container 9028bd3adc5905ea993af5cd49b1a76ed7e1273c199e376718f2c81c98b38899: Status 404 
returned error can't find the container with id 9028bd3adc5905ea993af5cd49b1a76ed7e1273c199e376718f2c81c98b38899 Mar 20 15:43:58 crc kubenswrapper[4779]: I0320 15:43:58.259368 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncgxv\" (UniqueName: \"kubernetes.io/projected/807b267f-236d-4816-ac99-b0c6137d3bda-kube-api-access-ncgxv\") pod \"root-account-create-update-vlnhf\" (UID: \"807b267f-236d-4816-ac99-b0c6137d3bda\") " pod="openstack/root-account-create-update-vlnhf" Mar 20 15:43:58 crc kubenswrapper[4779]: I0320 15:43:58.259563 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/807b267f-236d-4816-ac99-b0c6137d3bda-operator-scripts\") pod \"root-account-create-update-vlnhf\" (UID: \"807b267f-236d-4816-ac99-b0c6137d3bda\") " pod="openstack/root-account-create-update-vlnhf" Mar 20 15:43:58 crc kubenswrapper[4779]: I0320 15:43:58.260634 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/807b267f-236d-4816-ac99-b0c6137d3bda-operator-scripts\") pod \"root-account-create-update-vlnhf\" (UID: \"807b267f-236d-4816-ac99-b0c6137d3bda\") " pod="openstack/root-account-create-update-vlnhf" Mar 20 15:43:58 crc kubenswrapper[4779]: I0320 15:43:58.283532 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncgxv\" (UniqueName: \"kubernetes.io/projected/807b267f-236d-4816-ac99-b0c6137d3bda-kube-api-access-ncgxv\") pod \"root-account-create-update-vlnhf\" (UID: \"807b267f-236d-4816-ac99-b0c6137d3bda\") " pod="openstack/root-account-create-update-vlnhf" Mar 20 15:43:58 crc kubenswrapper[4779]: I0320 15:43:58.546134 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-vlnhf" Mar 20 15:43:59 crc kubenswrapper[4779]: I0320 15:43:59.136712 4779 generic.go:334] "Generic (PLEG): container finished" podID="569982f0-2f9f-4e5c-8cc5-ea78916cda2c" containerID="62e7826c3ec39d2b90569426cf09df175d1630bd7138e093b85f3cb9b9c814ea" exitCode=0 Mar 20 15:43:59 crc kubenswrapper[4779]: I0320 15:43:59.136759 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-1f58-account-create-update-mgl6d" event={"ID":"569982f0-2f9f-4e5c-8cc5-ea78916cda2c","Type":"ContainerDied","Data":"62e7826c3ec39d2b90569426cf09df175d1630bd7138e093b85f3cb9b9c814ea"} Mar 20 15:43:59 crc kubenswrapper[4779]: I0320 15:43:59.137405 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-1f58-account-create-update-mgl6d" event={"ID":"569982f0-2f9f-4e5c-8cc5-ea78916cda2c","Type":"ContainerStarted","Data":"217bcdf84a1929c60efb0f60e8ca74e4d3945f9b08b61cc084b61d8deb30caf1"} Mar 20 15:43:59 crc kubenswrapper[4779]: I0320 15:43:59.139319 4779 generic.go:334] "Generic (PLEG): container finished" podID="1c9cff9d-9412-4738-84a7-df2abae9a5c6" containerID="aad5ee9afcaca2ed8fc734ebf3ad22218d0376cef5b35e3c1191715887982e71" exitCode=0 Mar 20 15:43:59 crc kubenswrapper[4779]: I0320 15:43:59.139391 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-thctt" event={"ID":"1c9cff9d-9412-4738-84a7-df2abae9a5c6","Type":"ContainerDied","Data":"aad5ee9afcaca2ed8fc734ebf3ad22218d0376cef5b35e3c1191715887982e71"} Mar 20 15:43:59 crc kubenswrapper[4779]: I0320 15:43:59.139420 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-thctt" event={"ID":"1c9cff9d-9412-4738-84a7-df2abae9a5c6","Type":"ContainerStarted","Data":"9028bd3adc5905ea993af5cd49b1a76ed7e1273c199e376718f2c81c98b38899"} Mar 20 15:43:59 crc kubenswrapper[4779]: I0320 15:43:59.141541 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-northd-0" event={"ID":"4c00c5fd-27bb-4e67-bbfa-374e073d15df","Type":"ContainerStarted","Data":"fd1f018aff63cfad2c885cd69d61afeffcaf98eea21409adc0efcd7f9a7ba391"} Mar 20 15:43:59 crc kubenswrapper[4779]: I0320 15:43:59.141648 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 20 15:43:59 crc kubenswrapper[4779]: I0320 15:43:59.145509 4779 generic.go:334] "Generic (PLEG): container finished" podID="e06123d4-03bd-44b8-bb15-eb3baccb8c90" containerID="97cffcfd633899bd083a26cf432dcca771a912284fd2b178eaa78d5cd12e41cd" exitCode=0 Mar 20 15:43:59 crc kubenswrapper[4779]: I0320 15:43:59.145615 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6fc0-account-create-update-5x6h6" event={"ID":"e06123d4-03bd-44b8-bb15-eb3baccb8c90","Type":"ContainerDied","Data":"97cffcfd633899bd083a26cf432dcca771a912284fd2b178eaa78d5cd12e41cd"} Mar 20 15:43:59 crc kubenswrapper[4779]: I0320 15:43:59.145668 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6fc0-account-create-update-5x6h6" event={"ID":"e06123d4-03bd-44b8-bb15-eb3baccb8c90","Type":"ContainerStarted","Data":"ba0ccde3459ad45d250ce6797a43274d028a254079d0529cd37b66dddc6513d4"} Mar 20 15:43:59 crc kubenswrapper[4779]: I0320 15:43:59.147878 4779 generic.go:334] "Generic (PLEG): container finished" podID="9e05bbbd-ea13-4bd8-bf92-930e44dd8e91" containerID="8b6ee097cd8d183960d8372e42c44c5d0bf52f5f4df507889a27a1fdbcc44c8a" exitCode=0 Mar 20 15:43:59 crc kubenswrapper[4779]: I0320 15:43:59.147933 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-b2xx8" event={"ID":"9e05bbbd-ea13-4bd8-bf92-930e44dd8e91","Type":"ContainerDied","Data":"8b6ee097cd8d183960d8372e42c44c5d0bf52f5f4df507889a27a1fdbcc44c8a"} Mar 20 15:43:59 crc kubenswrapper[4779]: I0320 15:43:59.147970 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-b2xx8" 
event={"ID":"9e05bbbd-ea13-4bd8-bf92-930e44dd8e91","Type":"ContainerStarted","Data":"718264e9424e45953ed28013090bbe31c0efef8124d2e54a46b373360c223177"} Mar 20 15:43:59 crc kubenswrapper[4779]: I0320 15:43:59.246790 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-vlnhf"] Mar 20 15:43:59 crc kubenswrapper[4779]: I0320 15:43:59.257834 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=4.29240558 podStartE2EDuration="9.257811763s" podCreationTimestamp="2026-03-20 15:43:50 +0000 UTC" firstStartedPulling="2026-03-20 15:43:52.435924451 +0000 UTC m=+1249.398440251" lastFinishedPulling="2026-03-20 15:43:57.401330634 +0000 UTC m=+1254.363846434" observedRunningTime="2026-03-20 15:43:59.246751738 +0000 UTC m=+1256.209267538" watchObservedRunningTime="2026-03-20 15:43:59.257811763 +0000 UTC m=+1256.220327563" Mar 20 15:43:59 crc kubenswrapper[4779]: W0320 15:43:59.273349 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod807b267f_236d_4816_ac99_b0c6137d3bda.slice/crio-945c9ea3f183b759ac28731130db60ade0a0242c74bff5d88cd09b66392869bb WatchSource:0}: Error finding container 945c9ea3f183b759ac28731130db60ade0a0242c74bff5d88cd09b66392869bb: Status 404 returned error can't find the container with id 945c9ea3f183b759ac28731130db60ade0a0242c74bff5d88cd09b66392869bb Mar 20 15:43:59 crc kubenswrapper[4779]: I0320 15:43:59.657267 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-hmz4x" Mar 20 15:43:59 crc kubenswrapper[4779]: I0320 15:43:59.706206 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dp74h"] Mar 20 15:43:59 crc kubenswrapper[4779]: I0320 15:43:59.706454 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-dp74h" 
podUID="b6e253fe-fec4-49b3-a9a6-22a7da20f3bd" containerName="dnsmasq-dns" containerID="cri-o://2e34e237c316ed1ce8b188a02c2057939f604f87626d200c86bed0b0f8f6cbb2" gracePeriod=10 Mar 20 15:44:00 crc kubenswrapper[4779]: I0320 15:44:00.163474 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567024-tqhpt"] Mar 20 15:44:00 crc kubenswrapper[4779]: I0320 15:44:00.165772 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567024-tqhpt" Mar 20 15:44:00 crc kubenswrapper[4779]: I0320 15:44:00.168376 4779 generic.go:334] "Generic (PLEG): container finished" podID="8a65e85e-076d-42e4-88fb-5bb905893173" containerID="3abd11f8ae4085c96b02d576062a414569140a9e57036afebd45eac6ea28a9cb" exitCode=0 Mar 20 15:44:00 crc kubenswrapper[4779]: I0320 15:44:00.168455 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-s9ls7" event={"ID":"8a65e85e-076d-42e4-88fb-5bb905893173","Type":"ContainerDied","Data":"3abd11f8ae4085c96b02d576062a414569140a9e57036afebd45eac6ea28a9cb"} Mar 20 15:44:00 crc kubenswrapper[4779]: I0320 15:44:00.169255 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:44:00 crc kubenswrapper[4779]: I0320 15:44:00.169324 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-92t9d" Mar 20 15:44:00 crc kubenswrapper[4779]: I0320 15:44:00.169352 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:44:00 crc kubenswrapper[4779]: I0320 15:44:00.172659 4779 generic.go:334] "Generic (PLEG): container finished" podID="b6e253fe-fec4-49b3-a9a6-22a7da20f3bd" containerID="2e34e237c316ed1ce8b188a02c2057939f604f87626d200c86bed0b0f8f6cbb2" exitCode=0 Mar 20 15:44:00 crc kubenswrapper[4779]: I0320 15:44:00.172761 4779 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-dp74h" event={"ID":"b6e253fe-fec4-49b3-a9a6-22a7da20f3bd","Type":"ContainerDied","Data":"2e34e237c316ed1ce8b188a02c2057939f604f87626d200c86bed0b0f8f6cbb2"} Mar 20 15:44:00 crc kubenswrapper[4779]: I0320 15:44:00.174656 4779 generic.go:334] "Generic (PLEG): container finished" podID="807b267f-236d-4816-ac99-b0c6137d3bda" containerID="bfb335566f42e0303c8f99f7c83d08913087b0fdfc66b0529a5c94bd4b1f70c6" exitCode=0 Mar 20 15:44:00 crc kubenswrapper[4779]: I0320 15:44:00.175495 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vlnhf" event={"ID":"807b267f-236d-4816-ac99-b0c6137d3bda","Type":"ContainerDied","Data":"bfb335566f42e0303c8f99f7c83d08913087b0fdfc66b0529a5c94bd4b1f70c6"} Mar 20 15:44:00 crc kubenswrapper[4779]: I0320 15:44:00.175527 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vlnhf" event={"ID":"807b267f-236d-4816-ac99-b0c6137d3bda","Type":"ContainerStarted","Data":"945c9ea3f183b759ac28731130db60ade0a0242c74bff5d88cd09b66392869bb"} Mar 20 15:44:00 crc kubenswrapper[4779]: I0320 15:44:00.191482 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567024-tqhpt"] Mar 20 15:44:00 crc kubenswrapper[4779]: I0320 15:44:00.294552 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb28g\" (UniqueName: \"kubernetes.io/projected/c2558371-6693-4c25-bb36-d1688f299d44-kube-api-access-tb28g\") pod \"auto-csr-approver-29567024-tqhpt\" (UID: \"c2558371-6693-4c25-bb36-d1688f299d44\") " pod="openshift-infra/auto-csr-approver-29567024-tqhpt" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:00.396558 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb28g\" (UniqueName: \"kubernetes.io/projected/c2558371-6693-4c25-bb36-d1688f299d44-kube-api-access-tb28g\") pod 
\"auto-csr-approver-29567024-tqhpt\" (UID: \"c2558371-6693-4c25-bb36-d1688f299d44\") " pod="openshift-infra/auto-csr-approver-29567024-tqhpt" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:00.415377 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-dp74h" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:00.419534 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb28g\" (UniqueName: \"kubernetes.io/projected/c2558371-6693-4c25-bb36-d1688f299d44-kube-api-access-tb28g\") pod \"auto-csr-approver-29567024-tqhpt\" (UID: \"c2558371-6693-4c25-bb36-d1688f299d44\") " pod="openshift-infra/auto-csr-approver-29567024-tqhpt" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:00.486805 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567024-tqhpt" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:00.500871 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6e253fe-fec4-49b3-a9a6-22a7da20f3bd-dns-svc\") pod \"b6e253fe-fec4-49b3-a9a6-22a7da20f3bd\" (UID: \"b6e253fe-fec4-49b3-a9a6-22a7da20f3bd\") " Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:00.500943 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bczzl\" (UniqueName: \"kubernetes.io/projected/b6e253fe-fec4-49b3-a9a6-22a7da20f3bd-kube-api-access-bczzl\") pod \"b6e253fe-fec4-49b3-a9a6-22a7da20f3bd\" (UID: \"b6e253fe-fec4-49b3-a9a6-22a7da20f3bd\") " Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:00.500973 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6e253fe-fec4-49b3-a9a6-22a7da20f3bd-config\") pod \"b6e253fe-fec4-49b3-a9a6-22a7da20f3bd\" (UID: \"b6e253fe-fec4-49b3-a9a6-22a7da20f3bd\") " Mar 20 15:44:01 
crc kubenswrapper[4779]: I0320 15:44:00.501279 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b-etc-swift\") pod \"swift-storage-0\" (UID: \"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b\") " pod="openstack/swift-storage-0" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:00.513854 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6e253fe-fec4-49b3-a9a6-22a7da20f3bd-kube-api-access-bczzl" (OuterVolumeSpecName: "kube-api-access-bczzl") pod "b6e253fe-fec4-49b3-a9a6-22a7da20f3bd" (UID: "b6e253fe-fec4-49b3-a9a6-22a7da20f3bd"). InnerVolumeSpecName "kube-api-access-bczzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:00.514824 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b-etc-swift\") pod \"swift-storage-0\" (UID: \"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b\") " pod="openstack/swift-storage-0" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:00.565678 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6e253fe-fec4-49b3-a9a6-22a7da20f3bd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b6e253fe-fec4-49b3-a9a6-22a7da20f3bd" (UID: "b6e253fe-fec4-49b3-a9a6-22a7da20f3bd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:00.582209 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:00.589710 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6e253fe-fec4-49b3-a9a6-22a7da20f3bd-config" (OuterVolumeSpecName: "config") pod "b6e253fe-fec4-49b3-a9a6-22a7da20f3bd" (UID: "b6e253fe-fec4-49b3-a9a6-22a7da20f3bd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:00.603166 4779 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6e253fe-fec4-49b3-a9a6-22a7da20f3bd-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:00.603235 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bczzl\" (UniqueName: \"kubernetes.io/projected/b6e253fe-fec4-49b3-a9a6-22a7da20f3bd-kube-api-access-bczzl\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:00.603251 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6e253fe-fec4-49b3-a9a6-22a7da20f3bd-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:00.641680 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-thctt" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:00.704245 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c9cff9d-9412-4738-84a7-df2abae9a5c6-operator-scripts\") pod \"1c9cff9d-9412-4738-84a7-df2abae9a5c6\" (UID: \"1c9cff9d-9412-4738-84a7-df2abae9a5c6\") " Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:00.704377 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96bf2\" (UniqueName: \"kubernetes.io/projected/1c9cff9d-9412-4738-84a7-df2abae9a5c6-kube-api-access-96bf2\") pod \"1c9cff9d-9412-4738-84a7-df2abae9a5c6\" (UID: \"1c9cff9d-9412-4738-84a7-df2abae9a5c6\") " Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:00.708304 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c9cff9d-9412-4738-84a7-df2abae9a5c6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1c9cff9d-9412-4738-84a7-df2abae9a5c6" (UID: "1c9cff9d-9412-4738-84a7-df2abae9a5c6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:00.710164 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c9cff9d-9412-4738-84a7-df2abae9a5c6-kube-api-access-96bf2" (OuterVolumeSpecName: "kube-api-access-96bf2") pod "1c9cff9d-9412-4738-84a7-df2abae9a5c6" (UID: "1c9cff9d-9412-4738-84a7-df2abae9a5c6"). InnerVolumeSpecName "kube-api-access-96bf2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:00.807319 4779 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c9cff9d-9412-4738-84a7-df2abae9a5c6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:00.807367 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96bf2\" (UniqueName: \"kubernetes.io/projected/1c9cff9d-9412-4738-84a7-df2abae9a5c6-kube-api-access-96bf2\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.187289 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-thctt" event={"ID":"1c9cff9d-9412-4738-84a7-df2abae9a5c6","Type":"ContainerDied","Data":"9028bd3adc5905ea993af5cd49b1a76ed7e1273c199e376718f2c81c98b38899"} Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.187573 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9028bd3adc5905ea993af5cd49b1a76ed7e1273c199e376718f2c81c98b38899" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.187521 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-thctt" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.190852 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-dp74h" event={"ID":"b6e253fe-fec4-49b3-a9a6-22a7da20f3bd","Type":"ContainerDied","Data":"625734d28371900a5166885e249b9dde8ad9d23229e39c193792ad6bb4f1b3d3"} Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.190904 4779 scope.go:117] "RemoveContainer" containerID="2e34e237c316ed1ce8b188a02c2057939f604f87626d200c86bed0b0f8f6cbb2" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.191021 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-dp74h" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.206450 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e93aeb14-abaa-4cff-81c0-82d579020dc6","Type":"ContainerStarted","Data":"882d26ce6c8eda7e63bf062c62f771e9ccd1f4cd79ab91d0649e025e119198b5"} Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.242702 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dp74h"] Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.250012 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dp74h"] Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.288684 4779 scope.go:117] "RemoveContainer" containerID="fed20c0f0597299f0423fd2aae2f41d96a06471814a8ecdcdf2df637e5d427f5" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.464216 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-1f58-account-create-update-mgl6d" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.470483 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-6fc0-account-create-update-5x6h6" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.486823 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-b2xx8" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.527415 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56wmv\" (UniqueName: \"kubernetes.io/projected/569982f0-2f9f-4e5c-8cc5-ea78916cda2c-kube-api-access-56wmv\") pod \"569982f0-2f9f-4e5c-8cc5-ea78916cda2c\" (UID: \"569982f0-2f9f-4e5c-8cc5-ea78916cda2c\") " Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.527651 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/569982f0-2f9f-4e5c-8cc5-ea78916cda2c-operator-scripts\") pod \"569982f0-2f9f-4e5c-8cc5-ea78916cda2c\" (UID: \"569982f0-2f9f-4e5c-8cc5-ea78916cda2c\") " Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.528700 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/569982f0-2f9f-4e5c-8cc5-ea78916cda2c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "569982f0-2f9f-4e5c-8cc5-ea78916cda2c" (UID: "569982f0-2f9f-4e5c-8cc5-ea78916cda2c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.551316 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/569982f0-2f9f-4e5c-8cc5-ea78916cda2c-kube-api-access-56wmv" (OuterVolumeSpecName: "kube-api-access-56wmv") pod "569982f0-2f9f-4e5c-8cc5-ea78916cda2c" (UID: "569982f0-2f9f-4e5c-8cc5-ea78916cda2c"). InnerVolumeSpecName "kube-api-access-56wmv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.629163 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e06123d4-03bd-44b8-bb15-eb3baccb8c90-operator-scripts\") pod \"e06123d4-03bd-44b8-bb15-eb3baccb8c90\" (UID: \"e06123d4-03bd-44b8-bb15-eb3baccb8c90\") " Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.629314 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skdpf\" (UniqueName: \"kubernetes.io/projected/9e05bbbd-ea13-4bd8-bf92-930e44dd8e91-kube-api-access-skdpf\") pod \"9e05bbbd-ea13-4bd8-bf92-930e44dd8e91\" (UID: \"9e05bbbd-ea13-4bd8-bf92-930e44dd8e91\") " Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.629407 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e05bbbd-ea13-4bd8-bf92-930e44dd8e91-operator-scripts\") pod \"9e05bbbd-ea13-4bd8-bf92-930e44dd8e91\" (UID: \"9e05bbbd-ea13-4bd8-bf92-930e44dd8e91\") " Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.629515 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44lzs\" (UniqueName: \"kubernetes.io/projected/e06123d4-03bd-44b8-bb15-eb3baccb8c90-kube-api-access-44lzs\") pod \"e06123d4-03bd-44b8-bb15-eb3baccb8c90\" (UID: \"e06123d4-03bd-44b8-bb15-eb3baccb8c90\") " Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.629945 4779 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/569982f0-2f9f-4e5c-8cc5-ea78916cda2c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.629968 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56wmv\" (UniqueName: 
\"kubernetes.io/projected/569982f0-2f9f-4e5c-8cc5-ea78916cda2c-kube-api-access-56wmv\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.630201 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e05bbbd-ea13-4bd8-bf92-930e44dd8e91-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9e05bbbd-ea13-4bd8-bf92-930e44dd8e91" (UID: "9e05bbbd-ea13-4bd8-bf92-930e44dd8e91"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.630280 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e06123d4-03bd-44b8-bb15-eb3baccb8c90-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e06123d4-03bd-44b8-bb15-eb3baccb8c90" (UID: "e06123d4-03bd-44b8-bb15-eb3baccb8c90"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.634450 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e06123d4-03bd-44b8-bb15-eb3baccb8c90-kube-api-access-44lzs" (OuterVolumeSpecName: "kube-api-access-44lzs") pod "e06123d4-03bd-44b8-bb15-eb3baccb8c90" (UID: "e06123d4-03bd-44b8-bb15-eb3baccb8c90"). InnerVolumeSpecName "kube-api-access-44lzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.634786 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e05bbbd-ea13-4bd8-bf92-930e44dd8e91-kube-api-access-skdpf" (OuterVolumeSpecName: "kube-api-access-skdpf") pod "9e05bbbd-ea13-4bd8-bf92-930e44dd8e91" (UID: "9e05bbbd-ea13-4bd8-bf92-930e44dd8e91"). InnerVolumeSpecName "kube-api-access-skdpf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.651388 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vlnhf" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.692576 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-s9ls7" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.731158 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncgxv\" (UniqueName: \"kubernetes.io/projected/807b267f-236d-4816-ac99-b0c6137d3bda-kube-api-access-ncgxv\") pod \"807b267f-236d-4816-ac99-b0c6137d3bda\" (UID: \"807b267f-236d-4816-ac99-b0c6137d3bda\") " Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.731237 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/807b267f-236d-4816-ac99-b0c6137d3bda-operator-scripts\") pod \"807b267f-236d-4816-ac99-b0c6137d3bda\" (UID: \"807b267f-236d-4816-ac99-b0c6137d3bda\") " Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.731706 4779 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e06123d4-03bd-44b8-bb15-eb3baccb8c90-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.731717 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skdpf\" (UniqueName: \"kubernetes.io/projected/9e05bbbd-ea13-4bd8-bf92-930e44dd8e91-kube-api-access-skdpf\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.731728 4779 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e05bbbd-ea13-4bd8-bf92-930e44dd8e91-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 
15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.731736 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44lzs\" (UniqueName: \"kubernetes.io/projected/e06123d4-03bd-44b8-bb15-eb3baccb8c90-kube-api-access-44lzs\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.732389 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/807b267f-236d-4816-ac99-b0c6137d3bda-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "807b267f-236d-4816-ac99-b0c6137d3bda" (UID: "807b267f-236d-4816-ac99-b0c6137d3bda"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.735458 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/807b267f-236d-4816-ac99-b0c6137d3bda-kube-api-access-ncgxv" (OuterVolumeSpecName: "kube-api-access-ncgxv") pod "807b267f-236d-4816-ac99-b0c6137d3bda" (UID: "807b267f-236d-4816-ac99-b0c6137d3bda"). InnerVolumeSpecName "kube-api-access-ncgxv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.819433 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6e253fe-fec4-49b3-a9a6-22a7da20f3bd" path="/var/lib/kubelet/pods/b6e253fe-fec4-49b3-a9a6-22a7da20f3bd/volumes" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.833133 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnpgs\" (UniqueName: \"kubernetes.io/projected/8a65e85e-076d-42e4-88fb-5bb905893173-kube-api-access-tnpgs\") pod \"8a65e85e-076d-42e4-88fb-5bb905893173\" (UID: \"8a65e85e-076d-42e4-88fb-5bb905893173\") " Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.833254 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a65e85e-076d-42e4-88fb-5bb905893173-combined-ca-bundle\") pod \"8a65e85e-076d-42e4-88fb-5bb905893173\" (UID: \"8a65e85e-076d-42e4-88fb-5bb905893173\") " Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.833296 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8a65e85e-076d-42e4-88fb-5bb905893173-swiftconf\") pod \"8a65e85e-076d-42e4-88fb-5bb905893173\" (UID: \"8a65e85e-076d-42e4-88fb-5bb905893173\") " Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.833370 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a65e85e-076d-42e4-88fb-5bb905893173-scripts\") pod \"8a65e85e-076d-42e4-88fb-5bb905893173\" (UID: \"8a65e85e-076d-42e4-88fb-5bb905893173\") " Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.833475 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8a65e85e-076d-42e4-88fb-5bb905893173-dispersionconf\") pod 
\"8a65e85e-076d-42e4-88fb-5bb905893173\" (UID: \"8a65e85e-076d-42e4-88fb-5bb905893173\") " Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.833514 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8a65e85e-076d-42e4-88fb-5bb905893173-ring-data-devices\") pod \"8a65e85e-076d-42e4-88fb-5bb905893173\" (UID: \"8a65e85e-076d-42e4-88fb-5bb905893173\") " Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.833535 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8a65e85e-076d-42e4-88fb-5bb905893173-etc-swift\") pod \"8a65e85e-076d-42e4-88fb-5bb905893173\" (UID: \"8a65e85e-076d-42e4-88fb-5bb905893173\") " Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.833966 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncgxv\" (UniqueName: \"kubernetes.io/projected/807b267f-236d-4816-ac99-b0c6137d3bda-kube-api-access-ncgxv\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.833986 4779 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/807b267f-236d-4816-ac99-b0c6137d3bda-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.834556 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a65e85e-076d-42e4-88fb-5bb905893173-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "8a65e85e-076d-42e4-88fb-5bb905893173" (UID: "8a65e85e-076d-42e4-88fb-5bb905893173"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.836250 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a65e85e-076d-42e4-88fb-5bb905893173-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8a65e85e-076d-42e4-88fb-5bb905893173" (UID: "8a65e85e-076d-42e4-88fb-5bb905893173"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.838155 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a65e85e-076d-42e4-88fb-5bb905893173-kube-api-access-tnpgs" (OuterVolumeSpecName: "kube-api-access-tnpgs") pod "8a65e85e-076d-42e4-88fb-5bb905893173" (UID: "8a65e85e-076d-42e4-88fb-5bb905893173"). InnerVolumeSpecName "kube-api-access-tnpgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.856417 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a65e85e-076d-42e4-88fb-5bb905893173-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "8a65e85e-076d-42e4-88fb-5bb905893173" (UID: "8a65e85e-076d-42e4-88fb-5bb905893173"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.860686 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a65e85e-076d-42e4-88fb-5bb905893173-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a65e85e-076d-42e4-88fb-5bb905893173" (UID: "8a65e85e-076d-42e4-88fb-5bb905893173"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.868721 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a65e85e-076d-42e4-88fb-5bb905893173-scripts" (OuterVolumeSpecName: "scripts") pod "8a65e85e-076d-42e4-88fb-5bb905893173" (UID: "8a65e85e-076d-42e4-88fb-5bb905893173"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.901148 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a65e85e-076d-42e4-88fb-5bb905893173-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "8a65e85e-076d-42e4-88fb-5bb905893173" (UID: "8a65e85e-076d-42e4-88fb-5bb905893173"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.935521 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a65e85e-076d-42e4-88fb-5bb905893173-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.935568 4779 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8a65e85e-076d-42e4-88fb-5bb905893173-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.935583 4779 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a65e85e-076d-42e4-88fb-5bb905893173-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.935597 4779 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8a65e85e-076d-42e4-88fb-5bb905893173-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 
15:44:01.935610 4779 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8a65e85e-076d-42e4-88fb-5bb905893173-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.935624 4779 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8a65e85e-076d-42e4-88fb-5bb905893173-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.935637 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnpgs\" (UniqueName: \"kubernetes.io/projected/8a65e85e-076d-42e4-88fb-5bb905893173-kube-api-access-tnpgs\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.940720 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567024-tqhpt"] Mar 20 15:44:01 crc kubenswrapper[4779]: I0320 15:44:01.966329 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.044243 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-2mg9r"] Mar 20 15:44:02 crc kubenswrapper[4779]: E0320 15:44:02.044643 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="807b267f-236d-4816-ac99-b0c6137d3bda" containerName="mariadb-account-create-update" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.044666 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="807b267f-236d-4816-ac99-b0c6137d3bda" containerName="mariadb-account-create-update" Mar 20 15:44:02 crc kubenswrapper[4779]: E0320 15:44:02.044690 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a65e85e-076d-42e4-88fb-5bb905893173" containerName="swift-ring-rebalance" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.044716 4779 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8a65e85e-076d-42e4-88fb-5bb905893173" containerName="swift-ring-rebalance" Mar 20 15:44:02 crc kubenswrapper[4779]: E0320 15:44:02.044739 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="569982f0-2f9f-4e5c-8cc5-ea78916cda2c" containerName="mariadb-account-create-update" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.044748 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="569982f0-2f9f-4e5c-8cc5-ea78916cda2c" containerName="mariadb-account-create-update" Mar 20 15:44:02 crc kubenswrapper[4779]: E0320 15:44:02.044759 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6e253fe-fec4-49b3-a9a6-22a7da20f3bd" containerName="init" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.044766 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6e253fe-fec4-49b3-a9a6-22a7da20f3bd" containerName="init" Mar 20 15:44:02 crc kubenswrapper[4779]: E0320 15:44:02.044783 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e05bbbd-ea13-4bd8-bf92-930e44dd8e91" containerName="mariadb-database-create" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.044791 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e05bbbd-ea13-4bd8-bf92-930e44dd8e91" containerName="mariadb-database-create" Mar 20 15:44:02 crc kubenswrapper[4779]: E0320 15:44:02.044801 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c9cff9d-9412-4738-84a7-df2abae9a5c6" containerName="mariadb-database-create" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.044808 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c9cff9d-9412-4738-84a7-df2abae9a5c6" containerName="mariadb-database-create" Mar 20 15:44:02 crc kubenswrapper[4779]: E0320 15:44:02.044817 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e06123d4-03bd-44b8-bb15-eb3baccb8c90" containerName="mariadb-account-create-update" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.044824 4779 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e06123d4-03bd-44b8-bb15-eb3baccb8c90" containerName="mariadb-account-create-update" Mar 20 15:44:02 crc kubenswrapper[4779]: E0320 15:44:02.044836 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6e253fe-fec4-49b3-a9a6-22a7da20f3bd" containerName="dnsmasq-dns" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.044949 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6e253fe-fec4-49b3-a9a6-22a7da20f3bd" containerName="dnsmasq-dns" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.045178 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a65e85e-076d-42e4-88fb-5bb905893173" containerName="swift-ring-rebalance" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.045200 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e05bbbd-ea13-4bd8-bf92-930e44dd8e91" containerName="mariadb-database-create" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.045213 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c9cff9d-9412-4738-84a7-df2abae9a5c6" containerName="mariadb-database-create" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.045227 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="807b267f-236d-4816-ac99-b0c6137d3bda" containerName="mariadb-account-create-update" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.045236 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="569982f0-2f9f-4e5c-8cc5-ea78916cda2c" containerName="mariadb-account-create-update" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.045249 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6e253fe-fec4-49b3-a9a6-22a7da20f3bd" containerName="dnsmasq-dns" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.045257 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="e06123d4-03bd-44b8-bb15-eb3baccb8c90" 
containerName="mariadb-account-create-update" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.045873 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2mg9r" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.058658 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-2mg9r"] Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.139658 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32fe88a3-1d3d-4863-9abd-160e0537aff3-operator-scripts\") pod \"keystone-db-create-2mg9r\" (UID: \"32fe88a3-1d3d-4863-9abd-160e0537aff3\") " pod="openstack/keystone-db-create-2mg9r" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.140001 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4qnl\" (UniqueName: \"kubernetes.io/projected/32fe88a3-1d3d-4863-9abd-160e0537aff3-kube-api-access-m4qnl\") pod \"keystone-db-create-2mg9r\" (UID: \"32fe88a3-1d3d-4863-9abd-160e0537aff3\") " pod="openstack/keystone-db-create-2mg9r" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.156508 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-df14-account-create-update-nfhsc"] Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.157719 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-df14-account-create-update-nfhsc" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.160907 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.165572 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-df14-account-create-update-nfhsc"] Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.217248 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6fc0-account-create-update-5x6h6" event={"ID":"e06123d4-03bd-44b8-bb15-eb3baccb8c90","Type":"ContainerDied","Data":"ba0ccde3459ad45d250ce6797a43274d028a254079d0529cd37b66dddc6513d4"} Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.217294 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba0ccde3459ad45d250ce6797a43274d028a254079d0529cd37b66dddc6513d4" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.217262 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-6fc0-account-create-update-5x6h6" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.218777 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-s9ls7" event={"ID":"8a65e85e-076d-42e4-88fb-5bb905893173","Type":"ContainerDied","Data":"a280d64c532934498633023b26219fe3e33d969b26c26a39f1b148c5cac6f209"} Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.218798 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-s9ls7" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.218826 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a280d64c532934498633023b26219fe3e33d969b26c26a39f1b148c5cac6f209" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.224308 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vlnhf" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.224316 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vlnhf" event={"ID":"807b267f-236d-4816-ac99-b0c6137d3bda","Type":"ContainerDied","Data":"945c9ea3f183b759ac28731130db60ade0a0242c74bff5d88cd09b66392869bb"} Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.224386 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="945c9ea3f183b759ac28731130db60ade0a0242c74bff5d88cd09b66392869bb" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.229137 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-b2xx8" event={"ID":"9e05bbbd-ea13-4bd8-bf92-930e44dd8e91","Type":"ContainerDied","Data":"718264e9424e45953ed28013090bbe31c0efef8124d2e54a46b373360c223177"} Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.229177 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="718264e9424e45953ed28013090bbe31c0efef8124d2e54a46b373360c223177" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.229220 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-b2xx8" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.230269 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b","Type":"ContainerStarted","Data":"2ddbe3276b189236c7a151d6e516b6114079e6f93ea98ab52b542bee8eef6a6a"} Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.234087 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-1f58-account-create-update-mgl6d" event={"ID":"569982f0-2f9f-4e5c-8cc5-ea78916cda2c","Type":"ContainerDied","Data":"217bcdf84a1929c60efb0f60e8ca74e4d3945f9b08b61cc084b61d8deb30caf1"} Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.234138 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="217bcdf84a1929c60efb0f60e8ca74e4d3945f9b08b61cc084b61d8deb30caf1" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.234150 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-1f58-account-create-update-mgl6d" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.245164 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32fe88a3-1d3d-4863-9abd-160e0537aff3-operator-scripts\") pod \"keystone-db-create-2mg9r\" (UID: \"32fe88a3-1d3d-4863-9abd-160e0537aff3\") " pod="openstack/keystone-db-create-2mg9r" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.245275 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4qnl\" (UniqueName: \"kubernetes.io/projected/32fe88a3-1d3d-4863-9abd-160e0537aff3-kube-api-access-m4qnl\") pod \"keystone-db-create-2mg9r\" (UID: \"32fe88a3-1d3d-4863-9abd-160e0537aff3\") " pod="openstack/keystone-db-create-2mg9r" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.245332 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da10206b-ede8-4373-af66-b442633908ab-operator-scripts\") pod \"keystone-df14-account-create-update-nfhsc\" (UID: \"da10206b-ede8-4373-af66-b442633908ab\") " pod="openstack/keystone-df14-account-create-update-nfhsc" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.245399 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwlwc\" (UniqueName: \"kubernetes.io/projected/da10206b-ede8-4373-af66-b442633908ab-kube-api-access-cwlwc\") pod \"keystone-df14-account-create-update-nfhsc\" (UID: \"da10206b-ede8-4373-af66-b442633908ab\") " pod="openstack/keystone-df14-account-create-update-nfhsc" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.246548 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/32fe88a3-1d3d-4863-9abd-160e0537aff3-operator-scripts\") pod \"keystone-db-create-2mg9r\" (UID: \"32fe88a3-1d3d-4863-9abd-160e0537aff3\") " pod="openstack/keystone-db-create-2mg9r" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.249038 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567024-tqhpt" event={"ID":"c2558371-6693-4c25-bb36-d1688f299d44","Type":"ContainerStarted","Data":"a996f0fdf6d35b1e1919b2b3b7005eb12ace9501af190c83ebfe74d1194b9a5b"} Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.283141 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4qnl\" (UniqueName: \"kubernetes.io/projected/32fe88a3-1d3d-4863-9abd-160e0537aff3-kube-api-access-m4qnl\") pod \"keystone-db-create-2mg9r\" (UID: \"32fe88a3-1d3d-4863-9abd-160e0537aff3\") " pod="openstack/keystone-db-create-2mg9r" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.347042 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwlwc\" (UniqueName: \"kubernetes.io/projected/da10206b-ede8-4373-af66-b442633908ab-kube-api-access-cwlwc\") pod \"keystone-df14-account-create-update-nfhsc\" (UID: \"da10206b-ede8-4373-af66-b442633908ab\") " pod="openstack/keystone-df14-account-create-update-nfhsc" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.347226 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da10206b-ede8-4373-af66-b442633908ab-operator-scripts\") pod \"keystone-df14-account-create-update-nfhsc\" (UID: \"da10206b-ede8-4373-af66-b442633908ab\") " pod="openstack/keystone-df14-account-create-update-nfhsc" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.347979 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/da10206b-ede8-4373-af66-b442633908ab-operator-scripts\") pod \"keystone-df14-account-create-update-nfhsc\" (UID: \"da10206b-ede8-4373-af66-b442633908ab\") " pod="openstack/keystone-df14-account-create-update-nfhsc" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.362624 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwlwc\" (UniqueName: \"kubernetes.io/projected/da10206b-ede8-4373-af66-b442633908ab-kube-api-access-cwlwc\") pod \"keystone-df14-account-create-update-nfhsc\" (UID: \"da10206b-ede8-4373-af66-b442633908ab\") " pod="openstack/keystone-df14-account-create-update-nfhsc" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.374577 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2mg9r" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.474374 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-df14-account-create-update-nfhsc" Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.848069 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-2mg9r"] Mar 20 15:44:02 crc kubenswrapper[4779]: I0320 15:44:02.936816 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-df14-account-create-update-nfhsc"] Mar 20 15:44:02 crc kubenswrapper[4779]: W0320 15:44:02.950099 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda10206b_ede8_4373_af66_b442633908ab.slice/crio-9b918adcff749f22efa489da66199fb19ad18a538928f4921423d8f72f9e5770 WatchSource:0}: Error finding container 9b918adcff749f22efa489da66199fb19ad18a538928f4921423d8f72f9e5770: Status 404 returned error can't find the container with id 9b918adcff749f22efa489da66199fb19ad18a538928f4921423d8f72f9e5770 Mar 20 15:44:03 crc kubenswrapper[4779]: I0320 15:44:03.262151 4779 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2mg9r" event={"ID":"32fe88a3-1d3d-4863-9abd-160e0537aff3","Type":"ContainerStarted","Data":"e10fdb1ae97f49627d87219cc2d4d55509955185030eebc461451bfba3b1de77"} Mar 20 15:44:03 crc kubenswrapper[4779]: I0320 15:44:03.262207 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2mg9r" event={"ID":"32fe88a3-1d3d-4863-9abd-160e0537aff3","Type":"ContainerStarted","Data":"589d89bc39a65e3ab76b7baa4b5bf6edd21ac99ea1c431e325f62b0465a83253"} Mar 20 15:44:03 crc kubenswrapper[4779]: I0320 15:44:03.264576 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-df14-account-create-update-nfhsc" event={"ID":"da10206b-ede8-4373-af66-b442633908ab","Type":"ContainerStarted","Data":"9b918adcff749f22efa489da66199fb19ad18a538928f4921423d8f72f9e5770"} Mar 20 15:44:03 crc kubenswrapper[4779]: I0320 15:44:03.266048 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567024-tqhpt" event={"ID":"c2558371-6693-4c25-bb36-d1688f299d44","Type":"ContainerStarted","Data":"bcef4752595fdef6bceac9367420fcfcfa7be7ceb5bae95350caf122f82a4121"} Mar 20 15:44:03 crc kubenswrapper[4779]: I0320 15:44:03.269016 4779 generic.go:334] "Generic (PLEG): container finished" podID="f74af0a5-e3c3-4569-bece-db2e25e9b79d" containerID="3c579d0a88bc2e1597d50784bc3a129d5a5fa3b225de05abb1902d00d8e6ecfc" exitCode=0 Mar 20 15:44:03 crc kubenswrapper[4779]: I0320 15:44:03.269048 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f74af0a5-e3c3-4569-bece-db2e25e9b79d","Type":"ContainerDied","Data":"3c579d0a88bc2e1597d50784bc3a129d5a5fa3b225de05abb1902d00d8e6ecfc"} Mar 20 15:44:03 crc kubenswrapper[4779]: I0320 15:44:03.275753 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-2mg9r" podStartSLOduration=1.275735348 
podStartE2EDuration="1.275735348s" podCreationTimestamp="2026-03-20 15:44:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:44:03.274021126 +0000 UTC m=+1260.236536926" watchObservedRunningTime="2026-03-20 15:44:03.275735348 +0000 UTC m=+1260.238251148" Mar 20 15:44:03 crc kubenswrapper[4779]: I0320 15:44:03.289806 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567024-tqhpt" podStartSLOduration=2.364348916 podStartE2EDuration="3.289787517s" podCreationTimestamp="2026-03-20 15:44:00 +0000 UTC" firstStartedPulling="2026-03-20 15:44:01.873365079 +0000 UTC m=+1258.835880879" lastFinishedPulling="2026-03-20 15:44:02.79880368 +0000 UTC m=+1259.761319480" observedRunningTime="2026-03-20 15:44:03.284638929 +0000 UTC m=+1260.247154729" watchObservedRunningTime="2026-03-20 15:44:03.289787517 +0000 UTC m=+1260.252303317" Mar 20 15:44:04 crc kubenswrapper[4779]: I0320 15:44:04.302567 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b","Type":"ContainerStarted","Data":"7f118687ae236119dd5fe4e907a7925579e393ce48a6073f08644a7c9e22cea7"} Mar 20 15:44:04 crc kubenswrapper[4779]: I0320 15:44:04.303829 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b","Type":"ContainerStarted","Data":"731ada6d795e0f3eb873e645d33c8dbea73bab8f5ce716e8eb3f394b90d8eae1"} Mar 20 15:44:04 crc kubenswrapper[4779]: I0320 15:44:04.318481 4779 generic.go:334] "Generic (PLEG): container finished" podID="c2558371-6693-4c25-bb36-d1688f299d44" containerID="bcef4752595fdef6bceac9367420fcfcfa7be7ceb5bae95350caf122f82a4121" exitCode=0 Mar 20 15:44:04 crc kubenswrapper[4779]: I0320 15:44:04.318658 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29567024-tqhpt" event={"ID":"c2558371-6693-4c25-bb36-d1688f299d44","Type":"ContainerDied","Data":"bcef4752595fdef6bceac9367420fcfcfa7be7ceb5bae95350caf122f82a4121"} Mar 20 15:44:04 crc kubenswrapper[4779]: I0320 15:44:04.336760 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f74af0a5-e3c3-4569-bece-db2e25e9b79d","Type":"ContainerStarted","Data":"b374aae4b5a56f491bf3eec7588561fcc2601539dc63aefc7fa278f84ba92465"} Mar 20 15:44:04 crc kubenswrapper[4779]: I0320 15:44:04.337587 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 20 15:44:04 crc kubenswrapper[4779]: I0320 15:44:04.353485 4779 generic.go:334] "Generic (PLEG): container finished" podID="32fe88a3-1d3d-4863-9abd-160e0537aff3" containerID="e10fdb1ae97f49627d87219cc2d4d55509955185030eebc461451bfba3b1de77" exitCode=0 Mar 20 15:44:04 crc kubenswrapper[4779]: I0320 15:44:04.353559 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2mg9r" event={"ID":"32fe88a3-1d3d-4863-9abd-160e0537aff3","Type":"ContainerDied","Data":"e10fdb1ae97f49627d87219cc2d4d55509955185030eebc461451bfba3b1de77"} Mar 20 15:44:04 crc kubenswrapper[4779]: I0320 15:44:04.361383 4779 generic.go:334] "Generic (PLEG): container finished" podID="da10206b-ede8-4373-af66-b442633908ab" containerID="7d8cf03e162fb94ddfac7f060b641bb685784c40f3d4d99d58f7aba8e0cbd6e6" exitCode=0 Mar 20 15:44:04 crc kubenswrapper[4779]: I0320 15:44:04.361433 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-df14-account-create-update-nfhsc" event={"ID":"da10206b-ede8-4373-af66-b442633908ab","Type":"ContainerDied","Data":"7d8cf03e162fb94ddfac7f060b641bb685784c40f3d4d99d58f7aba8e0cbd6e6"} Mar 20 15:44:04 crc kubenswrapper[4779]: I0320 15:44:04.363297 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" 
podStartSLOduration=68.38808435 podStartE2EDuration="1m28.363278395s" podCreationTimestamp="2026-03-20 15:42:36 +0000 UTC" firstStartedPulling="2026-03-20 15:43:06.508880865 +0000 UTC m=+1203.471396665" lastFinishedPulling="2026-03-20 15:43:26.48407491 +0000 UTC m=+1223.446590710" observedRunningTime="2026-03-20 15:44:04.355732037 +0000 UTC m=+1261.318247847" watchObservedRunningTime="2026-03-20 15:44:04.363278395 +0000 UTC m=+1261.325794215" Mar 20 15:44:04 crc kubenswrapper[4779]: I0320 15:44:04.400049 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-vlnhf"] Mar 20 15:44:04 crc kubenswrapper[4779]: I0320 15:44:04.409571 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-vlnhf"] Mar 20 15:44:05 crc kubenswrapper[4779]: I0320 15:44:05.373979 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b","Type":"ContainerStarted","Data":"85e66d5afd6444f577821cb16417f647e7f710f1b4fbb1b1008f3b8b74361f68"} Mar 20 15:44:05 crc kubenswrapper[4779]: I0320 15:44:05.374025 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b","Type":"ContainerStarted","Data":"e8370df1385b25e608138b2351262a0284e0a52c20ef46299138cdb5aca8c03e"} Mar 20 15:44:05 crc kubenswrapper[4779]: I0320 15:44:05.822754 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="807b267f-236d-4816-ac99-b0c6137d3bda" path="/var/lib/kubelet/pods/807b267f-236d-4816-ac99-b0c6137d3bda/volumes" Mar 20 15:44:06 crc kubenswrapper[4779]: I0320 15:44:06.547134 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-qnd9s"] Mar 20 15:44:06 crc kubenswrapper[4779]: I0320 15:44:06.548208 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-qnd9s" Mar 20 15:44:06 crc kubenswrapper[4779]: I0320 15:44:06.549946 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-pb8jv" Mar 20 15:44:06 crc kubenswrapper[4779]: I0320 15:44:06.550611 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 20 15:44:06 crc kubenswrapper[4779]: I0320 15:44:06.557356 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-qnd9s"] Mar 20 15:44:06 crc kubenswrapper[4779]: I0320 15:44:06.631162 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1435ea5c-3bed-456c-8fe3-931542c325a1-db-sync-config-data\") pod \"glance-db-sync-qnd9s\" (UID: \"1435ea5c-3bed-456c-8fe3-931542c325a1\") " pod="openstack/glance-db-sync-qnd9s" Mar 20 15:44:06 crc kubenswrapper[4779]: I0320 15:44:06.631537 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvzwq\" (UniqueName: \"kubernetes.io/projected/1435ea5c-3bed-456c-8fe3-931542c325a1-kube-api-access-nvzwq\") pod \"glance-db-sync-qnd9s\" (UID: \"1435ea5c-3bed-456c-8fe3-931542c325a1\") " pod="openstack/glance-db-sync-qnd9s" Mar 20 15:44:06 crc kubenswrapper[4779]: I0320 15:44:06.631579 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1435ea5c-3bed-456c-8fe3-931542c325a1-combined-ca-bundle\") pod \"glance-db-sync-qnd9s\" (UID: \"1435ea5c-3bed-456c-8fe3-931542c325a1\") " pod="openstack/glance-db-sync-qnd9s" Mar 20 15:44:06 crc kubenswrapper[4779]: I0320 15:44:06.631606 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1435ea5c-3bed-456c-8fe3-931542c325a1-config-data\") pod \"glance-db-sync-qnd9s\" (UID: \"1435ea5c-3bed-456c-8fe3-931542c325a1\") " pod="openstack/glance-db-sync-qnd9s" Mar 20 15:44:06 crc kubenswrapper[4779]: I0320 15:44:06.733324 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvzwq\" (UniqueName: \"kubernetes.io/projected/1435ea5c-3bed-456c-8fe3-931542c325a1-kube-api-access-nvzwq\") pod \"glance-db-sync-qnd9s\" (UID: \"1435ea5c-3bed-456c-8fe3-931542c325a1\") " pod="openstack/glance-db-sync-qnd9s" Mar 20 15:44:06 crc kubenswrapper[4779]: I0320 15:44:06.733374 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1435ea5c-3bed-456c-8fe3-931542c325a1-combined-ca-bundle\") pod \"glance-db-sync-qnd9s\" (UID: \"1435ea5c-3bed-456c-8fe3-931542c325a1\") " pod="openstack/glance-db-sync-qnd9s" Mar 20 15:44:06 crc kubenswrapper[4779]: I0320 15:44:06.733407 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1435ea5c-3bed-456c-8fe3-931542c325a1-config-data\") pod \"glance-db-sync-qnd9s\" (UID: \"1435ea5c-3bed-456c-8fe3-931542c325a1\") " pod="openstack/glance-db-sync-qnd9s" Mar 20 15:44:06 crc kubenswrapper[4779]: I0320 15:44:06.733489 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1435ea5c-3bed-456c-8fe3-931542c325a1-db-sync-config-data\") pod \"glance-db-sync-qnd9s\" (UID: \"1435ea5c-3bed-456c-8fe3-931542c325a1\") " pod="openstack/glance-db-sync-qnd9s" Mar 20 15:44:06 crc kubenswrapper[4779]: I0320 15:44:06.738936 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1435ea5c-3bed-456c-8fe3-931542c325a1-db-sync-config-data\") pod \"glance-db-sync-qnd9s\" (UID: 
\"1435ea5c-3bed-456c-8fe3-931542c325a1\") " pod="openstack/glance-db-sync-qnd9s" Mar 20 15:44:06 crc kubenswrapper[4779]: I0320 15:44:06.741829 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1435ea5c-3bed-456c-8fe3-931542c325a1-config-data\") pod \"glance-db-sync-qnd9s\" (UID: \"1435ea5c-3bed-456c-8fe3-931542c325a1\") " pod="openstack/glance-db-sync-qnd9s" Mar 20 15:44:06 crc kubenswrapper[4779]: I0320 15:44:06.742500 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1435ea5c-3bed-456c-8fe3-931542c325a1-combined-ca-bundle\") pod \"glance-db-sync-qnd9s\" (UID: \"1435ea5c-3bed-456c-8fe3-931542c325a1\") " pod="openstack/glance-db-sync-qnd9s" Mar 20 15:44:06 crc kubenswrapper[4779]: I0320 15:44:06.755994 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvzwq\" (UniqueName: \"kubernetes.io/projected/1435ea5c-3bed-456c-8fe3-931542c325a1-kube-api-access-nvzwq\") pod \"glance-db-sync-qnd9s\" (UID: \"1435ea5c-3bed-456c-8fe3-931542c325a1\") " pod="openstack/glance-db-sync-qnd9s" Mar 20 15:44:06 crc kubenswrapper[4779]: I0320 15:44:06.867960 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-qnd9s" Mar 20 15:44:06 crc kubenswrapper[4779]: I0320 15:44:06.900028 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567024-tqhpt" Mar 20 15:44:06 crc kubenswrapper[4779]: I0320 15:44:06.905653 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2mg9r" Mar 20 15:44:06 crc kubenswrapper[4779]: I0320 15:44:06.910123 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-df14-account-create-update-nfhsc" Mar 20 15:44:06 crc kubenswrapper[4779]: I0320 15:44:06.937605 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tb28g\" (UniqueName: \"kubernetes.io/projected/c2558371-6693-4c25-bb36-d1688f299d44-kube-api-access-tb28g\") pod \"c2558371-6693-4c25-bb36-d1688f299d44\" (UID: \"c2558371-6693-4c25-bb36-d1688f299d44\") " Mar 20 15:44:06 crc kubenswrapper[4779]: I0320 15:44:06.937692 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4qnl\" (UniqueName: \"kubernetes.io/projected/32fe88a3-1d3d-4863-9abd-160e0537aff3-kube-api-access-m4qnl\") pod \"32fe88a3-1d3d-4863-9abd-160e0537aff3\" (UID: \"32fe88a3-1d3d-4863-9abd-160e0537aff3\") " Mar 20 15:44:06 crc kubenswrapper[4779]: I0320 15:44:06.937867 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32fe88a3-1d3d-4863-9abd-160e0537aff3-operator-scripts\") pod \"32fe88a3-1d3d-4863-9abd-160e0537aff3\" (UID: \"32fe88a3-1d3d-4863-9abd-160e0537aff3\") " Mar 20 15:44:06 crc kubenswrapper[4779]: I0320 15:44:06.938753 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32fe88a3-1d3d-4863-9abd-160e0537aff3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "32fe88a3-1d3d-4863-9abd-160e0537aff3" (UID: "32fe88a3-1d3d-4863-9abd-160e0537aff3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:44:06 crc kubenswrapper[4779]: I0320 15:44:06.950307 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2558371-6693-4c25-bb36-d1688f299d44-kube-api-access-tb28g" (OuterVolumeSpecName: "kube-api-access-tb28g") pod "c2558371-6693-4c25-bb36-d1688f299d44" (UID: "c2558371-6693-4c25-bb36-d1688f299d44"). 
InnerVolumeSpecName "kube-api-access-tb28g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:44:06 crc kubenswrapper[4779]: I0320 15:44:06.963744 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32fe88a3-1d3d-4863-9abd-160e0537aff3-kube-api-access-m4qnl" (OuterVolumeSpecName: "kube-api-access-m4qnl") pod "32fe88a3-1d3d-4863-9abd-160e0537aff3" (UID: "32fe88a3-1d3d-4863-9abd-160e0537aff3"). InnerVolumeSpecName "kube-api-access-m4qnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:44:07 crc kubenswrapper[4779]: I0320 15:44:07.039294 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwlwc\" (UniqueName: \"kubernetes.io/projected/da10206b-ede8-4373-af66-b442633908ab-kube-api-access-cwlwc\") pod \"da10206b-ede8-4373-af66-b442633908ab\" (UID: \"da10206b-ede8-4373-af66-b442633908ab\") " Mar 20 15:44:07 crc kubenswrapper[4779]: I0320 15:44:07.039504 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da10206b-ede8-4373-af66-b442633908ab-operator-scripts\") pod \"da10206b-ede8-4373-af66-b442633908ab\" (UID: \"da10206b-ede8-4373-af66-b442633908ab\") " Mar 20 15:44:07 crc kubenswrapper[4779]: I0320 15:44:07.039816 4779 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32fe88a3-1d3d-4863-9abd-160e0537aff3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:07 crc kubenswrapper[4779]: I0320 15:44:07.039829 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tb28g\" (UniqueName: \"kubernetes.io/projected/c2558371-6693-4c25-bb36-d1688f299d44-kube-api-access-tb28g\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:07 crc kubenswrapper[4779]: I0320 15:44:07.039839 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4qnl\" 
(UniqueName: \"kubernetes.io/projected/32fe88a3-1d3d-4863-9abd-160e0537aff3-kube-api-access-m4qnl\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:07 crc kubenswrapper[4779]: I0320 15:44:07.040485 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da10206b-ede8-4373-af66-b442633908ab-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "da10206b-ede8-4373-af66-b442633908ab" (UID: "da10206b-ede8-4373-af66-b442633908ab"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:44:07 crc kubenswrapper[4779]: I0320 15:44:07.047665 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da10206b-ede8-4373-af66-b442633908ab-kube-api-access-cwlwc" (OuterVolumeSpecName: "kube-api-access-cwlwc") pod "da10206b-ede8-4373-af66-b442633908ab" (UID: "da10206b-ede8-4373-af66-b442633908ab"). InnerVolumeSpecName "kube-api-access-cwlwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:44:07 crc kubenswrapper[4779]: I0320 15:44:07.141361 4779 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da10206b-ede8-4373-af66-b442633908ab-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:07 crc kubenswrapper[4779]: I0320 15:44:07.141695 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwlwc\" (UniqueName: \"kubernetes.io/projected/da10206b-ede8-4373-af66-b442633908ab-kube-api-access-cwlwc\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:07 crc kubenswrapper[4779]: I0320 15:44:07.390130 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2mg9r" event={"ID":"32fe88a3-1d3d-4863-9abd-160e0537aff3","Type":"ContainerDied","Data":"589d89bc39a65e3ab76b7baa4b5bf6edd21ac99ea1c431e325f62b0465a83253"} Mar 20 15:44:07 crc kubenswrapper[4779]: I0320 15:44:07.390534 4779 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="589d89bc39a65e3ab76b7baa4b5bf6edd21ac99ea1c431e325f62b0465a83253" Mar 20 15:44:07 crc kubenswrapper[4779]: I0320 15:44:07.390200 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2mg9r" Mar 20 15:44:07 crc kubenswrapper[4779]: I0320 15:44:07.394043 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-df14-account-create-update-nfhsc" Mar 20 15:44:07 crc kubenswrapper[4779]: I0320 15:44:07.394079 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-df14-account-create-update-nfhsc" event={"ID":"da10206b-ede8-4373-af66-b442633908ab","Type":"ContainerDied","Data":"9b918adcff749f22efa489da66199fb19ad18a538928f4921423d8f72f9e5770"} Mar 20 15:44:07 crc kubenswrapper[4779]: I0320 15:44:07.394129 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b918adcff749f22efa489da66199fb19ad18a538928f4921423d8f72f9e5770" Mar 20 15:44:07 crc kubenswrapper[4779]: I0320 15:44:07.395988 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567024-tqhpt" event={"ID":"c2558371-6693-4c25-bb36-d1688f299d44","Type":"ContainerDied","Data":"a996f0fdf6d35b1e1919b2b3b7005eb12ace9501af190c83ebfe74d1194b9a5b"} Mar 20 15:44:07 crc kubenswrapper[4779]: I0320 15:44:07.396035 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a996f0fdf6d35b1e1919b2b3b7005eb12ace9501af190c83ebfe74d1194b9a5b" Mar 20 15:44:07 crc kubenswrapper[4779]: I0320 15:44:07.396096 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567024-tqhpt" Mar 20 15:44:07 crc kubenswrapper[4779]: I0320 15:44:07.399276 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e93aeb14-abaa-4cff-81c0-82d579020dc6","Type":"ContainerStarted","Data":"93abb91ff7ca83cfeb154eb992beae2f93a4290f56ecfa0f32320c1a6307a4c7"} Mar 20 15:44:07 crc kubenswrapper[4779]: I0320 15:44:07.447195 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=24.081838965 podStartE2EDuration="1m24.447174906s" podCreationTimestamp="2026-03-20 15:42:43 +0000 UTC" firstStartedPulling="2026-03-20 15:43:06.569924598 +0000 UTC m=+1203.532440398" lastFinishedPulling="2026-03-20 15:44:06.935260539 +0000 UTC m=+1263.897776339" observedRunningTime="2026-03-20 15:44:07.42880891 +0000 UTC m=+1264.391324730" watchObservedRunningTime="2026-03-20 15:44:07.447174906 +0000 UTC m=+1264.409690706" Mar 20 15:44:07 crc kubenswrapper[4779]: I0320 15:44:07.571572 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-qnd9s"] Mar 20 15:44:07 crc kubenswrapper[4779]: I0320 15:44:07.968576 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567018-k964v"] Mar 20 15:44:07 crc kubenswrapper[4779]: I0320 15:44:07.977321 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567018-k964v"] Mar 20 15:44:08 crc kubenswrapper[4779]: I0320 15:44:08.412199 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b","Type":"ContainerStarted","Data":"a28f2c3d23b995f29ea6f91246d562c3e26191a8864ec11c76ddc00e21f98daa"} Mar 20 15:44:08 crc kubenswrapper[4779]: I0320 15:44:08.412244 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b","Type":"ContainerStarted","Data":"50ee2311e1f956450d7c2a6084c29e251a87f0a02380feb5fad9e3ee67ffb64a"} Mar 20 15:44:08 crc kubenswrapper[4779]: I0320 15:44:08.412255 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b","Type":"ContainerStarted","Data":"4a8bc98e147814b9d1d3bb858fd149f131d05a94dd688202a6fddb270b3e026f"} Mar 20 15:44:08 crc kubenswrapper[4779]: I0320 15:44:08.413585 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qnd9s" event={"ID":"1435ea5c-3bed-456c-8fe3-931542c325a1","Type":"ContainerStarted","Data":"9b51e294b1d3a1decdf5a639325dd2d9b370c716ec4b7857220a6e473b25960c"} Mar 20 15:44:09 crc kubenswrapper[4779]: I0320 15:44:09.388584 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-5v4xs"] Mar 20 15:44:09 crc kubenswrapper[4779]: E0320 15:44:09.389196 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2558371-6693-4c25-bb36-d1688f299d44" containerName="oc" Mar 20 15:44:09 crc kubenswrapper[4779]: I0320 15:44:09.389229 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2558371-6693-4c25-bb36-d1688f299d44" containerName="oc" Mar 20 15:44:09 crc kubenswrapper[4779]: E0320 15:44:09.389245 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da10206b-ede8-4373-af66-b442633908ab" containerName="mariadb-account-create-update" Mar 20 15:44:09 crc kubenswrapper[4779]: I0320 15:44:09.389251 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="da10206b-ede8-4373-af66-b442633908ab" containerName="mariadb-account-create-update" Mar 20 15:44:09 crc kubenswrapper[4779]: E0320 15:44:09.389259 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32fe88a3-1d3d-4863-9abd-160e0537aff3" containerName="mariadb-database-create" Mar 20 15:44:09 crc kubenswrapper[4779]: I0320 15:44:09.389265 
4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="32fe88a3-1d3d-4863-9abd-160e0537aff3" containerName="mariadb-database-create" Mar 20 15:44:09 crc kubenswrapper[4779]: I0320 15:44:09.390356 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2558371-6693-4c25-bb36-d1688f299d44" containerName="oc" Mar 20 15:44:09 crc kubenswrapper[4779]: I0320 15:44:09.390381 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="32fe88a3-1d3d-4863-9abd-160e0537aff3" containerName="mariadb-database-create" Mar 20 15:44:09 crc kubenswrapper[4779]: I0320 15:44:09.390396 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="da10206b-ede8-4373-af66-b442633908ab" containerName="mariadb-account-create-update" Mar 20 15:44:09 crc kubenswrapper[4779]: I0320 15:44:09.391076 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-5v4xs" Mar 20 15:44:09 crc kubenswrapper[4779]: I0320 15:44:09.403913 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 20 15:44:09 crc kubenswrapper[4779]: I0320 15:44:09.405054 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-5v4xs"] Mar 20 15:44:09 crc kubenswrapper[4779]: I0320 15:44:09.451329 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b","Type":"ContainerStarted","Data":"0ad6742238ab8942c78e662f0a5a43b0f3720f56855a50da774ebccebeee4f94"} Mar 20 15:44:09 crc kubenswrapper[4779]: I0320 15:44:09.581841 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmcnz\" (UniqueName: \"kubernetes.io/projected/f4435049-2efc-449f-a5af-25747d3b64f0-kube-api-access-zmcnz\") pod \"root-account-create-update-5v4xs\" (UID: \"f4435049-2efc-449f-a5af-25747d3b64f0\") " 
pod="openstack/root-account-create-update-5v4xs" Mar 20 15:44:09 crc kubenswrapper[4779]: I0320 15:44:09.581997 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4435049-2efc-449f-a5af-25747d3b64f0-operator-scripts\") pod \"root-account-create-update-5v4xs\" (UID: \"f4435049-2efc-449f-a5af-25747d3b64f0\") " pod="openstack/root-account-create-update-5v4xs" Mar 20 15:44:09 crc kubenswrapper[4779]: I0320 15:44:09.683443 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4435049-2efc-449f-a5af-25747d3b64f0-operator-scripts\") pod \"root-account-create-update-5v4xs\" (UID: \"f4435049-2efc-449f-a5af-25747d3b64f0\") " pod="openstack/root-account-create-update-5v4xs" Mar 20 15:44:09 crc kubenswrapper[4779]: I0320 15:44:09.683626 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmcnz\" (UniqueName: \"kubernetes.io/projected/f4435049-2efc-449f-a5af-25747d3b64f0-kube-api-access-zmcnz\") pod \"root-account-create-update-5v4xs\" (UID: \"f4435049-2efc-449f-a5af-25747d3b64f0\") " pod="openstack/root-account-create-update-5v4xs" Mar 20 15:44:09 crc kubenswrapper[4779]: I0320 15:44:09.684274 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4435049-2efc-449f-a5af-25747d3b64f0-operator-scripts\") pod \"root-account-create-update-5v4xs\" (UID: \"f4435049-2efc-449f-a5af-25747d3b64f0\") " pod="openstack/root-account-create-update-5v4xs" Mar 20 15:44:09 crc kubenswrapper[4779]: I0320 15:44:09.707331 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmcnz\" (UniqueName: \"kubernetes.io/projected/f4435049-2efc-449f-a5af-25747d3b64f0-kube-api-access-zmcnz\") pod \"root-account-create-update-5v4xs\" (UID: 
\"f4435049-2efc-449f-a5af-25747d3b64f0\") " pod="openstack/root-account-create-update-5v4xs" Mar 20 15:44:09 crc kubenswrapper[4779]: I0320 15:44:09.716102 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-5v4xs" Mar 20 15:44:09 crc kubenswrapper[4779]: I0320 15:44:09.717937 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:09 crc kubenswrapper[4779]: I0320 15:44:09.840407 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a49b764a-39a5-4961-aa89-4f99c8b155a4" path="/var/lib/kubelet/pods/a49b764a-39a5-4961-aa89-4f99c8b155a4/volumes" Mar 20 15:44:10 crc kubenswrapper[4779]: I0320 15:44:10.338242 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-5v4xs"] Mar 20 15:44:10 crc kubenswrapper[4779]: W0320 15:44:10.348086 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4435049_2efc_449f_a5af_25747d3b64f0.slice/crio-c4f6355b7499c5a758ce1d2c433eaee449bce1939053fc9d11da8f7f3b713fbe WatchSource:0}: Error finding container c4f6355b7499c5a758ce1d2c433eaee449bce1939053fc9d11da8f7f3b713fbe: Status 404 returned error can't find the container with id c4f6355b7499c5a758ce1d2c433eaee449bce1939053fc9d11da8f7f3b713fbe Mar 20 15:44:10 crc kubenswrapper[4779]: I0320 15:44:10.463295 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b","Type":"ContainerStarted","Data":"53200de7130872da89a0ad1e9dabd12a7f61409fdedb69a715037ca5b39ac763"} Mar 20 15:44:10 crc kubenswrapper[4779]: I0320 15:44:10.464607 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5v4xs" 
event={"ID":"f4435049-2efc-449f-a5af-25747d3b64f0","Type":"ContainerStarted","Data":"c4f6355b7499c5a758ce1d2c433eaee449bce1939053fc9d11da8f7f3b713fbe"} Mar 20 15:44:11 crc kubenswrapper[4779]: I0320 15:44:11.187795 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-s9wt4" podUID="73a3debf-8de1-4321-b383-5bebca909a38" containerName="ovn-controller" probeResult="failure" output=< Mar 20 15:44:11 crc kubenswrapper[4779]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 20 15:44:11 crc kubenswrapper[4779]: > Mar 20 15:44:11 crc kubenswrapper[4779]: I0320 15:44:11.282427 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-bvnhn" Mar 20 15:44:11 crc kubenswrapper[4779]: I0320 15:44:11.287751 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-bvnhn" Mar 20 15:44:11 crc kubenswrapper[4779]: I0320 15:44:11.309673 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 20 15:44:11 crc kubenswrapper[4779]: I0320 15:44:11.482349 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b","Type":"ContainerStarted","Data":"ffcafe37b7638791f25ad6e995142fbb776675c2310da57248ca2389e636c4fd"} Mar 20 15:44:11 crc kubenswrapper[4779]: I0320 15:44:11.482720 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b","Type":"ContainerStarted","Data":"5e098ad6bebd520aefc4e5ac4929d30c95ce6694b8291a4a5f61dfb99196e00b"} Mar 20 15:44:11 crc kubenswrapper[4779]: I0320 15:44:11.482730 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b","Type":"ContainerStarted","Data":"9a6f779c18db5c8255280970871f60f699d74755251d783fc32f3cc303503a89"} Mar 20 15:44:11 crc kubenswrapper[4779]: I0320 15:44:11.482738 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b","Type":"ContainerStarted","Data":"d024d233bc41c989042a2cc2c4989c0506a1f6a4ba9cb102a26c5dee3a56e8c3"} Mar 20 15:44:11 crc kubenswrapper[4779]: I0320 15:44:11.492769 4779 generic.go:334] "Generic (PLEG): container finished" podID="3016debc-9603-417f-8ff1-6fd3934cd17e" containerID="bc07a7145cf026d7ad7cfc471274cba8f63e2f29f37bd01537575ed3f1d605c5" exitCode=0 Mar 20 15:44:11 crc kubenswrapper[4779]: I0320 15:44:11.492879 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3016debc-9603-417f-8ff1-6fd3934cd17e","Type":"ContainerDied","Data":"bc07a7145cf026d7ad7cfc471274cba8f63e2f29f37bd01537575ed3f1d605c5"} Mar 20 15:44:11 crc kubenswrapper[4779]: I0320 15:44:11.500321 4779 generic.go:334] "Generic (PLEG): container finished" podID="f4435049-2efc-449f-a5af-25747d3b64f0" containerID="a5d4406fc1a0b7baef16968b2c9c31f567afa6d0ab1979413fd1a429a2a4f9f5" exitCode=0 Mar 20 15:44:11 crc kubenswrapper[4779]: I0320 15:44:11.501100 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5v4xs" event={"ID":"f4435049-2efc-449f-a5af-25747d3b64f0","Type":"ContainerDied","Data":"a5d4406fc1a0b7baef16968b2c9c31f567afa6d0ab1979413fd1a429a2a4f9f5"} Mar 20 15:44:11 crc kubenswrapper[4779]: I0320 15:44:11.527138 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-s9wt4-config-rm9vn"] Mar 20 15:44:11 crc kubenswrapper[4779]: I0320 15:44:11.528515 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-s9wt4-config-rm9vn" Mar 20 15:44:11 crc kubenswrapper[4779]: I0320 15:44:11.533548 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 20 15:44:11 crc kubenswrapper[4779]: I0320 15:44:11.538708 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s9wt4-config-rm9vn"] Mar 20 15:44:11 crc kubenswrapper[4779]: I0320 15:44:11.729711 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/302d42e3-cf16-441d-bf35-567f0bb958b9-scripts\") pod \"ovn-controller-s9wt4-config-rm9vn\" (UID: \"302d42e3-cf16-441d-bf35-567f0bb958b9\") " pod="openstack/ovn-controller-s9wt4-config-rm9vn" Mar 20 15:44:11 crc kubenswrapper[4779]: I0320 15:44:11.729846 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/302d42e3-cf16-441d-bf35-567f0bb958b9-var-run\") pod \"ovn-controller-s9wt4-config-rm9vn\" (UID: \"302d42e3-cf16-441d-bf35-567f0bb958b9\") " pod="openstack/ovn-controller-s9wt4-config-rm9vn" Mar 20 15:44:11 crc kubenswrapper[4779]: I0320 15:44:11.729884 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrx9r\" (UniqueName: \"kubernetes.io/projected/302d42e3-cf16-441d-bf35-567f0bb958b9-kube-api-access-hrx9r\") pod \"ovn-controller-s9wt4-config-rm9vn\" (UID: \"302d42e3-cf16-441d-bf35-567f0bb958b9\") " pod="openstack/ovn-controller-s9wt4-config-rm9vn" Mar 20 15:44:11 crc kubenswrapper[4779]: I0320 15:44:11.729907 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/302d42e3-cf16-441d-bf35-567f0bb958b9-var-log-ovn\") pod \"ovn-controller-s9wt4-config-rm9vn\" (UID: 
\"302d42e3-cf16-441d-bf35-567f0bb958b9\") " pod="openstack/ovn-controller-s9wt4-config-rm9vn" Mar 20 15:44:11 crc kubenswrapper[4779]: I0320 15:44:11.729937 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/302d42e3-cf16-441d-bf35-567f0bb958b9-var-run-ovn\") pod \"ovn-controller-s9wt4-config-rm9vn\" (UID: \"302d42e3-cf16-441d-bf35-567f0bb958b9\") " pod="openstack/ovn-controller-s9wt4-config-rm9vn" Mar 20 15:44:11 crc kubenswrapper[4779]: I0320 15:44:11.729975 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/302d42e3-cf16-441d-bf35-567f0bb958b9-additional-scripts\") pod \"ovn-controller-s9wt4-config-rm9vn\" (UID: \"302d42e3-cf16-441d-bf35-567f0bb958b9\") " pod="openstack/ovn-controller-s9wt4-config-rm9vn" Mar 20 15:44:11 crc kubenswrapper[4779]: I0320 15:44:11.831402 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/302d42e3-cf16-441d-bf35-567f0bb958b9-scripts\") pod \"ovn-controller-s9wt4-config-rm9vn\" (UID: \"302d42e3-cf16-441d-bf35-567f0bb958b9\") " pod="openstack/ovn-controller-s9wt4-config-rm9vn" Mar 20 15:44:11 crc kubenswrapper[4779]: I0320 15:44:11.831456 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/302d42e3-cf16-441d-bf35-567f0bb958b9-var-run\") pod \"ovn-controller-s9wt4-config-rm9vn\" (UID: \"302d42e3-cf16-441d-bf35-567f0bb958b9\") " pod="openstack/ovn-controller-s9wt4-config-rm9vn" Mar 20 15:44:11 crc kubenswrapper[4779]: I0320 15:44:11.831498 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrx9r\" (UniqueName: \"kubernetes.io/projected/302d42e3-cf16-441d-bf35-567f0bb958b9-kube-api-access-hrx9r\") pod 
\"ovn-controller-s9wt4-config-rm9vn\" (UID: \"302d42e3-cf16-441d-bf35-567f0bb958b9\") " pod="openstack/ovn-controller-s9wt4-config-rm9vn" Mar 20 15:44:11 crc kubenswrapper[4779]: I0320 15:44:11.831525 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/302d42e3-cf16-441d-bf35-567f0bb958b9-var-log-ovn\") pod \"ovn-controller-s9wt4-config-rm9vn\" (UID: \"302d42e3-cf16-441d-bf35-567f0bb958b9\") " pod="openstack/ovn-controller-s9wt4-config-rm9vn" Mar 20 15:44:11 crc kubenswrapper[4779]: I0320 15:44:11.831587 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/302d42e3-cf16-441d-bf35-567f0bb958b9-var-run-ovn\") pod \"ovn-controller-s9wt4-config-rm9vn\" (UID: \"302d42e3-cf16-441d-bf35-567f0bb958b9\") " pod="openstack/ovn-controller-s9wt4-config-rm9vn" Mar 20 15:44:11 crc kubenswrapper[4779]: I0320 15:44:11.831855 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/302d42e3-cf16-441d-bf35-567f0bb958b9-var-log-ovn\") pod \"ovn-controller-s9wt4-config-rm9vn\" (UID: \"302d42e3-cf16-441d-bf35-567f0bb958b9\") " pod="openstack/ovn-controller-s9wt4-config-rm9vn" Mar 20 15:44:11 crc kubenswrapper[4779]: I0320 15:44:11.831887 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/302d42e3-cf16-441d-bf35-567f0bb958b9-var-run\") pod \"ovn-controller-s9wt4-config-rm9vn\" (UID: \"302d42e3-cf16-441d-bf35-567f0bb958b9\") " pod="openstack/ovn-controller-s9wt4-config-rm9vn" Mar 20 15:44:11 crc kubenswrapper[4779]: I0320 15:44:11.831953 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/302d42e3-cf16-441d-bf35-567f0bb958b9-var-run-ovn\") pod \"ovn-controller-s9wt4-config-rm9vn\" (UID: 
\"302d42e3-cf16-441d-bf35-567f0bb958b9\") " pod="openstack/ovn-controller-s9wt4-config-rm9vn" Mar 20 15:44:11 crc kubenswrapper[4779]: I0320 15:44:11.831997 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/302d42e3-cf16-441d-bf35-567f0bb958b9-additional-scripts\") pod \"ovn-controller-s9wt4-config-rm9vn\" (UID: \"302d42e3-cf16-441d-bf35-567f0bb958b9\") " pod="openstack/ovn-controller-s9wt4-config-rm9vn" Mar 20 15:44:11 crc kubenswrapper[4779]: I0320 15:44:11.945144 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/302d42e3-cf16-441d-bf35-567f0bb958b9-additional-scripts\") pod \"ovn-controller-s9wt4-config-rm9vn\" (UID: \"302d42e3-cf16-441d-bf35-567f0bb958b9\") " pod="openstack/ovn-controller-s9wt4-config-rm9vn" Mar 20 15:44:11 crc kubenswrapper[4779]: I0320 15:44:11.946594 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/302d42e3-cf16-441d-bf35-567f0bb958b9-scripts\") pod \"ovn-controller-s9wt4-config-rm9vn\" (UID: \"302d42e3-cf16-441d-bf35-567f0bb958b9\") " pod="openstack/ovn-controller-s9wt4-config-rm9vn" Mar 20 15:44:11 crc kubenswrapper[4779]: I0320 15:44:11.951311 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrx9r\" (UniqueName: \"kubernetes.io/projected/302d42e3-cf16-441d-bf35-567f0bb958b9-kube-api-access-hrx9r\") pod \"ovn-controller-s9wt4-config-rm9vn\" (UID: \"302d42e3-cf16-441d-bf35-567f0bb958b9\") " pod="openstack/ovn-controller-s9wt4-config-rm9vn" Mar 20 15:44:11 crc kubenswrapper[4779]: I0320 15:44:11.963867 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-s9wt4-config-rm9vn" Mar 20 15:44:12 crc kubenswrapper[4779]: I0320 15:44:12.425201 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s9wt4-config-rm9vn"] Mar 20 15:44:12 crc kubenswrapper[4779]: W0320 15:44:12.429783 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod302d42e3_cf16_441d_bf35_567f0bb958b9.slice/crio-bd9f8c41ca474e5f71b0f83c3b49758b5d63efdf6421e1c3b429153eb6733053 WatchSource:0}: Error finding container bd9f8c41ca474e5f71b0f83c3b49758b5d63efdf6421e1c3b429153eb6733053: Status 404 returned error can't find the container with id bd9f8c41ca474e5f71b0f83c3b49758b5d63efdf6421e1c3b429153eb6733053 Mar 20 15:44:12 crc kubenswrapper[4779]: I0320 15:44:12.517021 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s9wt4-config-rm9vn" event={"ID":"302d42e3-cf16-441d-bf35-567f0bb958b9","Type":"ContainerStarted","Data":"bd9f8c41ca474e5f71b0f83c3b49758b5d63efdf6421e1c3b429153eb6733053"} Mar 20 15:44:12 crc kubenswrapper[4779]: I0320 15:44:12.531462 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b","Type":"ContainerStarted","Data":"676181f0f1c5565db3a37534635dbe4ab3ccbfdba7c828316aed308485205bf7"} Mar 20 15:44:12 crc kubenswrapper[4779]: I0320 15:44:12.531516 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b","Type":"ContainerStarted","Data":"cd7c19c0e4da02ba78763dc1d9aee60723351576f4d92c26b8c3a496264882f6"} Mar 20 15:44:12 crc kubenswrapper[4779]: I0320 15:44:12.535754 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"3016debc-9603-417f-8ff1-6fd3934cd17e","Type":"ContainerStarted","Data":"8eb30a1fc4925729d4962c90b33f01b59c8d4f9bf62ea53ba76d5ee745ddc817"} Mar 20 15:44:12 crc kubenswrapper[4779]: I0320 15:44:12.536593 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:44:12 crc kubenswrapper[4779]: I0320 15:44:12.594462 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=21.615638784 podStartE2EDuration="29.594437946s" podCreationTimestamp="2026-03-20 15:43:43 +0000 UTC" firstStartedPulling="2026-03-20 15:44:01.968082543 +0000 UTC m=+1258.930598343" lastFinishedPulling="2026-03-20 15:44:09.946881705 +0000 UTC m=+1266.909397505" observedRunningTime="2026-03-20 15:44:12.580860819 +0000 UTC m=+1269.543376619" watchObservedRunningTime="2026-03-20 15:44:12.594437946 +0000 UTC m=+1269.556953746" Mar 20 15:44:12 crc kubenswrapper[4779]: I0320 15:44:12.619984 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=-9223371940.234818 podStartE2EDuration="1m36.61995835s" podCreationTimestamp="2026-03-20 15:42:36 +0000 UTC" firstStartedPulling="2026-03-20 15:43:06.885749768 +0000 UTC m=+1203.848265568" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:44:12.611829288 +0000 UTC m=+1269.574345088" watchObservedRunningTime="2026-03-20 15:44:12.61995835 +0000 UTC m=+1269.582474150" Mar 20 15:44:12 crc kubenswrapper[4779]: I0320 15:44:12.867992 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-5v4xs" Mar 20 15:44:12 crc kubenswrapper[4779]: I0320 15:44:12.877781 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-cxhjs"] Mar 20 15:44:12 crc kubenswrapper[4779]: E0320 15:44:12.878146 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4435049-2efc-449f-a5af-25747d3b64f0" containerName="mariadb-account-create-update" Mar 20 15:44:12 crc kubenswrapper[4779]: I0320 15:44:12.878162 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4435049-2efc-449f-a5af-25747d3b64f0" containerName="mariadb-account-create-update" Mar 20 15:44:12 crc kubenswrapper[4779]: I0320 15:44:12.878342 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4435049-2efc-449f-a5af-25747d3b64f0" containerName="mariadb-account-create-update" Mar 20 15:44:12 crc kubenswrapper[4779]: I0320 15:44:12.879274 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-cxhjs" Mar 20 15:44:12 crc kubenswrapper[4779]: I0320 15:44:12.883143 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 20 15:44:12 crc kubenswrapper[4779]: I0320 15:44:12.892495 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-cxhjs"] Mar 20 15:44:12 crc kubenswrapper[4779]: I0320 15:44:12.951090 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4435049-2efc-449f-a5af-25747d3b64f0-operator-scripts\") pod \"f4435049-2efc-449f-a5af-25747d3b64f0\" (UID: \"f4435049-2efc-449f-a5af-25747d3b64f0\") " Mar 20 15:44:12 crc kubenswrapper[4779]: I0320 15:44:12.951253 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmcnz\" (UniqueName: 
\"kubernetes.io/projected/f4435049-2efc-449f-a5af-25747d3b64f0-kube-api-access-zmcnz\") pod \"f4435049-2efc-449f-a5af-25747d3b64f0\" (UID: \"f4435049-2efc-449f-a5af-25747d3b64f0\") " Mar 20 15:44:12 crc kubenswrapper[4779]: I0320 15:44:12.952045 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4435049-2efc-449f-a5af-25747d3b64f0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f4435049-2efc-449f-a5af-25747d3b64f0" (UID: "f4435049-2efc-449f-a5af-25747d3b64f0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:44:12 crc kubenswrapper[4779]: I0320 15:44:12.957200 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4435049-2efc-449f-a5af-25747d3b64f0-kube-api-access-zmcnz" (OuterVolumeSpecName: "kube-api-access-zmcnz") pod "f4435049-2efc-449f-a5af-25747d3b64f0" (UID: "f4435049-2efc-449f-a5af-25747d3b64f0"). InnerVolumeSpecName "kube-api-access-zmcnz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:44:13 crc kubenswrapper[4779]: I0320 15:44:13.053302 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/efac30ce-be74-429c-a7dc-25c46cfbc88e-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-cxhjs\" (UID: \"efac30ce-be74-429c-a7dc-25c46cfbc88e\") " pod="openstack/dnsmasq-dns-77585f5f8c-cxhjs" Mar 20 15:44:13 crc kubenswrapper[4779]: I0320 15:44:13.053358 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vr7n\" (UniqueName: \"kubernetes.io/projected/efac30ce-be74-429c-a7dc-25c46cfbc88e-kube-api-access-4vr7n\") pod \"dnsmasq-dns-77585f5f8c-cxhjs\" (UID: \"efac30ce-be74-429c-a7dc-25c46cfbc88e\") " pod="openstack/dnsmasq-dns-77585f5f8c-cxhjs" Mar 20 15:44:13 crc kubenswrapper[4779]: I0320 15:44:13.053417 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/efac30ce-be74-429c-a7dc-25c46cfbc88e-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-cxhjs\" (UID: \"efac30ce-be74-429c-a7dc-25c46cfbc88e\") " pod="openstack/dnsmasq-dns-77585f5f8c-cxhjs" Mar 20 15:44:13 crc kubenswrapper[4779]: I0320 15:44:13.053516 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/efac30ce-be74-429c-a7dc-25c46cfbc88e-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-cxhjs\" (UID: \"efac30ce-be74-429c-a7dc-25c46cfbc88e\") " pod="openstack/dnsmasq-dns-77585f5f8c-cxhjs" Mar 20 15:44:13 crc kubenswrapper[4779]: I0320 15:44:13.053578 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/efac30ce-be74-429c-a7dc-25c46cfbc88e-dns-swift-storage-0\") pod 
\"dnsmasq-dns-77585f5f8c-cxhjs\" (UID: \"efac30ce-be74-429c-a7dc-25c46cfbc88e\") " pod="openstack/dnsmasq-dns-77585f5f8c-cxhjs" Mar 20 15:44:13 crc kubenswrapper[4779]: I0320 15:44:13.053633 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efac30ce-be74-429c-a7dc-25c46cfbc88e-config\") pod \"dnsmasq-dns-77585f5f8c-cxhjs\" (UID: \"efac30ce-be74-429c-a7dc-25c46cfbc88e\") " pod="openstack/dnsmasq-dns-77585f5f8c-cxhjs" Mar 20 15:44:13 crc kubenswrapper[4779]: I0320 15:44:13.053703 4779 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4435049-2efc-449f-a5af-25747d3b64f0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:13 crc kubenswrapper[4779]: I0320 15:44:13.053726 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmcnz\" (UniqueName: \"kubernetes.io/projected/f4435049-2efc-449f-a5af-25747d3b64f0-kube-api-access-zmcnz\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:13 crc kubenswrapper[4779]: I0320 15:44:13.154868 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/efac30ce-be74-429c-a7dc-25c46cfbc88e-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-cxhjs\" (UID: \"efac30ce-be74-429c-a7dc-25c46cfbc88e\") " pod="openstack/dnsmasq-dns-77585f5f8c-cxhjs" Mar 20 15:44:13 crc kubenswrapper[4779]: I0320 15:44:13.154959 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/efac30ce-be74-429c-a7dc-25c46cfbc88e-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-cxhjs\" (UID: \"efac30ce-be74-429c-a7dc-25c46cfbc88e\") " pod="openstack/dnsmasq-dns-77585f5f8c-cxhjs" Mar 20 15:44:13 crc kubenswrapper[4779]: I0320 15:44:13.154994 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/efac30ce-be74-429c-a7dc-25c46cfbc88e-config\") pod \"dnsmasq-dns-77585f5f8c-cxhjs\" (UID: \"efac30ce-be74-429c-a7dc-25c46cfbc88e\") " pod="openstack/dnsmasq-dns-77585f5f8c-cxhjs" Mar 20 15:44:13 crc kubenswrapper[4779]: I0320 15:44:13.155070 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/efac30ce-be74-429c-a7dc-25c46cfbc88e-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-cxhjs\" (UID: \"efac30ce-be74-429c-a7dc-25c46cfbc88e\") " pod="openstack/dnsmasq-dns-77585f5f8c-cxhjs" Mar 20 15:44:13 crc kubenswrapper[4779]: I0320 15:44:13.155094 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vr7n\" (UniqueName: \"kubernetes.io/projected/efac30ce-be74-429c-a7dc-25c46cfbc88e-kube-api-access-4vr7n\") pod \"dnsmasq-dns-77585f5f8c-cxhjs\" (UID: \"efac30ce-be74-429c-a7dc-25c46cfbc88e\") " pod="openstack/dnsmasq-dns-77585f5f8c-cxhjs" Mar 20 15:44:13 crc kubenswrapper[4779]: I0320 15:44:13.155160 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/efac30ce-be74-429c-a7dc-25c46cfbc88e-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-cxhjs\" (UID: \"efac30ce-be74-429c-a7dc-25c46cfbc88e\") " pod="openstack/dnsmasq-dns-77585f5f8c-cxhjs" Mar 20 15:44:13 crc kubenswrapper[4779]: I0320 15:44:13.156439 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/efac30ce-be74-429c-a7dc-25c46cfbc88e-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-cxhjs\" (UID: \"efac30ce-be74-429c-a7dc-25c46cfbc88e\") " pod="openstack/dnsmasq-dns-77585f5f8c-cxhjs" Mar 20 15:44:13 crc kubenswrapper[4779]: I0320 15:44:13.156561 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/efac30ce-be74-429c-a7dc-25c46cfbc88e-config\") pod \"dnsmasq-dns-77585f5f8c-cxhjs\" (UID: \"efac30ce-be74-429c-a7dc-25c46cfbc88e\") " pod="openstack/dnsmasq-dns-77585f5f8c-cxhjs" Mar 20 15:44:13 crc kubenswrapper[4779]: I0320 15:44:13.157791 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/efac30ce-be74-429c-a7dc-25c46cfbc88e-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-cxhjs\" (UID: \"efac30ce-be74-429c-a7dc-25c46cfbc88e\") " pod="openstack/dnsmasq-dns-77585f5f8c-cxhjs" Mar 20 15:44:13 crc kubenswrapper[4779]: I0320 15:44:13.157918 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/efac30ce-be74-429c-a7dc-25c46cfbc88e-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-cxhjs\" (UID: \"efac30ce-be74-429c-a7dc-25c46cfbc88e\") " pod="openstack/dnsmasq-dns-77585f5f8c-cxhjs" Mar 20 15:44:13 crc kubenswrapper[4779]: I0320 15:44:13.157918 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/efac30ce-be74-429c-a7dc-25c46cfbc88e-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-cxhjs\" (UID: \"efac30ce-be74-429c-a7dc-25c46cfbc88e\") " pod="openstack/dnsmasq-dns-77585f5f8c-cxhjs" Mar 20 15:44:13 crc kubenswrapper[4779]: I0320 15:44:13.183890 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vr7n\" (UniqueName: \"kubernetes.io/projected/efac30ce-be74-429c-a7dc-25c46cfbc88e-kube-api-access-4vr7n\") pod \"dnsmasq-dns-77585f5f8c-cxhjs\" (UID: \"efac30ce-be74-429c-a7dc-25c46cfbc88e\") " pod="openstack/dnsmasq-dns-77585f5f8c-cxhjs" Mar 20 15:44:13 crc kubenswrapper[4779]: I0320 15:44:13.200381 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-cxhjs" Mar 20 15:44:13 crc kubenswrapper[4779]: I0320 15:44:13.545557 4779 generic.go:334] "Generic (PLEG): container finished" podID="302d42e3-cf16-441d-bf35-567f0bb958b9" containerID="1b6689e48f0ff3a3d5ec9f393bb9833ee3462e60f3f553a28aabe9ea0f5ca99a" exitCode=0 Mar 20 15:44:13 crc kubenswrapper[4779]: I0320 15:44:13.545713 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s9wt4-config-rm9vn" event={"ID":"302d42e3-cf16-441d-bf35-567f0bb958b9","Type":"ContainerDied","Data":"1b6689e48f0ff3a3d5ec9f393bb9833ee3462e60f3f553a28aabe9ea0f5ca99a"} Mar 20 15:44:13 crc kubenswrapper[4779]: I0320 15:44:13.553001 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-5v4xs" Mar 20 15:44:13 crc kubenswrapper[4779]: I0320 15:44:13.553042 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5v4xs" event={"ID":"f4435049-2efc-449f-a5af-25747d3b64f0","Type":"ContainerDied","Data":"c4f6355b7499c5a758ce1d2c433eaee449bce1939053fc9d11da8f7f3b713fbe"} Mar 20 15:44:13 crc kubenswrapper[4779]: I0320 15:44:13.553064 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4f6355b7499c5a758ce1d2c433eaee449bce1939053fc9d11da8f7f3b713fbe" Mar 20 15:44:13 crc kubenswrapper[4779]: I0320 15:44:13.633472 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-cxhjs"] Mar 20 15:44:13 crc kubenswrapper[4779]: W0320 15:44:13.644455 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefac30ce_be74_429c_a7dc_25c46cfbc88e.slice/crio-111b425a733eed56c8a13c0a826fd063cf19d6731be5e6737ab3edb2de72d7ca WatchSource:0}: Error finding container 111b425a733eed56c8a13c0a826fd063cf19d6731be5e6737ab3edb2de72d7ca: Status 404 returned error can't find the 
container with id 111b425a733eed56c8a13c0a826fd063cf19d6731be5e6737ab3edb2de72d7ca Mar 20 15:44:14 crc kubenswrapper[4779]: I0320 15:44:14.566843 4779 generic.go:334] "Generic (PLEG): container finished" podID="efac30ce-be74-429c-a7dc-25c46cfbc88e" containerID="66c16e5a900e30619b326ef4614898c8d0611c6e4424f793d8f2a7b45617b533" exitCode=0 Mar 20 15:44:14 crc kubenswrapper[4779]: I0320 15:44:14.566895 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-cxhjs" event={"ID":"efac30ce-be74-429c-a7dc-25c46cfbc88e","Type":"ContainerDied","Data":"66c16e5a900e30619b326ef4614898c8d0611c6e4424f793d8f2a7b45617b533"} Mar 20 15:44:14 crc kubenswrapper[4779]: I0320 15:44:14.567468 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-cxhjs" event={"ID":"efac30ce-be74-429c-a7dc-25c46cfbc88e","Type":"ContainerStarted","Data":"111b425a733eed56c8a13c0a826fd063cf19d6731be5e6737ab3edb2de72d7ca"} Mar 20 15:44:14 crc kubenswrapper[4779]: I0320 15:44:14.717518 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:14 crc kubenswrapper[4779]: I0320 15:44:14.726221 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:15 crc kubenswrapper[4779]: I0320 15:44:15.577314 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:16 crc kubenswrapper[4779]: I0320 15:44:16.195791 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-s9wt4" Mar 20 15:44:17 crc kubenswrapper[4779]: I0320 15:44:17.993773 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 20 15:44:17 crc kubenswrapper[4779]: I0320 15:44:17.994770 4779 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/prometheus-metric-storage-0" podUID="e93aeb14-abaa-4cff-81c0-82d579020dc6" containerName="prometheus" containerID="cri-o://856c3e84fd751f08c78702db57717d11dc490e7ce8e32e54d378540c7eefb0b7" gracePeriod=600 Mar 20 15:44:17 crc kubenswrapper[4779]: I0320 15:44:17.994860 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e93aeb14-abaa-4cff-81c0-82d579020dc6" containerName="config-reloader" containerID="cri-o://882d26ce6c8eda7e63bf062c62f771e9ccd1f4cd79ab91d0649e025e119198b5" gracePeriod=600 Mar 20 15:44:17 crc kubenswrapper[4779]: I0320 15:44:17.994874 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e93aeb14-abaa-4cff-81c0-82d579020dc6" containerName="thanos-sidecar" containerID="cri-o://93abb91ff7ca83cfeb154eb992beae2f93a4290f56ecfa0f32320c1a6307a4c7" gracePeriod=600 Mar 20 15:44:18 crc kubenswrapper[4779]: I0320 15:44:18.299179 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 20 15:44:18 crc kubenswrapper[4779]: I0320 15:44:18.693694 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-4qv4b"] Mar 20 15:44:18 crc kubenswrapper[4779]: I0320 15:44:18.694880 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-4qv4b" Mar 20 15:44:18 crc kubenswrapper[4779]: I0320 15:44:18.701085 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data" Mar 20 15:44:18 crc kubenswrapper[4779]: I0320 15:44:18.701532 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-nzgjd" Mar 20 15:44:18 crc kubenswrapper[4779]: I0320 15:44:18.717003 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-pk9nl"] Mar 20 15:44:18 crc kubenswrapper[4779]: I0320 15:44:18.718269 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-pk9nl" Mar 20 15:44:18 crc kubenswrapper[4779]: I0320 15:44:18.728691 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-4qv4b"] Mar 20 15:44:18 crc kubenswrapper[4779]: I0320 15:44:18.752296 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-pk9nl"] Mar 20 15:44:18 crc kubenswrapper[4779]: I0320 15:44:18.763266 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d89bfb9-3cdc-4e93-899a-361f5d5cf408-combined-ca-bundle\") pod \"watcher-db-sync-4qv4b\" (UID: \"8d89bfb9-3cdc-4e93-899a-361f5d5cf408\") " pod="openstack/watcher-db-sync-4qv4b" Mar 20 15:44:18 crc kubenswrapper[4779]: I0320 15:44:18.763450 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj8c6\" (UniqueName: \"kubernetes.io/projected/8d89bfb9-3cdc-4e93-899a-361f5d5cf408-kube-api-access-vj8c6\") pod \"watcher-db-sync-4qv4b\" (UID: \"8d89bfb9-3cdc-4e93-899a-361f5d5cf408\") " pod="openstack/watcher-db-sync-4qv4b" Mar 20 15:44:18 crc kubenswrapper[4779]: I0320 15:44:18.763518 4779 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7de7ca4b-8d4c-41be-bd83-f1430eefc89f-operator-scripts\") pod \"cinder-db-create-pk9nl\" (UID: \"7de7ca4b-8d4c-41be-bd83-f1430eefc89f\") " pod="openstack/cinder-db-create-pk9nl" Mar 20 15:44:18 crc kubenswrapper[4779]: I0320 15:44:18.763577 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8d89bfb9-3cdc-4e93-899a-361f5d5cf408-db-sync-config-data\") pod \"watcher-db-sync-4qv4b\" (UID: \"8d89bfb9-3cdc-4e93-899a-361f5d5cf408\") " pod="openstack/watcher-db-sync-4qv4b" Mar 20 15:44:18 crc kubenswrapper[4779]: I0320 15:44:18.763616 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d89bfb9-3cdc-4e93-899a-361f5d5cf408-config-data\") pod \"watcher-db-sync-4qv4b\" (UID: \"8d89bfb9-3cdc-4e93-899a-361f5d5cf408\") " pod="openstack/watcher-db-sync-4qv4b" Mar 20 15:44:18 crc kubenswrapper[4779]: I0320 15:44:18.763734 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t4mh\" (UniqueName: \"kubernetes.io/projected/7de7ca4b-8d4c-41be-bd83-f1430eefc89f-kube-api-access-6t4mh\") pod \"cinder-db-create-pk9nl\" (UID: \"7de7ca4b-8d4c-41be-bd83-f1430eefc89f\") " pod="openstack/cinder-db-create-pk9nl" Mar 20 15:44:18 crc kubenswrapper[4779]: I0320 15:44:18.865856 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj8c6\" (UniqueName: \"kubernetes.io/projected/8d89bfb9-3cdc-4e93-899a-361f5d5cf408-kube-api-access-vj8c6\") pod \"watcher-db-sync-4qv4b\" (UID: \"8d89bfb9-3cdc-4e93-899a-361f5d5cf408\") " pod="openstack/watcher-db-sync-4qv4b" Mar 20 15:44:18 crc kubenswrapper[4779]: I0320 15:44:18.865918 4779 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7de7ca4b-8d4c-41be-bd83-f1430eefc89f-operator-scripts\") pod \"cinder-db-create-pk9nl\" (UID: \"7de7ca4b-8d4c-41be-bd83-f1430eefc89f\") " pod="openstack/cinder-db-create-pk9nl" Mar 20 15:44:18 crc kubenswrapper[4779]: I0320 15:44:18.865971 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8d89bfb9-3cdc-4e93-899a-361f5d5cf408-db-sync-config-data\") pod \"watcher-db-sync-4qv4b\" (UID: \"8d89bfb9-3cdc-4e93-899a-361f5d5cf408\") " pod="openstack/watcher-db-sync-4qv4b" Mar 20 15:44:18 crc kubenswrapper[4779]: I0320 15:44:18.866003 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d89bfb9-3cdc-4e93-899a-361f5d5cf408-config-data\") pod \"watcher-db-sync-4qv4b\" (UID: \"8d89bfb9-3cdc-4e93-899a-361f5d5cf408\") " pod="openstack/watcher-db-sync-4qv4b" Mar 20 15:44:18 crc kubenswrapper[4779]: I0320 15:44:18.866072 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t4mh\" (UniqueName: \"kubernetes.io/projected/7de7ca4b-8d4c-41be-bd83-f1430eefc89f-kube-api-access-6t4mh\") pod \"cinder-db-create-pk9nl\" (UID: \"7de7ca4b-8d4c-41be-bd83-f1430eefc89f\") " pod="openstack/cinder-db-create-pk9nl" Mar 20 15:44:18 crc kubenswrapper[4779]: I0320 15:44:18.866166 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d89bfb9-3cdc-4e93-899a-361f5d5cf408-combined-ca-bundle\") pod \"watcher-db-sync-4qv4b\" (UID: \"8d89bfb9-3cdc-4e93-899a-361f5d5cf408\") " pod="openstack/watcher-db-sync-4qv4b" Mar 20 15:44:18 crc kubenswrapper[4779]: I0320 15:44:18.867353 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/7de7ca4b-8d4c-41be-bd83-f1430eefc89f-operator-scripts\") pod \"cinder-db-create-pk9nl\" (UID: \"7de7ca4b-8d4c-41be-bd83-f1430eefc89f\") " pod="openstack/cinder-db-create-pk9nl" Mar 20 15:44:18 crc kubenswrapper[4779]: I0320 15:44:18.875874 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d89bfb9-3cdc-4e93-899a-361f5d5cf408-config-data\") pod \"watcher-db-sync-4qv4b\" (UID: \"8d89bfb9-3cdc-4e93-899a-361f5d5cf408\") " pod="openstack/watcher-db-sync-4qv4b" Mar 20 15:44:18 crc kubenswrapper[4779]: I0320 15:44:18.876902 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8d89bfb9-3cdc-4e93-899a-361f5d5cf408-db-sync-config-data\") pod \"watcher-db-sync-4qv4b\" (UID: \"8d89bfb9-3cdc-4e93-899a-361f5d5cf408\") " pod="openstack/watcher-db-sync-4qv4b" Mar 20 15:44:18 crc kubenswrapper[4779]: I0320 15:44:18.893259 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d89bfb9-3cdc-4e93-899a-361f5d5cf408-combined-ca-bundle\") pod \"watcher-db-sync-4qv4b\" (UID: \"8d89bfb9-3cdc-4e93-899a-361f5d5cf408\") " pod="openstack/watcher-db-sync-4qv4b" Mar 20 15:44:18 crc kubenswrapper[4779]: I0320 15:44:18.904766 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-6kdxq"] Mar 20 15:44:18 crc kubenswrapper[4779]: I0320 15:44:18.906034 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-6kdxq" Mar 20 15:44:18 crc kubenswrapper[4779]: I0320 15:44:18.909537 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj8c6\" (UniqueName: \"kubernetes.io/projected/8d89bfb9-3cdc-4e93-899a-361f5d5cf408-kube-api-access-vj8c6\") pod \"watcher-db-sync-4qv4b\" (UID: \"8d89bfb9-3cdc-4e93-899a-361f5d5cf408\") " pod="openstack/watcher-db-sync-4qv4b" Mar 20 15:44:18 crc kubenswrapper[4779]: I0320 15:44:18.914765 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t4mh\" (UniqueName: \"kubernetes.io/projected/7de7ca4b-8d4c-41be-bd83-f1430eefc89f-kube-api-access-6t4mh\") pod \"cinder-db-create-pk9nl\" (UID: \"7de7ca4b-8d4c-41be-bd83-f1430eefc89f\") " pod="openstack/cinder-db-create-pk9nl" Mar 20 15:44:18 crc kubenswrapper[4779]: I0320 15:44:18.938265 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-a00e-account-create-update-5d2t5"] Mar 20 15:44:18 crc kubenswrapper[4779]: I0320 15:44:18.980470 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a00e-account-create-update-5d2t5" Mar 20 15:44:18 crc kubenswrapper[4779]: I0320 15:44:18.987860 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.004046 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-6kdxq"] Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.013279 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-4qv4b" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.036648 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-pk9nl" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.084729 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfvlm\" (UniqueName: \"kubernetes.io/projected/3e10831a-7f1b-4066-bcf0-0c08ad7b16c0-kube-api-access-mfvlm\") pod \"barbican-db-create-6kdxq\" (UID: \"3e10831a-7f1b-4066-bcf0-0c08ad7b16c0\") " pod="openstack/barbican-db-create-6kdxq" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.084875 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e18c51e-3afa-4453-9a88-ee43ca5e563b-operator-scripts\") pod \"cinder-a00e-account-create-update-5d2t5\" (UID: \"6e18c51e-3afa-4453-9a88-ee43ca5e563b\") " pod="openstack/cinder-a00e-account-create-update-5d2t5" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.084914 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e10831a-7f1b-4066-bcf0-0c08ad7b16c0-operator-scripts\") pod \"barbican-db-create-6kdxq\" (UID: \"3e10831a-7f1b-4066-bcf0-0c08ad7b16c0\") " pod="openstack/barbican-db-create-6kdxq" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.084946 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzkp6\" (UniqueName: \"kubernetes.io/projected/6e18c51e-3afa-4453-9a88-ee43ca5e563b-kube-api-access-tzkp6\") pod \"cinder-a00e-account-create-update-5d2t5\" (UID: \"6e18c51e-3afa-4453-9a88-ee43ca5e563b\") " pod="openstack/cinder-a00e-account-create-update-5d2t5" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.100627 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a00e-account-create-update-5d2t5"] Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.113703 
4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-p6dg6"] Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.115121 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-p6dg6" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.149063 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-p6dg6"] Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.156457 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-vkj28"] Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.157783 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vkj28" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.161577 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.161645 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.161932 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lrbdn" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.162078 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.180712 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-vkj28"] Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.186239 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e18c51e-3afa-4453-9a88-ee43ca5e563b-operator-scripts\") pod \"cinder-a00e-account-create-update-5d2t5\" (UID: \"6e18c51e-3afa-4453-9a88-ee43ca5e563b\") " pod="openstack/cinder-a00e-account-create-update-5d2t5" Mar 
20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.186336 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e10831a-7f1b-4066-bcf0-0c08ad7b16c0-operator-scripts\") pod \"barbican-db-create-6kdxq\" (UID: \"3e10831a-7f1b-4066-bcf0-0c08ad7b16c0\") " pod="openstack/barbican-db-create-6kdxq" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.186388 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzkp6\" (UniqueName: \"kubernetes.io/projected/6e18c51e-3afa-4453-9a88-ee43ca5e563b-kube-api-access-tzkp6\") pod \"cinder-a00e-account-create-update-5d2t5\" (UID: \"6e18c51e-3afa-4453-9a88-ee43ca5e563b\") " pod="openstack/cinder-a00e-account-create-update-5d2t5" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.186461 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfvlm\" (UniqueName: \"kubernetes.io/projected/3e10831a-7f1b-4066-bcf0-0c08ad7b16c0-kube-api-access-mfvlm\") pod \"barbican-db-create-6kdxq\" (UID: \"3e10831a-7f1b-4066-bcf0-0c08ad7b16c0\") " pod="openstack/barbican-db-create-6kdxq" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.186946 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e18c51e-3afa-4453-9a88-ee43ca5e563b-operator-scripts\") pod \"cinder-a00e-account-create-update-5d2t5\" (UID: \"6e18c51e-3afa-4453-9a88-ee43ca5e563b\") " pod="openstack/cinder-a00e-account-create-update-5d2t5" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.187337 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e10831a-7f1b-4066-bcf0-0c08ad7b16c0-operator-scripts\") pod \"barbican-db-create-6kdxq\" (UID: \"3e10831a-7f1b-4066-bcf0-0c08ad7b16c0\") " pod="openstack/barbican-db-create-6kdxq" Mar 20 
15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.198466 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-a5fd-account-create-update-9c6l6"] Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.199947 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a5fd-account-create-update-9c6l6" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.202628 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.216144 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzkp6\" (UniqueName: \"kubernetes.io/projected/6e18c51e-3afa-4453-9a88-ee43ca5e563b-kube-api-access-tzkp6\") pod \"cinder-a00e-account-create-update-5d2t5\" (UID: \"6e18c51e-3afa-4453-9a88-ee43ca5e563b\") " pod="openstack/cinder-a00e-account-create-update-5d2t5" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.216711 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfvlm\" (UniqueName: \"kubernetes.io/projected/3e10831a-7f1b-4066-bcf0-0c08ad7b16c0-kube-api-access-mfvlm\") pod \"barbican-db-create-6kdxq\" (UID: \"3e10831a-7f1b-4066-bcf0-0c08ad7b16c0\") " pod="openstack/barbican-db-create-6kdxq" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.220157 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-a5fd-account-create-update-9c6l6"] Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.287687 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4db09a8d-78e6-403a-842d-4c138a9699be-operator-scripts\") pod \"neutron-db-create-p6dg6\" (UID: \"4db09a8d-78e6-403a-842d-4c138a9699be\") " pod="openstack/neutron-db-create-p6dg6" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.287805 4779 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a288511-2254-442c-a931-bd59dd0a3b29-config-data\") pod \"keystone-db-sync-vkj28\" (UID: \"9a288511-2254-442c-a931-bd59dd0a3b29\") " pod="openstack/keystone-db-sync-vkj28" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.287992 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blpk4\" (UniqueName: \"kubernetes.io/projected/f226989b-bb85-41dc-8e16-631a0740361a-kube-api-access-blpk4\") pod \"barbican-a5fd-account-create-update-9c6l6\" (UID: \"f226989b-bb85-41dc-8e16-631a0740361a\") " pod="openstack/barbican-a5fd-account-create-update-9c6l6" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.288092 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thzw5\" (UniqueName: \"kubernetes.io/projected/4db09a8d-78e6-403a-842d-4c138a9699be-kube-api-access-thzw5\") pod \"neutron-db-create-p6dg6\" (UID: \"4db09a8d-78e6-403a-842d-4c138a9699be\") " pod="openstack/neutron-db-create-p6dg6" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.288238 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a288511-2254-442c-a931-bd59dd0a3b29-combined-ca-bundle\") pod \"keystone-db-sync-vkj28\" (UID: \"9a288511-2254-442c-a931-bd59dd0a3b29\") " pod="openstack/keystone-db-sync-vkj28" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.288390 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbdtl\" (UniqueName: \"kubernetes.io/projected/9a288511-2254-442c-a931-bd59dd0a3b29-kube-api-access-zbdtl\") pod \"keystone-db-sync-vkj28\" (UID: \"9a288511-2254-442c-a931-bd59dd0a3b29\") " pod="openstack/keystone-db-sync-vkj28" Mar 20 
15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.288431 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f226989b-bb85-41dc-8e16-631a0740361a-operator-scripts\") pod \"barbican-a5fd-account-create-update-9c6l6\" (UID: \"f226989b-bb85-41dc-8e16-631a0740361a\") " pod="openstack/barbican-a5fd-account-create-update-9c6l6" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.300879 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-9a19-account-create-update-w5t6w"] Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.302614 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9a19-account-create-update-w5t6w" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.304189 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.310541 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-6kdxq" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.310898 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9a19-account-create-update-w5t6w"] Mar 20 15:44:19 crc kubenswrapper[4779]: E0320 15:44:19.329489 4779 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode93aeb14_abaa_4cff_81c0_82d579020dc6.slice/crio-856c3e84fd751f08c78702db57717d11dc490e7ce8e32e54d378540c7eefb0b7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode93aeb14_abaa_4cff_81c0_82d579020dc6.slice/crio-conmon-882d26ce6c8eda7e63bf062c62f771e9ccd1f4cd79ab91d0649e025e119198b5.scope\": RecentStats: unable to find data in memory cache]" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.332878 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-a00e-account-create-update-5d2t5" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.392674 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96pgv\" (UniqueName: \"kubernetes.io/projected/79528236-508c-451f-906a-7ad8b3f638ef-kube-api-access-96pgv\") pod \"neutron-9a19-account-create-update-w5t6w\" (UID: \"79528236-508c-451f-906a-7ad8b3f638ef\") " pod="openstack/neutron-9a19-account-create-update-w5t6w" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.392771 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blpk4\" (UniqueName: \"kubernetes.io/projected/f226989b-bb85-41dc-8e16-631a0740361a-kube-api-access-blpk4\") pod \"barbican-a5fd-account-create-update-9c6l6\" (UID: \"f226989b-bb85-41dc-8e16-631a0740361a\") " pod="openstack/barbican-a5fd-account-create-update-9c6l6" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.392833 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thzw5\" (UniqueName: \"kubernetes.io/projected/4db09a8d-78e6-403a-842d-4c138a9699be-kube-api-access-thzw5\") pod \"neutron-db-create-p6dg6\" (UID: \"4db09a8d-78e6-403a-842d-4c138a9699be\") " pod="openstack/neutron-db-create-p6dg6" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.392869 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a288511-2254-442c-a931-bd59dd0a3b29-combined-ca-bundle\") pod \"keystone-db-sync-vkj28\" (UID: \"9a288511-2254-442c-a931-bd59dd0a3b29\") " pod="openstack/keystone-db-sync-vkj28" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.392889 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbdtl\" (UniqueName: 
\"kubernetes.io/projected/9a288511-2254-442c-a931-bd59dd0a3b29-kube-api-access-zbdtl\") pod \"keystone-db-sync-vkj28\" (UID: \"9a288511-2254-442c-a931-bd59dd0a3b29\") " pod="openstack/keystone-db-sync-vkj28" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.392922 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f226989b-bb85-41dc-8e16-631a0740361a-operator-scripts\") pod \"barbican-a5fd-account-create-update-9c6l6\" (UID: \"f226989b-bb85-41dc-8e16-631a0740361a\") " pod="openstack/barbican-a5fd-account-create-update-9c6l6" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.393049 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4db09a8d-78e6-403a-842d-4c138a9699be-operator-scripts\") pod \"neutron-db-create-p6dg6\" (UID: \"4db09a8d-78e6-403a-842d-4c138a9699be\") " pod="openstack/neutron-db-create-p6dg6" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.393083 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a288511-2254-442c-a931-bd59dd0a3b29-config-data\") pod \"keystone-db-sync-vkj28\" (UID: \"9a288511-2254-442c-a931-bd59dd0a3b29\") " pod="openstack/keystone-db-sync-vkj28" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.393134 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79528236-508c-451f-906a-7ad8b3f638ef-operator-scripts\") pod \"neutron-9a19-account-create-update-w5t6w\" (UID: \"79528236-508c-451f-906a-7ad8b3f638ef\") " pod="openstack/neutron-9a19-account-create-update-w5t6w" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.394761 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/f226989b-bb85-41dc-8e16-631a0740361a-operator-scripts\") pod \"barbican-a5fd-account-create-update-9c6l6\" (UID: \"f226989b-bb85-41dc-8e16-631a0740361a\") " pod="openstack/barbican-a5fd-account-create-update-9c6l6" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.397746 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4db09a8d-78e6-403a-842d-4c138a9699be-operator-scripts\") pod \"neutron-db-create-p6dg6\" (UID: \"4db09a8d-78e6-403a-842d-4c138a9699be\") " pod="openstack/neutron-db-create-p6dg6" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.411330 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a288511-2254-442c-a931-bd59dd0a3b29-config-data\") pod \"keystone-db-sync-vkj28\" (UID: \"9a288511-2254-442c-a931-bd59dd0a3b29\") " pod="openstack/keystone-db-sync-vkj28" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.411805 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a288511-2254-442c-a931-bd59dd0a3b29-combined-ca-bundle\") pod \"keystone-db-sync-vkj28\" (UID: \"9a288511-2254-442c-a931-bd59dd0a3b29\") " pod="openstack/keystone-db-sync-vkj28" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.415641 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blpk4\" (UniqueName: \"kubernetes.io/projected/f226989b-bb85-41dc-8e16-631a0740361a-kube-api-access-blpk4\") pod \"barbican-a5fd-account-create-update-9c6l6\" (UID: \"f226989b-bb85-41dc-8e16-631a0740361a\") " pod="openstack/barbican-a5fd-account-create-update-9c6l6" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.415731 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbdtl\" (UniqueName: 
\"kubernetes.io/projected/9a288511-2254-442c-a931-bd59dd0a3b29-kube-api-access-zbdtl\") pod \"keystone-db-sync-vkj28\" (UID: \"9a288511-2254-442c-a931-bd59dd0a3b29\") " pod="openstack/keystone-db-sync-vkj28" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.424784 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thzw5\" (UniqueName: \"kubernetes.io/projected/4db09a8d-78e6-403a-842d-4c138a9699be-kube-api-access-thzw5\") pod \"neutron-db-create-p6dg6\" (UID: \"4db09a8d-78e6-403a-842d-4c138a9699be\") " pod="openstack/neutron-db-create-p6dg6" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.484631 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-p6dg6" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.495575 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vkj28" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.496181 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79528236-508c-451f-906a-7ad8b3f638ef-operator-scripts\") pod \"neutron-9a19-account-create-update-w5t6w\" (UID: \"79528236-508c-451f-906a-7ad8b3f638ef\") " pod="openstack/neutron-9a19-account-create-update-w5t6w" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.496248 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96pgv\" (UniqueName: \"kubernetes.io/projected/79528236-508c-451f-906a-7ad8b3f638ef-kube-api-access-96pgv\") pod \"neutron-9a19-account-create-update-w5t6w\" (UID: \"79528236-508c-451f-906a-7ad8b3f638ef\") " pod="openstack/neutron-9a19-account-create-update-w5t6w" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.497759 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/79528236-508c-451f-906a-7ad8b3f638ef-operator-scripts\") pod \"neutron-9a19-account-create-update-w5t6w\" (UID: \"79528236-508c-451f-906a-7ad8b3f638ef\") " pod="openstack/neutron-9a19-account-create-update-w5t6w" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.514809 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96pgv\" (UniqueName: \"kubernetes.io/projected/79528236-508c-451f-906a-7ad8b3f638ef-kube-api-access-96pgv\") pod \"neutron-9a19-account-create-update-w5t6w\" (UID: \"79528236-508c-451f-906a-7ad8b3f638ef\") " pod="openstack/neutron-9a19-account-create-update-w5t6w" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.587469 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a5fd-account-create-update-9c6l6" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.612688 4779 generic.go:334] "Generic (PLEG): container finished" podID="e93aeb14-abaa-4cff-81c0-82d579020dc6" containerID="93abb91ff7ca83cfeb154eb992beae2f93a4290f56ecfa0f32320c1a6307a4c7" exitCode=0 Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.612718 4779 generic.go:334] "Generic (PLEG): container finished" podID="e93aeb14-abaa-4cff-81c0-82d579020dc6" containerID="882d26ce6c8eda7e63bf062c62f771e9ccd1f4cd79ab91d0649e025e119198b5" exitCode=0 Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.612725 4779 generic.go:334] "Generic (PLEG): container finished" podID="e93aeb14-abaa-4cff-81c0-82d579020dc6" containerID="856c3e84fd751f08c78702db57717d11dc490e7ce8e32e54d378540c7eefb0b7" exitCode=0 Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.612752 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e93aeb14-abaa-4cff-81c0-82d579020dc6","Type":"ContainerDied","Data":"93abb91ff7ca83cfeb154eb992beae2f93a4290f56ecfa0f32320c1a6307a4c7"} Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.612775 
4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e93aeb14-abaa-4cff-81c0-82d579020dc6","Type":"ContainerDied","Data":"882d26ce6c8eda7e63bf062c62f771e9ccd1f4cd79ab91d0649e025e119198b5"} Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.612786 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e93aeb14-abaa-4cff-81c0-82d579020dc6","Type":"ContainerDied","Data":"856c3e84fd751f08c78702db57717d11dc490e7ce8e32e54d378540c7eefb0b7"} Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.636603 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9a19-account-create-update-w5t6w" Mar 20 15:44:19 crc kubenswrapper[4779]: I0320 15:44:19.717801 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="e93aeb14-abaa-4cff-81c0-82d579020dc6" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.112:9090/-/ready\": dial tcp 10.217.0.112:9090: connect: connection refused" Mar 20 15:44:22 crc kubenswrapper[4779]: I0320 15:44:22.599511 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s9wt4-config-rm9vn" Mar 20 15:44:22 crc kubenswrapper[4779]: I0320 15:44:22.640816 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s9wt4-config-rm9vn" event={"ID":"302d42e3-cf16-441d-bf35-567f0bb958b9","Type":"ContainerDied","Data":"bd9f8c41ca474e5f71b0f83c3b49758b5d63efdf6421e1c3b429153eb6733053"} Mar 20 15:44:22 crc kubenswrapper[4779]: I0320 15:44:22.640854 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd9f8c41ca474e5f71b0f83c3b49758b5d63efdf6421e1c3b429153eb6733053" Mar 20 15:44:22 crc kubenswrapper[4779]: I0320 15:44:22.640914 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-s9wt4-config-rm9vn" Mar 20 15:44:22 crc kubenswrapper[4779]: I0320 15:44:22.768196 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/302d42e3-cf16-441d-bf35-567f0bb958b9-var-run\") pod \"302d42e3-cf16-441d-bf35-567f0bb958b9\" (UID: \"302d42e3-cf16-441d-bf35-567f0bb958b9\") " Mar 20 15:44:22 crc kubenswrapper[4779]: I0320 15:44:22.768263 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/302d42e3-cf16-441d-bf35-567f0bb958b9-var-run-ovn\") pod \"302d42e3-cf16-441d-bf35-567f0bb958b9\" (UID: \"302d42e3-cf16-441d-bf35-567f0bb958b9\") " Mar 20 15:44:22 crc kubenswrapper[4779]: I0320 15:44:22.768306 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/302d42e3-cf16-441d-bf35-567f0bb958b9-additional-scripts\") pod \"302d42e3-cf16-441d-bf35-567f0bb958b9\" (UID: \"302d42e3-cf16-441d-bf35-567f0bb958b9\") " Mar 20 15:44:22 crc kubenswrapper[4779]: I0320 15:44:22.768421 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/302d42e3-cf16-441d-bf35-567f0bb958b9-var-log-ovn\") pod \"302d42e3-cf16-441d-bf35-567f0bb958b9\" (UID: \"302d42e3-cf16-441d-bf35-567f0bb958b9\") " Mar 20 15:44:22 crc kubenswrapper[4779]: I0320 15:44:22.768616 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/302d42e3-cf16-441d-bf35-567f0bb958b9-var-run" (OuterVolumeSpecName: "var-run") pod "302d42e3-cf16-441d-bf35-567f0bb958b9" (UID: "302d42e3-cf16-441d-bf35-567f0bb958b9"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:44:22 crc kubenswrapper[4779]: I0320 15:44:22.768696 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/302d42e3-cf16-441d-bf35-567f0bb958b9-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "302d42e3-cf16-441d-bf35-567f0bb958b9" (UID: "302d42e3-cf16-441d-bf35-567f0bb958b9"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:44:22 crc kubenswrapper[4779]: I0320 15:44:22.768754 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/302d42e3-cf16-441d-bf35-567f0bb958b9-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "302d42e3-cf16-441d-bf35-567f0bb958b9" (UID: "302d42e3-cf16-441d-bf35-567f0bb958b9"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:44:22 crc kubenswrapper[4779]: I0320 15:44:22.768810 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/302d42e3-cf16-441d-bf35-567f0bb958b9-scripts\") pod \"302d42e3-cf16-441d-bf35-567f0bb958b9\" (UID: \"302d42e3-cf16-441d-bf35-567f0bb958b9\") " Mar 20 15:44:22 crc kubenswrapper[4779]: I0320 15:44:22.768847 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrx9r\" (UniqueName: \"kubernetes.io/projected/302d42e3-cf16-441d-bf35-567f0bb958b9-kube-api-access-hrx9r\") pod \"302d42e3-cf16-441d-bf35-567f0bb958b9\" (UID: \"302d42e3-cf16-441d-bf35-567f0bb958b9\") " Mar 20 15:44:22 crc kubenswrapper[4779]: I0320 15:44:22.769430 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/302d42e3-cf16-441d-bf35-567f0bb958b9-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "302d42e3-cf16-441d-bf35-567f0bb958b9" (UID: "302d42e3-cf16-441d-bf35-567f0bb958b9"). 
InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:44:22 crc kubenswrapper[4779]: I0320 15:44:22.769867 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/302d42e3-cf16-441d-bf35-567f0bb958b9-scripts" (OuterVolumeSpecName: "scripts") pod "302d42e3-cf16-441d-bf35-567f0bb958b9" (UID: "302d42e3-cf16-441d-bf35-567f0bb958b9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:44:22 crc kubenswrapper[4779]: I0320 15:44:22.770394 4779 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/302d42e3-cf16-441d-bf35-567f0bb958b9-var-run\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:22 crc kubenswrapper[4779]: I0320 15:44:22.770413 4779 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/302d42e3-cf16-441d-bf35-567f0bb958b9-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:22 crc kubenswrapper[4779]: I0320 15:44:22.770422 4779 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/302d42e3-cf16-441d-bf35-567f0bb958b9-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:22 crc kubenswrapper[4779]: I0320 15:44:22.770432 4779 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/302d42e3-cf16-441d-bf35-567f0bb958b9-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:22 crc kubenswrapper[4779]: I0320 15:44:22.770440 4779 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/302d42e3-cf16-441d-bf35-567f0bb958b9-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:22 crc kubenswrapper[4779]: I0320 15:44:22.772916 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/302d42e3-cf16-441d-bf35-567f0bb958b9-kube-api-access-hrx9r" (OuterVolumeSpecName: "kube-api-access-hrx9r") pod "302d42e3-cf16-441d-bf35-567f0bb958b9" (UID: "302d42e3-cf16-441d-bf35-567f0bb958b9"). InnerVolumeSpecName "kube-api-access-hrx9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:44:22 crc kubenswrapper[4779]: I0320 15:44:22.841570 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:22 crc kubenswrapper[4779]: I0320 15:44:22.890070 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrx9r\" (UniqueName: \"kubernetes.io/projected/302d42e3-cf16-441d-bf35-567f0bb958b9-kube-api-access-hrx9r\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:22 crc kubenswrapper[4779]: I0320 15:44:22.991749 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e93aeb14-abaa-4cff-81c0-82d579020dc6-thanos-prometheus-http-client-file\") pod \"e93aeb14-abaa-4cff-81c0-82d579020dc6\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " Mar 20 15:44:22 crc kubenswrapper[4779]: I0320 15:44:22.991805 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cg4r\" (UniqueName: \"kubernetes.io/projected/e93aeb14-abaa-4cff-81c0-82d579020dc6-kube-api-access-2cg4r\") pod \"e93aeb14-abaa-4cff-81c0-82d579020dc6\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " Mar 20 15:44:22 crc kubenswrapper[4779]: I0320 15:44:22.991846 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e93aeb14-abaa-4cff-81c0-82d579020dc6-prometheus-metric-storage-rulefiles-1\") pod \"e93aeb14-abaa-4cff-81c0-82d579020dc6\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " Mar 20 15:44:22 crc kubenswrapper[4779]: I0320 
15:44:22.991898 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e93aeb14-abaa-4cff-81c0-82d579020dc6-config-out\") pod \"e93aeb14-abaa-4cff-81c0-82d579020dc6\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " Mar 20 15:44:22 crc kubenswrapper[4779]: I0320 15:44:22.991964 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e93aeb14-abaa-4cff-81c0-82d579020dc6-prometheus-metric-storage-rulefiles-2\") pod \"e93aeb14-abaa-4cff-81c0-82d579020dc6\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " Mar 20 15:44:22 crc kubenswrapper[4779]: I0320 15:44:22.991992 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e93aeb14-abaa-4cff-81c0-82d579020dc6-prometheus-metric-storage-rulefiles-0\") pod \"e93aeb14-abaa-4cff-81c0-82d579020dc6\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " Mar 20 15:44:22 crc kubenswrapper[4779]: I0320 15:44:22.992013 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e93aeb14-abaa-4cff-81c0-82d579020dc6-tls-assets\") pod \"e93aeb14-abaa-4cff-81c0-82d579020dc6\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " Mar 20 15:44:22 crc kubenswrapper[4779]: I0320 15:44:22.992191 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7954f494-7bea-46f8-a58f-4de62431c2b8\") pod \"e93aeb14-abaa-4cff-81c0-82d579020dc6\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " Mar 20 15:44:22 crc kubenswrapper[4779]: I0320 15:44:22.992278 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/e93aeb14-abaa-4cff-81c0-82d579020dc6-config\") pod \"e93aeb14-abaa-4cff-81c0-82d579020dc6\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " Mar 20 15:44:22 crc kubenswrapper[4779]: I0320 15:44:22.992298 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e93aeb14-abaa-4cff-81c0-82d579020dc6-web-config\") pod \"e93aeb14-abaa-4cff-81c0-82d579020dc6\" (UID: \"e93aeb14-abaa-4cff-81c0-82d579020dc6\") " Mar 20 15:44:22 crc kubenswrapper[4779]: I0320 15:44:22.992720 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e93aeb14-abaa-4cff-81c0-82d579020dc6-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "e93aeb14-abaa-4cff-81c0-82d579020dc6" (UID: "e93aeb14-abaa-4cff-81c0-82d579020dc6"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:44:22 crc kubenswrapper[4779]: I0320 15:44:22.992732 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e93aeb14-abaa-4cff-81c0-82d579020dc6-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "e93aeb14-abaa-4cff-81c0-82d579020dc6" (UID: "e93aeb14-abaa-4cff-81c0-82d579020dc6"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:44:22 crc kubenswrapper[4779]: I0320 15:44:22.993373 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e93aeb14-abaa-4cff-81c0-82d579020dc6-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "e93aeb14-abaa-4cff-81c0-82d579020dc6" (UID: "e93aeb14-abaa-4cff-81c0-82d579020dc6"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.006909 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e93aeb14-abaa-4cff-81c0-82d579020dc6-kube-api-access-2cg4r" (OuterVolumeSpecName: "kube-api-access-2cg4r") pod "e93aeb14-abaa-4cff-81c0-82d579020dc6" (UID: "e93aeb14-abaa-4cff-81c0-82d579020dc6"). InnerVolumeSpecName "kube-api-access-2cg4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.011666 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e93aeb14-abaa-4cff-81c0-82d579020dc6-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "e93aeb14-abaa-4cff-81c0-82d579020dc6" (UID: "e93aeb14-abaa-4cff-81c0-82d579020dc6"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.012574 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e93aeb14-abaa-4cff-81c0-82d579020dc6-config-out" (OuterVolumeSpecName: "config-out") pod "e93aeb14-abaa-4cff-81c0-82d579020dc6" (UID: "e93aeb14-abaa-4cff-81c0-82d579020dc6"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.014707 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e93aeb14-abaa-4cff-81c0-82d579020dc6-config" (OuterVolumeSpecName: "config") pod "e93aeb14-abaa-4cff-81c0-82d579020dc6" (UID: "e93aeb14-abaa-4cff-81c0-82d579020dc6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.015039 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e93aeb14-abaa-4cff-81c0-82d579020dc6-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "e93aeb14-abaa-4cff-81c0-82d579020dc6" (UID: "e93aeb14-abaa-4cff-81c0-82d579020dc6"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.036310 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e93aeb14-abaa-4cff-81c0-82d579020dc6-web-config" (OuterVolumeSpecName: "web-config") pod "e93aeb14-abaa-4cff-81c0-82d579020dc6" (UID: "e93aeb14-abaa-4cff-81c0-82d579020dc6"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.046784 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7954f494-7bea-46f8-a58f-4de62431c2b8" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "e93aeb14-abaa-4cff-81c0-82d579020dc6" (UID: "e93aeb14-abaa-4cff-81c0-82d579020dc6"). InnerVolumeSpecName "pvc-7954f494-7bea-46f8-a58f-4de62431c2b8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.095763 4779 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-7954f494-7bea-46f8-a58f-4de62431c2b8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7954f494-7bea-46f8-a58f-4de62431c2b8\") on node \"crc\" " Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.095794 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e93aeb14-abaa-4cff-81c0-82d579020dc6-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.095805 4779 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e93aeb14-abaa-4cff-81c0-82d579020dc6-web-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.095831 4779 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e93aeb14-abaa-4cff-81c0-82d579020dc6-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.095843 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cg4r\" (UniqueName: \"kubernetes.io/projected/e93aeb14-abaa-4cff-81c0-82d579020dc6-kube-api-access-2cg4r\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.095855 4779 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e93aeb14-abaa-4cff-81c0-82d579020dc6-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.095864 4779 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/e93aeb14-abaa-4cff-81c0-82d579020dc6-config-out\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.095873 4779 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e93aeb14-abaa-4cff-81c0-82d579020dc6-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.095883 4779 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e93aeb14-abaa-4cff-81c0-82d579020dc6-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.095908 4779 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e93aeb14-abaa-4cff-81c0-82d579020dc6-tls-assets\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.116187 4779 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.116311 4779 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-7954f494-7bea-46f8-a58f-4de62431c2b8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7954f494-7bea-46f8-a58f-4de62431c2b8") on node "crc" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.197013 4779 reconciler_common.go:293] "Volume detached for volume \"pvc-7954f494-7bea-46f8-a58f-4de62431c2b8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7954f494-7bea-46f8-a58f-4de62431c2b8\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.488613 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a00e-account-create-update-5d2t5"] Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.496067 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-6kdxq"] Mar 20 15:44:23 crc kubenswrapper[4779]: W0320 15:44:23.502823 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e18c51e_3afa_4453_9a88_ee43ca5e563b.slice/crio-89b8d35cbc58ee3f47a134797a1438a1259f5ae16e5391c6f49376605b3a04f6 WatchSource:0}: Error finding container 89b8d35cbc58ee3f47a134797a1438a1259f5ae16e5391c6f49376605b3a04f6: Status 404 returned error can't find the container with id 89b8d35cbc58ee3f47a134797a1438a1259f5ae16e5391c6f49376605b3a04f6 Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.504147 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-a5fd-account-create-update-9c6l6"] Mar 20 15:44:23 crc kubenswrapper[4779]: W0320 15:44:23.504278 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf226989b_bb85_41dc_8e16_631a0740361a.slice/crio-e030ab3d99dc9a02f56adc9877e453bb641d52b899788f70d064154c909bb3ee WatchSource:0}: Error finding 
container e030ab3d99dc9a02f56adc9877e453bb641d52b899788f70d064154c909bb3ee: Status 404 returned error can't find the container with id e030ab3d99dc9a02f56adc9877e453bb641d52b899788f70d064154c909bb3ee Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.596786 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-p6dg6"] Mar 20 15:44:23 crc kubenswrapper[4779]: W0320 15:44:23.600681 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4db09a8d_78e6_403a_842d_4c138a9699be.slice/crio-c986913e873ee93a95e50f4d7f68a1048f5bfea381d93c227d0dfab5c29dbaca WatchSource:0}: Error finding container c986913e873ee93a95e50f4d7f68a1048f5bfea381d93c227d0dfab5c29dbaca: Status 404 returned error can't find the container with id c986913e873ee93a95e50f4d7f68a1048f5bfea381d93c227d0dfab5c29dbaca Mar 20 15:44:23 crc kubenswrapper[4779]: W0320 15:44:23.602709 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a288511_2254_442c_a931_bd59dd0a3b29.slice/crio-b52c5677b79ff52b29b795203f2ed8d8403d4ed69dbc3b7942dd2ecf26aa6d13 WatchSource:0}: Error finding container b52c5677b79ff52b29b795203f2ed8d8403d4ed69dbc3b7942dd2ecf26aa6d13: Status 404 returned error can't find the container with id b52c5677b79ff52b29b795203f2ed8d8403d4ed69dbc3b7942dd2ecf26aa6d13 Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.607833 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-vkj28"] Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.663960 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e93aeb14-abaa-4cff-81c0-82d579020dc6","Type":"ContainerDied","Data":"999146a747f98d6e3aa4905c0625210b77738d902368284ef54fc90d1a0c2a6a"} Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.664013 4779 scope.go:117] 
"RemoveContainer" containerID="93abb91ff7ca83cfeb154eb992beae2f93a4290f56ecfa0f32320c1a6307a4c7" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.664185 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.673241 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a5fd-account-create-update-9c6l6" event={"ID":"f226989b-bb85-41dc-8e16-631a0740361a","Type":"ContainerStarted","Data":"e030ab3d99dc9a02f56adc9877e453bb641d52b899788f70d064154c909bb3ee"} Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.675962 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a00e-account-create-update-5d2t5" event={"ID":"6e18c51e-3afa-4453-9a88-ee43ca5e563b","Type":"ContainerStarted","Data":"89b8d35cbc58ee3f47a134797a1438a1259f5ae16e5391c6f49376605b3a04f6"} Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.677172 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-p6dg6" event={"ID":"4db09a8d-78e6-403a-842d-4c138a9699be","Type":"ContainerStarted","Data":"c986913e873ee93a95e50f4d7f68a1048f5bfea381d93c227d0dfab5c29dbaca"} Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.680739 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6kdxq" event={"ID":"3e10831a-7f1b-4066-bcf0-0c08ad7b16c0","Type":"ContainerStarted","Data":"5eba178f9255c99560f7242663dc4602706ab83192db6fec40b9fe3ee17b8686"} Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.688271 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-cxhjs" event={"ID":"efac30ce-be74-429c-a7dc-25c46cfbc88e","Type":"ContainerStarted","Data":"a86073d489d2cc92b4899bdc62d5e012f582ac9a3c5bd9bec6e9f396d3100381"} Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.689223 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/dnsmasq-dns-77585f5f8c-cxhjs" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.689676 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-s9wt4-config-rm9vn"] Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.693234 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vkj28" event={"ID":"9a288511-2254-442c-a931-bd59dd0a3b29","Type":"ContainerStarted","Data":"b52c5677b79ff52b29b795203f2ed8d8403d4ed69dbc3b7942dd2ecf26aa6d13"} Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.697613 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-s9wt4-config-rm9vn"] Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.723173 4779 scope.go:117] "RemoveContainer" containerID="882d26ce6c8eda7e63bf062c62f771e9ccd1f4cd79ab91d0649e025e119198b5" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.724483 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77585f5f8c-cxhjs" podStartSLOduration=11.724472734 podStartE2EDuration="11.724472734s" podCreationTimestamp="2026-03-20 15:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:44:23.714888536 +0000 UTC m=+1280.677404326" watchObservedRunningTime="2026-03-20 15:44:23.724472734 +0000 UTC m=+1280.686988534" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.761675 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-4qv4b"] Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.785385 4779 scope.go:117] "RemoveContainer" containerID="856c3e84fd751f08c78702db57717d11dc490e7ce8e32e54d378540c7eefb0b7" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.801297 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 
15:44:23.848806 4779 scope.go:117] "RemoveContainer" containerID="8e4730bb63686853f3e300c75d90cb2b290843398e8244bebc62b19ff8802e60" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.865695 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="302d42e3-cf16-441d-bf35-567f0bb958b9" path="/var/lib/kubelet/pods/302d42e3-cf16-441d-bf35-567f0bb958b9/volumes" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.866513 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.866551 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9a19-account-create-update-w5t6w"] Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.866566 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-pk9nl"] Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.866579 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 20 15:44:23 crc kubenswrapper[4779]: E0320 15:44:23.866892 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e93aeb14-abaa-4cff-81c0-82d579020dc6" containerName="prometheus" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.866905 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93aeb14-abaa-4cff-81c0-82d579020dc6" containerName="prometheus" Mar 20 15:44:23 crc kubenswrapper[4779]: E0320 15:44:23.866917 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e93aeb14-abaa-4cff-81c0-82d579020dc6" containerName="config-reloader" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.866925 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93aeb14-abaa-4cff-81c0-82d579020dc6" containerName="config-reloader" Mar 20 15:44:23 crc kubenswrapper[4779]: E0320 15:44:23.866943 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="302d42e3-cf16-441d-bf35-567f0bb958b9" 
containerName="ovn-config" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.866950 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="302d42e3-cf16-441d-bf35-567f0bb958b9" containerName="ovn-config" Mar 20 15:44:23 crc kubenswrapper[4779]: E0320 15:44:23.866963 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e93aeb14-abaa-4cff-81c0-82d579020dc6" containerName="thanos-sidecar" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.866970 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93aeb14-abaa-4cff-81c0-82d579020dc6" containerName="thanos-sidecar" Mar 20 15:44:23 crc kubenswrapper[4779]: E0320 15:44:23.866988 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e93aeb14-abaa-4cff-81c0-82d579020dc6" containerName="init-config-reloader" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.866996 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93aeb14-abaa-4cff-81c0-82d579020dc6" containerName="init-config-reloader" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.867203 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="e93aeb14-abaa-4cff-81c0-82d579020dc6" containerName="prometheus" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.867218 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="e93aeb14-abaa-4cff-81c0-82d579020dc6" containerName="config-reloader" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.867225 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="302d42e3-cf16-441d-bf35-567f0bb958b9" containerName="ovn-config" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.867236 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="e93aeb14-abaa-4cff-81c0-82d579020dc6" containerName="thanos-sidecar" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.869198 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.872101 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.872372 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.872386 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.872641 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.878518 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.878529 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.878671 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.879869 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.880026 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-cfgwn" Mar 20 15:44:23 crc kubenswrapper[4779]: I0320 15:44:23.885862 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.017297 4779 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/301642e3-d84f-41af-8795-8042fdccdade-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.017364 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/301642e3-d84f-41af-8795-8042fdccdade-config\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.017400 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/301642e3-d84f-41af-8795-8042fdccdade-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.017484 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/301642e3-d84f-41af-8795-8042fdccdade-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.017522 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: 
\"kubernetes.io/configmap/301642e3-d84f-41af-8795-8042fdccdade-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.017546 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/301642e3-d84f-41af-8795-8042fdccdade-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.017589 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/301642e3-d84f-41af-8795-8042fdccdade-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.017625 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/301642e3-d84f-41af-8795-8042fdccdade-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.017666 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/301642e3-d84f-41af-8795-8042fdccdade-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc 
kubenswrapper[4779]: I0320 15:44:24.017702 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7954f494-7bea-46f8-a58f-4de62431c2b8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7954f494-7bea-46f8-a58f-4de62431c2b8\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.017738 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/301642e3-d84f-41af-8795-8042fdccdade-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.017762 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/301642e3-d84f-41af-8795-8042fdccdade-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.017786 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmg7r\" (UniqueName: \"kubernetes.io/projected/301642e3-d84f-41af-8795-8042fdccdade-kube-api-access-pmg7r\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.119313 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/301642e3-d84f-41af-8795-8042fdccdade-config\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " 
pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.119364 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/301642e3-d84f-41af-8795-8042fdccdade-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.119418 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/301642e3-d84f-41af-8795-8042fdccdade-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.119445 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/301642e3-d84f-41af-8795-8042fdccdade-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.119481 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/301642e3-d84f-41af-8795-8042fdccdade-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.119524 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/301642e3-d84f-41af-8795-8042fdccdade-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.119558 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/301642e3-d84f-41af-8795-8042fdccdade-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.119589 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/301642e3-d84f-41af-8795-8042fdccdade-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.119613 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7954f494-7bea-46f8-a58f-4de62431c2b8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7954f494-7bea-46f8-a58f-4de62431c2b8\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.119645 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/301642e3-d84f-41af-8795-8042fdccdade-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: 
I0320 15:44:24.119665 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/301642e3-d84f-41af-8795-8042fdccdade-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.119683 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmg7r\" (UniqueName: \"kubernetes.io/projected/301642e3-d84f-41af-8795-8042fdccdade-kube-api-access-pmg7r\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.119747 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/301642e3-d84f-41af-8795-8042fdccdade-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.121842 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/301642e3-d84f-41af-8795-8042fdccdade-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.123354 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/301642e3-d84f-41af-8795-8042fdccdade-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 
15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.124029 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/301642e3-d84f-41af-8795-8042fdccdade-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.126015 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/301642e3-d84f-41af-8795-8042fdccdade-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.126256 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/301642e3-d84f-41af-8795-8042fdccdade-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.126750 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/301642e3-d84f-41af-8795-8042fdccdade-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.126761 4779 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.126798 4779 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7954f494-7bea-46f8-a58f-4de62431c2b8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7954f494-7bea-46f8-a58f-4de62431c2b8\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/63343ced4decb75fd431545090989bf5e441d5fa06a6828c5f39e243f0a750bf/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.127636 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/301642e3-d84f-41af-8795-8042fdccdade-config\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.128603 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/301642e3-d84f-41af-8795-8042fdccdade-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.133154 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/301642e3-d84f-41af-8795-8042fdccdade-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.134263 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/301642e3-d84f-41af-8795-8042fdccdade-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.138857 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/301642e3-d84f-41af-8795-8042fdccdade-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.143811 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmg7r\" (UniqueName: \"kubernetes.io/projected/301642e3-d84f-41af-8795-8042fdccdade-kube-api-access-pmg7r\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.172193 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7954f494-7bea-46f8-a58f-4de62431c2b8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7954f494-7bea-46f8-a58f-4de62431c2b8\") pod \"prometheus-metric-storage-0\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.197605 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.703000 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.706797 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-4qv4b" event={"ID":"8d89bfb9-3cdc-4e93-899a-361f5d5cf408","Type":"ContainerStarted","Data":"8ce7ce2b52de27f567f1db2d0a5ed4eb4bd230d2691261adb46f17dad603abfb"} Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.708572 4779 generic.go:334] "Generic (PLEG): container finished" podID="6e18c51e-3afa-4453-9a88-ee43ca5e563b" containerID="241be15361d72400eb521684e1b9f80a0ae6daa87c1b1052b4f55e30968532e6" exitCode=0 Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.708645 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a00e-account-create-update-5d2t5" event={"ID":"6e18c51e-3afa-4453-9a88-ee43ca5e563b","Type":"ContainerDied","Data":"241be15361d72400eb521684e1b9f80a0ae6daa87c1b1052b4f55e30968532e6"} Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.710263 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qnd9s" event={"ID":"1435ea5c-3bed-456c-8fe3-931542c325a1","Type":"ContainerStarted","Data":"d8bd85d7a12fc6efa263fdfde508100ec19852d584988eb60e2807bbfd9dc50d"} Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.713351 4779 generic.go:334] "Generic (PLEG): container finished" podID="4db09a8d-78e6-403a-842d-4c138a9699be" containerID="752adf4cc33c5cf8f247ab3a7b196dbcf69f41be57776bf574f6b2db924d31fe" exitCode=0 Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.713416 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-p6dg6" event={"ID":"4db09a8d-78e6-403a-842d-4c138a9699be","Type":"ContainerDied","Data":"752adf4cc33c5cf8f247ab3a7b196dbcf69f41be57776bf574f6b2db924d31fe"} Mar 20 
15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.715009 4779 generic.go:334] "Generic (PLEG): container finished" podID="79528236-508c-451f-906a-7ad8b3f638ef" containerID="c4b96bc0c679406f602c994fa757d3f3a131ad1870068a502ee5e00895b647c1" exitCode=0 Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.715054 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9a19-account-create-update-w5t6w" event={"ID":"79528236-508c-451f-906a-7ad8b3f638ef","Type":"ContainerDied","Data":"c4b96bc0c679406f602c994fa757d3f3a131ad1870068a502ee5e00895b647c1"} Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.715074 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9a19-account-create-update-w5t6w" event={"ID":"79528236-508c-451f-906a-7ad8b3f638ef","Type":"ContainerStarted","Data":"32b7fce755fd174903209df325c57343b7c441cd371c06ca703e5ab83da2d20a"} Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.717546 4779 generic.go:334] "Generic (PLEG): container finished" podID="f226989b-bb85-41dc-8e16-631a0740361a" containerID="e6afca2124db1facd084e1f9c742aa482a672c0acc52b5110edf62421d80d853" exitCode=0 Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.717673 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a5fd-account-create-update-9c6l6" event={"ID":"f226989b-bb85-41dc-8e16-631a0740361a","Type":"ContainerDied","Data":"e6afca2124db1facd084e1f9c742aa482a672c0acc52b5110edf62421d80d853"} Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.719650 4779 generic.go:334] "Generic (PLEG): container finished" podID="3e10831a-7f1b-4066-bcf0-0c08ad7b16c0" containerID="bc6b113c92083e1c01f30284fff16690d424220807531272f7e6efdcdd3d445f" exitCode=0 Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.719726 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6kdxq" 
event={"ID":"3e10831a-7f1b-4066-bcf0-0c08ad7b16c0","Type":"ContainerDied","Data":"bc6b113c92083e1c01f30284fff16690d424220807531272f7e6efdcdd3d445f"} Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.725754 4779 generic.go:334] "Generic (PLEG): container finished" podID="7de7ca4b-8d4c-41be-bd83-f1430eefc89f" containerID="b1c5e24756628eae2850f3fcd18bbb4aa44322d860bbcfa746337a7efdaae919" exitCode=0 Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.725851 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-pk9nl" event={"ID":"7de7ca4b-8d4c-41be-bd83-f1430eefc89f","Type":"ContainerDied","Data":"b1c5e24756628eae2850f3fcd18bbb4aa44322d860bbcfa746337a7efdaae919"} Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.726045 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-pk9nl" event={"ID":"7de7ca4b-8d4c-41be-bd83-f1430eefc89f","Type":"ContainerStarted","Data":"0561c67029997f7173e974dbe845a321cf541f999c7ee3499d395e7253ff79a0"} Mar 20 15:44:24 crc kubenswrapper[4779]: I0320 15:44:24.790894 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-qnd9s" podStartSLOduration=3.739644405 podStartE2EDuration="18.790874625s" podCreationTimestamp="2026-03-20 15:44:06 +0000 UTC" firstStartedPulling="2026-03-20 15:44:07.837236346 +0000 UTC m=+1264.799752146" lastFinishedPulling="2026-03-20 15:44:22.888466566 +0000 UTC m=+1279.850982366" observedRunningTime="2026-03-20 15:44:24.788561148 +0000 UTC m=+1281.751076938" watchObservedRunningTime="2026-03-20 15:44:24.790874625 +0000 UTC m=+1281.753390425" Mar 20 15:44:25 crc kubenswrapper[4779]: W0320 15:44:25.400977 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod301642e3_d84f_41af_8795_8042fdccdade.slice/crio-8d54f66b79af85a56279f9b1b164a89d76b6c16ae323a30871027a8629bdc3b3 WatchSource:0}: Error finding container 
8d54f66b79af85a56279f9b1b164a89d76b6c16ae323a30871027a8629bdc3b3: Status 404 returned error can't find the container with id 8d54f66b79af85a56279f9b1b164a89d76b6c16ae323a30871027a8629bdc3b3 Mar 20 15:44:25 crc kubenswrapper[4779]: I0320 15:44:25.735547 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"301642e3-d84f-41af-8795-8042fdccdade","Type":"ContainerStarted","Data":"8d54f66b79af85a56279f9b1b164a89d76b6c16ae323a30871027a8629bdc3b3"} Mar 20 15:44:25 crc kubenswrapper[4779]: I0320 15:44:25.853136 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e93aeb14-abaa-4cff-81c0-82d579020dc6" path="/var/lib/kubelet/pods/e93aeb14-abaa-4cff-81c0-82d579020dc6/volumes" Mar 20 15:44:26 crc kubenswrapper[4779]: I0320 15:44:26.221568 4779 scope.go:117] "RemoveContainer" containerID="1ce0cc7d8c7f15a8b28a4cfc94a8459144a84fabf49f45997b3386ddd8e21045" Mar 20 15:44:28 crc kubenswrapper[4779]: I0320 15:44:28.005136 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:44:28 crc kubenswrapper[4779]: I0320 15:44:28.207315 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77585f5f8c-cxhjs" Mar 20 15:44:28 crc kubenswrapper[4779]: I0320 15:44:28.262293 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-hmz4x"] Mar 20 15:44:28 crc kubenswrapper[4779]: I0320 15:44:28.262543 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-hmz4x" podUID="1c14b0cd-bac7-4ca6-b176-1c49747e9132" containerName="dnsmasq-dns" containerID="cri-o://91c11bb54a8057aea0a9ab54904ee35bce3c21141f649fc9f72262c2d26d65b7" gracePeriod=10 Mar 20 15:44:28 crc kubenswrapper[4779]: I0320 15:44:28.764475 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a5fd-account-create-update-9c6l6" 
event={"ID":"f226989b-bb85-41dc-8e16-631a0740361a","Type":"ContainerDied","Data":"e030ab3d99dc9a02f56adc9877e453bb641d52b899788f70d064154c909bb3ee"} Mar 20 15:44:28 crc kubenswrapper[4779]: I0320 15:44:28.764724 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e030ab3d99dc9a02f56adc9877e453bb641d52b899788f70d064154c909bb3ee" Mar 20 15:44:28 crc kubenswrapper[4779]: I0320 15:44:28.765695 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9a19-account-create-update-w5t6w" event={"ID":"79528236-508c-451f-906a-7ad8b3f638ef","Type":"ContainerDied","Data":"32b7fce755fd174903209df325c57343b7c441cd371c06ca703e5ab83da2d20a"} Mar 20 15:44:28 crc kubenswrapper[4779]: I0320 15:44:28.765717 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32b7fce755fd174903209df325c57343b7c441cd371c06ca703e5ab83da2d20a" Mar 20 15:44:28 crc kubenswrapper[4779]: I0320 15:44:28.766859 4779 generic.go:334] "Generic (PLEG): container finished" podID="1c14b0cd-bac7-4ca6-b176-1c49747e9132" containerID="91c11bb54a8057aea0a9ab54904ee35bce3c21141f649fc9f72262c2d26d65b7" exitCode=0 Mar 20 15:44:28 crc kubenswrapper[4779]: I0320 15:44:28.766882 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-hmz4x" event={"ID":"1c14b0cd-bac7-4ca6-b176-1c49747e9132","Type":"ContainerDied","Data":"91c11bb54a8057aea0a9ab54904ee35bce3c21141f649fc9f72262c2d26d65b7"} Mar 20 15:44:28 crc kubenswrapper[4779]: I0320 15:44:28.835769 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a5fd-account-create-update-9c6l6" Mar 20 15:44:28 crc kubenswrapper[4779]: I0320 15:44:28.844044 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-9a19-account-create-update-w5t6w" Mar 20 15:44:28 crc kubenswrapper[4779]: I0320 15:44:28.939057 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79528236-508c-451f-906a-7ad8b3f638ef-operator-scripts\") pod \"79528236-508c-451f-906a-7ad8b3f638ef\" (UID: \"79528236-508c-451f-906a-7ad8b3f638ef\") " Mar 20 15:44:28 crc kubenswrapper[4779]: I0320 15:44:28.939260 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f226989b-bb85-41dc-8e16-631a0740361a-operator-scripts\") pod \"f226989b-bb85-41dc-8e16-631a0740361a\" (UID: \"f226989b-bb85-41dc-8e16-631a0740361a\") " Mar 20 15:44:28 crc kubenswrapper[4779]: I0320 15:44:28.939313 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blpk4\" (UniqueName: \"kubernetes.io/projected/f226989b-bb85-41dc-8e16-631a0740361a-kube-api-access-blpk4\") pod \"f226989b-bb85-41dc-8e16-631a0740361a\" (UID: \"f226989b-bb85-41dc-8e16-631a0740361a\") " Mar 20 15:44:28 crc kubenswrapper[4779]: I0320 15:44:28.939646 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96pgv\" (UniqueName: \"kubernetes.io/projected/79528236-508c-451f-906a-7ad8b3f638ef-kube-api-access-96pgv\") pod \"79528236-508c-451f-906a-7ad8b3f638ef\" (UID: \"79528236-508c-451f-906a-7ad8b3f638ef\") " Mar 20 15:44:28 crc kubenswrapper[4779]: I0320 15:44:28.939818 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79528236-508c-451f-906a-7ad8b3f638ef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "79528236-508c-451f-906a-7ad8b3f638ef" (UID: "79528236-508c-451f-906a-7ad8b3f638ef"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:44:28 crc kubenswrapper[4779]: I0320 15:44:28.939948 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f226989b-bb85-41dc-8e16-631a0740361a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f226989b-bb85-41dc-8e16-631a0740361a" (UID: "f226989b-bb85-41dc-8e16-631a0740361a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:44:28 crc kubenswrapper[4779]: I0320 15:44:28.940141 4779 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f226989b-bb85-41dc-8e16-631a0740361a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:28 crc kubenswrapper[4779]: I0320 15:44:28.940160 4779 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79528236-508c-451f-906a-7ad8b3f638ef-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:28 crc kubenswrapper[4779]: I0320 15:44:28.944582 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f226989b-bb85-41dc-8e16-631a0740361a-kube-api-access-blpk4" (OuterVolumeSpecName: "kube-api-access-blpk4") pod "f226989b-bb85-41dc-8e16-631a0740361a" (UID: "f226989b-bb85-41dc-8e16-631a0740361a"). InnerVolumeSpecName "kube-api-access-blpk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:44:28 crc kubenswrapper[4779]: I0320 15:44:28.969762 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79528236-508c-451f-906a-7ad8b3f638ef-kube-api-access-96pgv" (OuterVolumeSpecName: "kube-api-access-96pgv") pod "79528236-508c-451f-906a-7ad8b3f638ef" (UID: "79528236-508c-451f-906a-7ad8b3f638ef"). InnerVolumeSpecName "kube-api-access-96pgv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:44:29 crc kubenswrapper[4779]: I0320 15:44:29.041853 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blpk4\" (UniqueName: \"kubernetes.io/projected/f226989b-bb85-41dc-8e16-631a0740361a-kube-api-access-blpk4\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:29 crc kubenswrapper[4779]: I0320 15:44:29.042844 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96pgv\" (UniqueName: \"kubernetes.io/projected/79528236-508c-451f-906a-7ad8b3f638ef-kube-api-access-96pgv\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:29 crc kubenswrapper[4779]: I0320 15:44:29.656650 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-hmz4x" podUID="1c14b0cd-bac7-4ca6-b176-1c49747e9132" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.122:5353: connect: connection refused" Mar 20 15:44:29 crc kubenswrapper[4779]: I0320 15:44:29.774728 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a5fd-account-create-update-9c6l6" Mar 20 15:44:29 crc kubenswrapper[4779]: I0320 15:44:29.774764 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9a19-account-create-update-w5t6w" Mar 20 15:44:31 crc kubenswrapper[4779]: I0320 15:44:31.791332 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-pk9nl" event={"ID":"7de7ca4b-8d4c-41be-bd83-f1430eefc89f","Type":"ContainerDied","Data":"0561c67029997f7173e974dbe845a321cf541f999c7ee3499d395e7253ff79a0"} Mar 20 15:44:31 crc kubenswrapper[4779]: I0320 15:44:31.791692 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0561c67029997f7173e974dbe845a321cf541f999c7ee3499d395e7253ff79a0" Mar 20 15:44:31 crc kubenswrapper[4779]: I0320 15:44:31.815923 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-pk9nl" Mar 20 15:44:31 crc kubenswrapper[4779]: I0320 15:44:31.887037 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7de7ca4b-8d4c-41be-bd83-f1430eefc89f-operator-scripts\") pod \"7de7ca4b-8d4c-41be-bd83-f1430eefc89f\" (UID: \"7de7ca4b-8d4c-41be-bd83-f1430eefc89f\") " Mar 20 15:44:31 crc kubenswrapper[4779]: I0320 15:44:31.887130 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6t4mh\" (UniqueName: \"kubernetes.io/projected/7de7ca4b-8d4c-41be-bd83-f1430eefc89f-kube-api-access-6t4mh\") pod \"7de7ca4b-8d4c-41be-bd83-f1430eefc89f\" (UID: \"7de7ca4b-8d4c-41be-bd83-f1430eefc89f\") " Mar 20 15:44:31 crc kubenswrapper[4779]: I0320 15:44:31.889374 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7de7ca4b-8d4c-41be-bd83-f1430eefc89f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7de7ca4b-8d4c-41be-bd83-f1430eefc89f" (UID: "7de7ca4b-8d4c-41be-bd83-f1430eefc89f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:44:31 crc kubenswrapper[4779]: I0320 15:44:31.894344 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7de7ca4b-8d4c-41be-bd83-f1430eefc89f-kube-api-access-6t4mh" (OuterVolumeSpecName: "kube-api-access-6t4mh") pod "7de7ca4b-8d4c-41be-bd83-f1430eefc89f" (UID: "7de7ca4b-8d4c-41be-bd83-f1430eefc89f"). InnerVolumeSpecName "kube-api-access-6t4mh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:44:31 crc kubenswrapper[4779]: I0320 15:44:31.990098 4779 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7de7ca4b-8d4c-41be-bd83-f1430eefc89f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:31 crc kubenswrapper[4779]: I0320 15:44:31.990148 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6t4mh\" (UniqueName: \"kubernetes.io/projected/7de7ca4b-8d4c-41be-bd83-f1430eefc89f-kube-api-access-6t4mh\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:32 crc kubenswrapper[4779]: I0320 15:44:32.798587 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-pk9nl" Mar 20 15:44:34 crc kubenswrapper[4779]: I0320 15:44:34.655696 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-hmz4x" podUID="1c14b0cd-bac7-4ca6-b176-1c49747e9132" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.122:5353: connect: connection refused" Mar 20 15:44:34 crc kubenswrapper[4779]: I0320 15:44:34.817013 4779 generic.go:334] "Generic (PLEG): container finished" podID="1435ea5c-3bed-456c-8fe3-931542c325a1" containerID="d8bd85d7a12fc6efa263fdfde508100ec19852d584988eb60e2807bbfd9dc50d" exitCode=0 Mar 20 15:44:34 crc kubenswrapper[4779]: I0320 15:44:34.817056 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qnd9s" event={"ID":"1435ea5c-3bed-456c-8fe3-931542c325a1","Type":"ContainerDied","Data":"d8bd85d7a12fc6efa263fdfde508100ec19852d584988eb60e2807bbfd9dc50d"} Mar 20 15:44:38 crc kubenswrapper[4779]: I0320 15:44:38.242181 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-p6dg6" Mar 20 15:44:38 crc kubenswrapper[4779]: I0320 15:44:38.248945 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-a00e-account-create-update-5d2t5" Mar 20 15:44:38 crc kubenswrapper[4779]: I0320 15:44:38.307088 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6kdxq" Mar 20 15:44:38 crc kubenswrapper[4779]: I0320 15:44:38.407983 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfvlm\" (UniqueName: \"kubernetes.io/projected/3e10831a-7f1b-4066-bcf0-0c08ad7b16c0-kube-api-access-mfvlm\") pod \"3e10831a-7f1b-4066-bcf0-0c08ad7b16c0\" (UID: \"3e10831a-7f1b-4066-bcf0-0c08ad7b16c0\") " Mar 20 15:44:38 crc kubenswrapper[4779]: I0320 15:44:38.408435 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thzw5\" (UniqueName: \"kubernetes.io/projected/4db09a8d-78e6-403a-842d-4c138a9699be-kube-api-access-thzw5\") pod \"4db09a8d-78e6-403a-842d-4c138a9699be\" (UID: \"4db09a8d-78e6-403a-842d-4c138a9699be\") " Mar 20 15:44:38 crc kubenswrapper[4779]: I0320 15:44:38.408570 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4db09a8d-78e6-403a-842d-4c138a9699be-operator-scripts\") pod \"4db09a8d-78e6-403a-842d-4c138a9699be\" (UID: \"4db09a8d-78e6-403a-842d-4c138a9699be\") " Mar 20 15:44:38 crc kubenswrapper[4779]: I0320 15:44:38.408674 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzkp6\" (UniqueName: \"kubernetes.io/projected/6e18c51e-3afa-4453-9a88-ee43ca5e563b-kube-api-access-tzkp6\") pod \"6e18c51e-3afa-4453-9a88-ee43ca5e563b\" (UID: \"6e18c51e-3afa-4453-9a88-ee43ca5e563b\") " Mar 20 15:44:38 crc kubenswrapper[4779]: I0320 15:44:38.409094 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4db09a8d-78e6-403a-842d-4c138a9699be-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"4db09a8d-78e6-403a-842d-4c138a9699be" (UID: "4db09a8d-78e6-403a-842d-4c138a9699be"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:44:38 crc kubenswrapper[4779]: I0320 15:44:38.409210 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e18c51e-3afa-4453-9a88-ee43ca5e563b-operator-scripts\") pod \"6e18c51e-3afa-4453-9a88-ee43ca5e563b\" (UID: \"6e18c51e-3afa-4453-9a88-ee43ca5e563b\") " Mar 20 15:44:38 crc kubenswrapper[4779]: I0320 15:44:38.409259 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e10831a-7f1b-4066-bcf0-0c08ad7b16c0-operator-scripts\") pod \"3e10831a-7f1b-4066-bcf0-0c08ad7b16c0\" (UID: \"3e10831a-7f1b-4066-bcf0-0c08ad7b16c0\") " Mar 20 15:44:38 crc kubenswrapper[4779]: I0320 15:44:38.409749 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e18c51e-3afa-4453-9a88-ee43ca5e563b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6e18c51e-3afa-4453-9a88-ee43ca5e563b" (UID: "6e18c51e-3afa-4453-9a88-ee43ca5e563b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:44:38 crc kubenswrapper[4779]: I0320 15:44:38.410160 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e10831a-7f1b-4066-bcf0-0c08ad7b16c0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3e10831a-7f1b-4066-bcf0-0c08ad7b16c0" (UID: "3e10831a-7f1b-4066-bcf0-0c08ad7b16c0"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:44:38 crc kubenswrapper[4779]: I0320 15:44:38.410165 4779 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4db09a8d-78e6-403a-842d-4c138a9699be-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:38 crc kubenswrapper[4779]: I0320 15:44:38.415581 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4db09a8d-78e6-403a-842d-4c138a9699be-kube-api-access-thzw5" (OuterVolumeSpecName: "kube-api-access-thzw5") pod "4db09a8d-78e6-403a-842d-4c138a9699be" (UID: "4db09a8d-78e6-403a-842d-4c138a9699be"). InnerVolumeSpecName "kube-api-access-thzw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:44:38 crc kubenswrapper[4779]: I0320 15:44:38.421228 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e18c51e-3afa-4453-9a88-ee43ca5e563b-kube-api-access-tzkp6" (OuterVolumeSpecName: "kube-api-access-tzkp6") pod "6e18c51e-3afa-4453-9a88-ee43ca5e563b" (UID: "6e18c51e-3afa-4453-9a88-ee43ca5e563b"). InnerVolumeSpecName "kube-api-access-tzkp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:44:38 crc kubenswrapper[4779]: I0320 15:44:38.421347 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e10831a-7f1b-4066-bcf0-0c08ad7b16c0-kube-api-access-mfvlm" (OuterVolumeSpecName: "kube-api-access-mfvlm") pod "3e10831a-7f1b-4066-bcf0-0c08ad7b16c0" (UID: "3e10831a-7f1b-4066-bcf0-0c08ad7b16c0"). InnerVolumeSpecName "kube-api-access-mfvlm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:44:38 crc kubenswrapper[4779]: I0320 15:44:38.512068 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfvlm\" (UniqueName: \"kubernetes.io/projected/3e10831a-7f1b-4066-bcf0-0c08ad7b16c0-kube-api-access-mfvlm\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:38 crc kubenswrapper[4779]: I0320 15:44:38.512099 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thzw5\" (UniqueName: \"kubernetes.io/projected/4db09a8d-78e6-403a-842d-4c138a9699be-kube-api-access-thzw5\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:38 crc kubenswrapper[4779]: I0320 15:44:38.512120 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzkp6\" (UniqueName: \"kubernetes.io/projected/6e18c51e-3afa-4453-9a88-ee43ca5e563b-kube-api-access-tzkp6\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:38 crc kubenswrapper[4779]: I0320 15:44:38.512129 4779 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e18c51e-3afa-4453-9a88-ee43ca5e563b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:38 crc kubenswrapper[4779]: I0320 15:44:38.512138 4779 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e10831a-7f1b-4066-bcf0-0c08ad7b16c0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:38 crc kubenswrapper[4779]: E0320 15:44:38.809975 4779 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.222:5001/podified-epoxy-centos9/openstack-watcher-api:watcher_latest" Mar 20 15:44:38 crc kubenswrapper[4779]: E0320 15:44:38.810027 4779 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.222:5001/podified-epoxy-centos9/openstack-watcher-api:watcher_latest" 
Mar 20 15:44:38 crc kubenswrapper[4779]: E0320 15:44:38.810172 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:watcher-db-sync,Image:38.102.83.222:5001/podified-epoxy-centos9/openstack-watcher-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/watcher/watcher.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:watcher-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vj8c6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil
,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-db-sync-4qv4b_openstack(8d89bfb9-3cdc-4e93-899a-361f5d5cf408): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 15:44:38 crc kubenswrapper[4779]: E0320 15:44:38.811679 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/watcher-db-sync-4qv4b" podUID="8d89bfb9-3cdc-4e93-899a-361f5d5cf408" Mar 20 15:44:38 crc kubenswrapper[4779]: I0320 15:44:38.851463 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-p6dg6" Mar 20 15:44:38 crc kubenswrapper[4779]: I0320 15:44:38.851473 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-p6dg6" event={"ID":"4db09a8d-78e6-403a-842d-4c138a9699be","Type":"ContainerDied","Data":"c986913e873ee93a95e50f4d7f68a1048f5bfea381d93c227d0dfab5c29dbaca"} Mar 20 15:44:38 crc kubenswrapper[4779]: I0320 15:44:38.851670 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c986913e873ee93a95e50f4d7f68a1048f5bfea381d93c227d0dfab5c29dbaca" Mar 20 15:44:38 crc kubenswrapper[4779]: I0320 15:44:38.853057 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6kdxq" event={"ID":"3e10831a-7f1b-4066-bcf0-0c08ad7b16c0","Type":"ContainerDied","Data":"5eba178f9255c99560f7242663dc4602706ab83192db6fec40b9fe3ee17b8686"} Mar 20 15:44:38 crc kubenswrapper[4779]: I0320 15:44:38.853083 4779 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="5eba178f9255c99560f7242663dc4602706ab83192db6fec40b9fe3ee17b8686" Mar 20 15:44:38 crc kubenswrapper[4779]: I0320 15:44:38.853142 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6kdxq" Mar 20 15:44:38 crc kubenswrapper[4779]: I0320 15:44:38.857540 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-hmz4x" event={"ID":"1c14b0cd-bac7-4ca6-b176-1c49747e9132","Type":"ContainerDied","Data":"78b6da3d5d846d4524106f88568cd004ccfe9ca8434b3c6e0510699132664a43"} Mar 20 15:44:38 crc kubenswrapper[4779]: I0320 15:44:38.857584 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78b6da3d5d846d4524106f88568cd004ccfe9ca8434b3c6e0510699132664a43" Mar 20 15:44:38 crc kubenswrapper[4779]: I0320 15:44:38.860914 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a00e-account-create-update-5d2t5" Mar 20 15:44:38 crc kubenswrapper[4779]: I0320 15:44:38.860959 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a00e-account-create-update-5d2t5" event={"ID":"6e18c51e-3afa-4453-9a88-ee43ca5e563b","Type":"ContainerDied","Data":"89b8d35cbc58ee3f47a134797a1438a1259f5ae16e5391c6f49376605b3a04f6"} Mar 20 15:44:38 crc kubenswrapper[4779]: I0320 15:44:38.861144 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89b8d35cbc58ee3f47a134797a1438a1259f5ae16e5391c6f49376605b3a04f6" Mar 20 15:44:38 crc kubenswrapper[4779]: I0320 15:44:38.866055 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qnd9s" event={"ID":"1435ea5c-3bed-456c-8fe3-931542c325a1","Type":"ContainerDied","Data":"9b51e294b1d3a1decdf5a639325dd2d9b370c716ec4b7857220a6e473b25960c"} Mar 20 15:44:38 crc kubenswrapper[4779]: I0320 15:44:38.866082 4779 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="9b51e294b1d3a1decdf5a639325dd2d9b370c716ec4b7857220a6e473b25960c" Mar 20 15:44:38 crc kubenswrapper[4779]: I0320 15:44:38.867351 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-qnd9s" Mar 20 15:44:38 crc kubenswrapper[4779]: E0320 15:44:38.877082 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.222:5001/podified-epoxy-centos9/openstack-watcher-api:watcher_latest\\\"\"" pod="openstack/watcher-db-sync-4qv4b" podUID="8d89bfb9-3cdc-4e93-899a-361f5d5cf408" Mar 20 15:44:38 crc kubenswrapper[4779]: I0320 15:44:38.903845 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-hmz4x" Mar 20 15:44:39 crc kubenswrapper[4779]: I0320 15:44:39.022157 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c14b0cd-bac7-4ca6-b176-1c49747e9132-dns-svc\") pod \"1c14b0cd-bac7-4ca6-b176-1c49747e9132\" (UID: \"1c14b0cd-bac7-4ca6-b176-1c49747e9132\") " Mar 20 15:44:39 crc kubenswrapper[4779]: I0320 15:44:39.022470 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c14b0cd-bac7-4ca6-b176-1c49747e9132-ovsdbserver-sb\") pod \"1c14b0cd-bac7-4ca6-b176-1c49747e9132\" (UID: \"1c14b0cd-bac7-4ca6-b176-1c49747e9132\") " Mar 20 15:44:39 crc kubenswrapper[4779]: I0320 15:44:39.022637 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1435ea5c-3bed-456c-8fe3-931542c325a1-combined-ca-bundle\") pod \"1435ea5c-3bed-456c-8fe3-931542c325a1\" (UID: \"1435ea5c-3bed-456c-8fe3-931542c325a1\") " Mar 20 15:44:39 crc kubenswrapper[4779]: I0320 15:44:39.022734 4779 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1435ea5c-3bed-456c-8fe3-931542c325a1-config-data\") pod \"1435ea5c-3bed-456c-8fe3-931542c325a1\" (UID: \"1435ea5c-3bed-456c-8fe3-931542c325a1\") " Mar 20 15:44:39 crc kubenswrapper[4779]: I0320 15:44:39.022793 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c14b0cd-bac7-4ca6-b176-1c49747e9132-config\") pod \"1c14b0cd-bac7-4ca6-b176-1c49747e9132\" (UID: \"1c14b0cd-bac7-4ca6-b176-1c49747e9132\") " Mar 20 15:44:39 crc kubenswrapper[4779]: I0320 15:44:39.022849 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvzwq\" (UniqueName: \"kubernetes.io/projected/1435ea5c-3bed-456c-8fe3-931542c325a1-kube-api-access-nvzwq\") pod \"1435ea5c-3bed-456c-8fe3-931542c325a1\" (UID: \"1435ea5c-3bed-456c-8fe3-931542c325a1\") " Mar 20 15:44:39 crc kubenswrapper[4779]: I0320 15:44:39.022889 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c14b0cd-bac7-4ca6-b176-1c49747e9132-ovsdbserver-nb\") pod \"1c14b0cd-bac7-4ca6-b176-1c49747e9132\" (UID: \"1c14b0cd-bac7-4ca6-b176-1c49747e9132\") " Mar 20 15:44:39 crc kubenswrapper[4779]: I0320 15:44:39.022961 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1435ea5c-3bed-456c-8fe3-931542c325a1-db-sync-config-data\") pod \"1435ea5c-3bed-456c-8fe3-931542c325a1\" (UID: \"1435ea5c-3bed-456c-8fe3-931542c325a1\") " Mar 20 15:44:39 crc kubenswrapper[4779]: I0320 15:44:39.022988 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99j4g\" (UniqueName: \"kubernetes.io/projected/1c14b0cd-bac7-4ca6-b176-1c49747e9132-kube-api-access-99j4g\") pod \"1c14b0cd-bac7-4ca6-b176-1c49747e9132\" (UID: 
\"1c14b0cd-bac7-4ca6-b176-1c49747e9132\") " Mar 20 15:44:39 crc kubenswrapper[4779]: I0320 15:44:39.026759 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1435ea5c-3bed-456c-8fe3-931542c325a1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "1435ea5c-3bed-456c-8fe3-931542c325a1" (UID: "1435ea5c-3bed-456c-8fe3-931542c325a1"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:44:39 crc kubenswrapper[4779]: I0320 15:44:39.027349 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c14b0cd-bac7-4ca6-b176-1c49747e9132-kube-api-access-99j4g" (OuterVolumeSpecName: "kube-api-access-99j4g") pod "1c14b0cd-bac7-4ca6-b176-1c49747e9132" (UID: "1c14b0cd-bac7-4ca6-b176-1c49747e9132"). InnerVolumeSpecName "kube-api-access-99j4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:44:39 crc kubenswrapper[4779]: I0320 15:44:39.030748 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1435ea5c-3bed-456c-8fe3-931542c325a1-kube-api-access-nvzwq" (OuterVolumeSpecName: "kube-api-access-nvzwq") pod "1435ea5c-3bed-456c-8fe3-931542c325a1" (UID: "1435ea5c-3bed-456c-8fe3-931542c325a1"). InnerVolumeSpecName "kube-api-access-nvzwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:44:39 crc kubenswrapper[4779]: I0320 15:44:39.051663 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1435ea5c-3bed-456c-8fe3-931542c325a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1435ea5c-3bed-456c-8fe3-931542c325a1" (UID: "1435ea5c-3bed-456c-8fe3-931542c325a1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:44:39 crc kubenswrapper[4779]: I0320 15:44:39.063533 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c14b0cd-bac7-4ca6-b176-1c49747e9132-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1c14b0cd-bac7-4ca6-b176-1c49747e9132" (UID: "1c14b0cd-bac7-4ca6-b176-1c49747e9132"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:44:39 crc kubenswrapper[4779]: I0320 15:44:39.068074 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c14b0cd-bac7-4ca6-b176-1c49747e9132-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1c14b0cd-bac7-4ca6-b176-1c49747e9132" (UID: "1c14b0cd-bac7-4ca6-b176-1c49747e9132"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:44:39 crc kubenswrapper[4779]: I0320 15:44:39.071218 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c14b0cd-bac7-4ca6-b176-1c49747e9132-config" (OuterVolumeSpecName: "config") pod "1c14b0cd-bac7-4ca6-b176-1c49747e9132" (UID: "1c14b0cd-bac7-4ca6-b176-1c49747e9132"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:44:39 crc kubenswrapper[4779]: I0320 15:44:39.073175 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c14b0cd-bac7-4ca6-b176-1c49747e9132-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1c14b0cd-bac7-4ca6-b176-1c49747e9132" (UID: "1c14b0cd-bac7-4ca6-b176-1c49747e9132"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:44:39 crc kubenswrapper[4779]: I0320 15:44:39.074860 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1435ea5c-3bed-456c-8fe3-931542c325a1-config-data" (OuterVolumeSpecName: "config-data") pod "1435ea5c-3bed-456c-8fe3-931542c325a1" (UID: "1435ea5c-3bed-456c-8fe3-931542c325a1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:44:39 crc kubenswrapper[4779]: I0320 15:44:39.125310 4779 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1435ea5c-3bed-456c-8fe3-931542c325a1-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:39 crc kubenswrapper[4779]: I0320 15:44:39.125343 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99j4g\" (UniqueName: \"kubernetes.io/projected/1c14b0cd-bac7-4ca6-b176-1c49747e9132-kube-api-access-99j4g\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:39 crc kubenswrapper[4779]: I0320 15:44:39.125356 4779 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c14b0cd-bac7-4ca6-b176-1c49747e9132-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:39 crc kubenswrapper[4779]: I0320 15:44:39.125365 4779 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c14b0cd-bac7-4ca6-b176-1c49747e9132-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:39 crc kubenswrapper[4779]: I0320 15:44:39.125373 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1435ea5c-3bed-456c-8fe3-931542c325a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:39 crc kubenswrapper[4779]: I0320 15:44:39.125380 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1435ea5c-3bed-456c-8fe3-931542c325a1-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:39 crc kubenswrapper[4779]: I0320 15:44:39.125389 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c14b0cd-bac7-4ca6-b176-1c49747e9132-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:39 crc kubenswrapper[4779]: I0320 15:44:39.125404 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvzwq\" (UniqueName: \"kubernetes.io/projected/1435ea5c-3bed-456c-8fe3-931542c325a1-kube-api-access-nvzwq\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:39 crc kubenswrapper[4779]: I0320 15:44:39.125415 4779 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c14b0cd-bac7-4ca6-b176-1c49747e9132-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:39 crc kubenswrapper[4779]: I0320 15:44:39.874895 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vkj28" event={"ID":"9a288511-2254-442c-a931-bd59dd0a3b29","Type":"ContainerStarted","Data":"1d66475bf2a6f89bf954aca5541d1b52ab4aebd06b56aea9ec4f52acf4b5260e"} Mar 20 15:44:39 crc kubenswrapper[4779]: I0320 15:44:39.874921 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-hmz4x" Mar 20 15:44:39 crc kubenswrapper[4779]: I0320 15:44:39.875634 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-qnd9s" Mar 20 15:44:39 crc kubenswrapper[4779]: I0320 15:44:39.921797 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-vkj28" podStartSLOduration=5.709717619 podStartE2EDuration="20.921755232s" podCreationTimestamp="2026-03-20 15:44:19 +0000 UTC" firstStartedPulling="2026-03-20 15:44:23.607535658 +0000 UTC m=+1280.570051458" lastFinishedPulling="2026-03-20 15:44:38.819573271 +0000 UTC m=+1295.782089071" observedRunningTime="2026-03-20 15:44:39.893408678 +0000 UTC m=+1296.855924518" watchObservedRunningTime="2026-03-20 15:44:39.921755232 +0000 UTC m=+1296.884271052" Mar 20 15:44:39 crc kubenswrapper[4779]: I0320 15:44:39.933316 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-hmz4x"] Mar 20 15:44:39 crc kubenswrapper[4779]: I0320 15:44:39.941636 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-hmz4x"] Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.182921 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-dnlbg"] Mar 20 15:44:40 crc kubenswrapper[4779]: E0320 15:44:40.183362 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1435ea5c-3bed-456c-8fe3-931542c325a1" containerName="glance-db-sync" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.183378 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="1435ea5c-3bed-456c-8fe3-931542c325a1" containerName="glance-db-sync" Mar 20 15:44:40 crc kubenswrapper[4779]: E0320 15:44:40.183392 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c14b0cd-bac7-4ca6-b176-1c49747e9132" containerName="init" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.183398 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c14b0cd-bac7-4ca6-b176-1c49747e9132" containerName="init" Mar 20 15:44:40 crc kubenswrapper[4779]: E0320 
15:44:40.183412 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c14b0cd-bac7-4ca6-b176-1c49747e9132" containerName="dnsmasq-dns" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.183418 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c14b0cd-bac7-4ca6-b176-1c49747e9132" containerName="dnsmasq-dns" Mar 20 15:44:40 crc kubenswrapper[4779]: E0320 15:44:40.183433 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e18c51e-3afa-4453-9a88-ee43ca5e563b" containerName="mariadb-account-create-update" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.183441 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e18c51e-3afa-4453-9a88-ee43ca5e563b" containerName="mariadb-account-create-update" Mar 20 15:44:40 crc kubenswrapper[4779]: E0320 15:44:40.183452 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e10831a-7f1b-4066-bcf0-0c08ad7b16c0" containerName="mariadb-database-create" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.183459 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e10831a-7f1b-4066-bcf0-0c08ad7b16c0" containerName="mariadb-database-create" Mar 20 15:44:40 crc kubenswrapper[4779]: E0320 15:44:40.183473 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4db09a8d-78e6-403a-842d-4c138a9699be" containerName="mariadb-database-create" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.183480 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="4db09a8d-78e6-403a-842d-4c138a9699be" containerName="mariadb-database-create" Mar 20 15:44:40 crc kubenswrapper[4779]: E0320 15:44:40.183488 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f226989b-bb85-41dc-8e16-631a0740361a" containerName="mariadb-account-create-update" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.183494 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="f226989b-bb85-41dc-8e16-631a0740361a" 
containerName="mariadb-account-create-update" Mar 20 15:44:40 crc kubenswrapper[4779]: E0320 15:44:40.183505 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7de7ca4b-8d4c-41be-bd83-f1430eefc89f" containerName="mariadb-database-create" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.183511 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="7de7ca4b-8d4c-41be-bd83-f1430eefc89f" containerName="mariadb-database-create" Mar 20 15:44:40 crc kubenswrapper[4779]: E0320 15:44:40.183534 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79528236-508c-451f-906a-7ad8b3f638ef" containerName="mariadb-account-create-update" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.183542 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="79528236-508c-451f-906a-7ad8b3f638ef" containerName="mariadb-account-create-update" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.183724 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e10831a-7f1b-4066-bcf0-0c08ad7b16c0" containerName="mariadb-database-create" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.183760 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c14b0cd-bac7-4ca6-b176-1c49747e9132" containerName="dnsmasq-dns" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.183775 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e18c51e-3afa-4453-9a88-ee43ca5e563b" containerName="mariadb-account-create-update" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.183788 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="79528236-508c-451f-906a-7ad8b3f638ef" containerName="mariadb-account-create-update" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.183811 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="7de7ca4b-8d4c-41be-bd83-f1430eefc89f" containerName="mariadb-database-create" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 
15:44:40.183830 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="f226989b-bb85-41dc-8e16-631a0740361a" containerName="mariadb-account-create-update" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.183844 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="1435ea5c-3bed-456c-8fe3-931542c325a1" containerName="glance-db-sync" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.183860 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="4db09a8d-78e6-403a-842d-4c138a9699be" containerName="mariadb-database-create" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.185026 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-dnlbg" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.192167 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-dnlbg"] Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.349375 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04b39283-3b82-427d-a3bc-a2d64232222b-config\") pod \"dnsmasq-dns-7ff5475cc9-dnlbg\" (UID: \"04b39283-3b82-427d-a3bc-a2d64232222b\") " pod="openstack/dnsmasq-dns-7ff5475cc9-dnlbg" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.349440 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04b39283-3b82-427d-a3bc-a2d64232222b-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-dnlbg\" (UID: \"04b39283-3b82-427d-a3bc-a2d64232222b\") " pod="openstack/dnsmasq-dns-7ff5475cc9-dnlbg" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.349468 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2fpl\" (UniqueName: 
\"kubernetes.io/projected/04b39283-3b82-427d-a3bc-a2d64232222b-kube-api-access-r2fpl\") pod \"dnsmasq-dns-7ff5475cc9-dnlbg\" (UID: \"04b39283-3b82-427d-a3bc-a2d64232222b\") " pod="openstack/dnsmasq-dns-7ff5475cc9-dnlbg" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.349523 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04b39283-3b82-427d-a3bc-a2d64232222b-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-dnlbg\" (UID: \"04b39283-3b82-427d-a3bc-a2d64232222b\") " pod="openstack/dnsmasq-dns-7ff5475cc9-dnlbg" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.349584 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04b39283-3b82-427d-a3bc-a2d64232222b-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-dnlbg\" (UID: \"04b39283-3b82-427d-a3bc-a2d64232222b\") " pod="openstack/dnsmasq-dns-7ff5475cc9-dnlbg" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.349732 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04b39283-3b82-427d-a3bc-a2d64232222b-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-dnlbg\" (UID: \"04b39283-3b82-427d-a3bc-a2d64232222b\") " pod="openstack/dnsmasq-dns-7ff5475cc9-dnlbg" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.451026 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04b39283-3b82-427d-a3bc-a2d64232222b-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-dnlbg\" (UID: \"04b39283-3b82-427d-a3bc-a2d64232222b\") " pod="openstack/dnsmasq-dns-7ff5475cc9-dnlbg" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.451297 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/04b39283-3b82-427d-a3bc-a2d64232222b-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-dnlbg\" (UID: \"04b39283-3b82-427d-a3bc-a2d64232222b\") " pod="openstack/dnsmasq-dns-7ff5475cc9-dnlbg" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.451443 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04b39283-3b82-427d-a3bc-a2d64232222b-config\") pod \"dnsmasq-dns-7ff5475cc9-dnlbg\" (UID: \"04b39283-3b82-427d-a3bc-a2d64232222b\") " pod="openstack/dnsmasq-dns-7ff5475cc9-dnlbg" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.451589 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04b39283-3b82-427d-a3bc-a2d64232222b-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-dnlbg\" (UID: \"04b39283-3b82-427d-a3bc-a2d64232222b\") " pod="openstack/dnsmasq-dns-7ff5475cc9-dnlbg" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.451727 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2fpl\" (UniqueName: \"kubernetes.io/projected/04b39283-3b82-427d-a3bc-a2d64232222b-kube-api-access-r2fpl\") pod \"dnsmasq-dns-7ff5475cc9-dnlbg\" (UID: \"04b39283-3b82-427d-a3bc-a2d64232222b\") " pod="openstack/dnsmasq-dns-7ff5475cc9-dnlbg" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.451837 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04b39283-3b82-427d-a3bc-a2d64232222b-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-dnlbg\" (UID: \"04b39283-3b82-427d-a3bc-a2d64232222b\") " pod="openstack/dnsmasq-dns-7ff5475cc9-dnlbg" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.452200 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/04b39283-3b82-427d-a3bc-a2d64232222b-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-dnlbg\" (UID: \"04b39283-3b82-427d-a3bc-a2d64232222b\") " pod="openstack/dnsmasq-dns-7ff5475cc9-dnlbg" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.452326 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04b39283-3b82-427d-a3bc-a2d64232222b-config\") pod \"dnsmasq-dns-7ff5475cc9-dnlbg\" (UID: \"04b39283-3b82-427d-a3bc-a2d64232222b\") " pod="openstack/dnsmasq-dns-7ff5475cc9-dnlbg" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.452326 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04b39283-3b82-427d-a3bc-a2d64232222b-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-dnlbg\" (UID: \"04b39283-3b82-427d-a3bc-a2d64232222b\") " pod="openstack/dnsmasq-dns-7ff5475cc9-dnlbg" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.452541 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04b39283-3b82-427d-a3bc-a2d64232222b-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-dnlbg\" (UID: \"04b39283-3b82-427d-a3bc-a2d64232222b\") " pod="openstack/dnsmasq-dns-7ff5475cc9-dnlbg" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.452977 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04b39283-3b82-427d-a3bc-a2d64232222b-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-dnlbg\" (UID: \"04b39283-3b82-427d-a3bc-a2d64232222b\") " pod="openstack/dnsmasq-dns-7ff5475cc9-dnlbg" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.485229 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2fpl\" (UniqueName: \"kubernetes.io/projected/04b39283-3b82-427d-a3bc-a2d64232222b-kube-api-access-r2fpl\") pod 
\"dnsmasq-dns-7ff5475cc9-dnlbg\" (UID: \"04b39283-3b82-427d-a3bc-a2d64232222b\") " pod="openstack/dnsmasq-dns-7ff5475cc9-dnlbg" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.506205 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-dnlbg" Mar 20 15:44:40 crc kubenswrapper[4779]: I0320 15:44:40.968781 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-dnlbg"] Mar 20 15:44:40 crc kubenswrapper[4779]: W0320 15:44:40.978408 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04b39283_3b82_427d_a3bc_a2d64232222b.slice/crio-b0b72ce3c4b82277f97fb3b39608b483d7bd2623ecb8d99a8cd9edf58e4ef9a6 WatchSource:0}: Error finding container b0b72ce3c4b82277f97fb3b39608b483d7bd2623ecb8d99a8cd9edf58e4ef9a6: Status 404 returned error can't find the container with id b0b72ce3c4b82277f97fb3b39608b483d7bd2623ecb8d99a8cd9edf58e4ef9a6 Mar 20 15:44:41 crc kubenswrapper[4779]: I0320 15:44:41.824138 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c14b0cd-bac7-4ca6-b176-1c49747e9132" path="/var/lib/kubelet/pods/1c14b0cd-bac7-4ca6-b176-1c49747e9132/volumes" Mar 20 15:44:41 crc kubenswrapper[4779]: I0320 15:44:41.890791 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"301642e3-d84f-41af-8795-8042fdccdade","Type":"ContainerStarted","Data":"9a06996bcb2c68cf44a7c9d46e3952e119f39c9c7a4b5fb6b9c26ac932c0ad5e"} Mar 20 15:44:41 crc kubenswrapper[4779]: I0320 15:44:41.895931 4779 generic.go:334] "Generic (PLEG): container finished" podID="04b39283-3b82-427d-a3bc-a2d64232222b" containerID="85fff7c1d337408330f6cea731a303662239a71b6a9c4b99277f2c220bd553aa" exitCode=0 Mar 20 15:44:41 crc kubenswrapper[4779]: I0320 15:44:41.895999 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-dnlbg" 
event={"ID":"04b39283-3b82-427d-a3bc-a2d64232222b","Type":"ContainerDied","Data":"85fff7c1d337408330f6cea731a303662239a71b6a9c4b99277f2c220bd553aa"} Mar 20 15:44:41 crc kubenswrapper[4779]: I0320 15:44:41.896032 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-dnlbg" event={"ID":"04b39283-3b82-427d-a3bc-a2d64232222b","Type":"ContainerStarted","Data":"b0b72ce3c4b82277f97fb3b39608b483d7bd2623ecb8d99a8cd9edf58e4ef9a6"} Mar 20 15:44:42 crc kubenswrapper[4779]: I0320 15:44:42.904418 4779 generic.go:334] "Generic (PLEG): container finished" podID="9a288511-2254-442c-a931-bd59dd0a3b29" containerID="1d66475bf2a6f89bf954aca5541d1b52ab4aebd06b56aea9ec4f52acf4b5260e" exitCode=0 Mar 20 15:44:42 crc kubenswrapper[4779]: I0320 15:44:42.904507 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vkj28" event={"ID":"9a288511-2254-442c-a931-bd59dd0a3b29","Type":"ContainerDied","Data":"1d66475bf2a6f89bf954aca5541d1b52ab4aebd06b56aea9ec4f52acf4b5260e"} Mar 20 15:44:42 crc kubenswrapper[4779]: I0320 15:44:42.907992 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-dnlbg" event={"ID":"04b39283-3b82-427d-a3bc-a2d64232222b","Type":"ContainerStarted","Data":"f831bc5b22ad79cdfde533fc0cc510acf3b74f6dc399c6ff3d08a15b96d254a5"} Mar 20 15:44:42 crc kubenswrapper[4779]: I0320 15:44:42.908028 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7ff5475cc9-dnlbg" Mar 20 15:44:42 crc kubenswrapper[4779]: I0320 15:44:42.945471 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7ff5475cc9-dnlbg" podStartSLOduration=2.945454092 podStartE2EDuration="2.945454092s" podCreationTimestamp="2026-03-20 15:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:44:42.936413336 +0000 UTC 
m=+1299.898929146" watchObservedRunningTime="2026-03-20 15:44:42.945454092 +0000 UTC m=+1299.907969892" Mar 20 15:44:44 crc kubenswrapper[4779]: I0320 15:44:44.257407 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vkj28" Mar 20 15:44:44 crc kubenswrapper[4779]: I0320 15:44:44.314767 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a288511-2254-442c-a931-bd59dd0a3b29-combined-ca-bundle\") pod \"9a288511-2254-442c-a931-bd59dd0a3b29\" (UID: \"9a288511-2254-442c-a931-bd59dd0a3b29\") " Mar 20 15:44:44 crc kubenswrapper[4779]: I0320 15:44:44.314890 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbdtl\" (UniqueName: \"kubernetes.io/projected/9a288511-2254-442c-a931-bd59dd0a3b29-kube-api-access-zbdtl\") pod \"9a288511-2254-442c-a931-bd59dd0a3b29\" (UID: \"9a288511-2254-442c-a931-bd59dd0a3b29\") " Mar 20 15:44:44 crc kubenswrapper[4779]: I0320 15:44:44.314965 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a288511-2254-442c-a931-bd59dd0a3b29-config-data\") pod \"9a288511-2254-442c-a931-bd59dd0a3b29\" (UID: \"9a288511-2254-442c-a931-bd59dd0a3b29\") " Mar 20 15:44:44 crc kubenswrapper[4779]: I0320 15:44:44.321494 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a288511-2254-442c-a931-bd59dd0a3b29-kube-api-access-zbdtl" (OuterVolumeSpecName: "kube-api-access-zbdtl") pod "9a288511-2254-442c-a931-bd59dd0a3b29" (UID: "9a288511-2254-442c-a931-bd59dd0a3b29"). InnerVolumeSpecName "kube-api-access-zbdtl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:44:44 crc kubenswrapper[4779]: I0320 15:44:44.353640 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a288511-2254-442c-a931-bd59dd0a3b29-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a288511-2254-442c-a931-bd59dd0a3b29" (UID: "9a288511-2254-442c-a931-bd59dd0a3b29"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:44:44 crc kubenswrapper[4779]: I0320 15:44:44.379671 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a288511-2254-442c-a931-bd59dd0a3b29-config-data" (OuterVolumeSpecName: "config-data") pod "9a288511-2254-442c-a931-bd59dd0a3b29" (UID: "9a288511-2254-442c-a931-bd59dd0a3b29"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:44:44 crc kubenswrapper[4779]: I0320 15:44:44.416781 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a288511-2254-442c-a931-bd59dd0a3b29-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:44 crc kubenswrapper[4779]: I0320 15:44:44.416820 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbdtl\" (UniqueName: \"kubernetes.io/projected/9a288511-2254-442c-a931-bd59dd0a3b29-kube-api-access-zbdtl\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:44 crc kubenswrapper[4779]: I0320 15:44:44.416835 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a288511-2254-442c-a931-bd59dd0a3b29-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:44 crc kubenswrapper[4779]: I0320 15:44:44.922189 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vkj28" 
event={"ID":"9a288511-2254-442c-a931-bd59dd0a3b29","Type":"ContainerDied","Data":"b52c5677b79ff52b29b795203f2ed8d8403d4ed69dbc3b7942dd2ecf26aa6d13"} Mar 20 15:44:44 crc kubenswrapper[4779]: I0320 15:44:44.922236 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b52c5677b79ff52b29b795203f2ed8d8403d4ed69dbc3b7942dd2ecf26aa6d13" Mar 20 15:44:44 crc kubenswrapper[4779]: I0320 15:44:44.922243 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vkj28" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.087215 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-dnlbg"] Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.087670 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7ff5475cc9-dnlbg" podUID="04b39283-3b82-427d-a3bc-a2d64232222b" containerName="dnsmasq-dns" containerID="cri-o://f831bc5b22ad79cdfde533fc0cc510acf3b74f6dc399c6ff3d08a15b96d254a5" gracePeriod=10 Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.129526 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-jh4bv"] Mar 20 15:44:45 crc kubenswrapper[4779]: E0320 15:44:45.130088 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a288511-2254-442c-a931-bd59dd0a3b29" containerName="keystone-db-sync" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.130125 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a288511-2254-442c-a931-bd59dd0a3b29" containerName="keystone-db-sync" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.130413 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a288511-2254-442c-a931-bd59dd0a3b29" containerName="keystone-db-sync" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.131311 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-jh4bv" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.135857 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.136095 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.136335 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lrbdn" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.139279 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.140741 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.158260 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-mfj9r"] Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.160007 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-mfj9r" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.173964 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jh4bv"] Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.230325 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44q8p\" (UniqueName: \"kubernetes.io/projected/a04e9147-27c2-436d-8225-14693475e475-kube-api-access-44q8p\") pod \"keystone-bootstrap-jh4bv\" (UID: \"a04e9147-27c2-436d-8225-14693475e475\") " pod="openstack/keystone-bootstrap-jh4bv" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.230549 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfvs5\" (UniqueName: \"kubernetes.io/projected/9a20155b-6ff9-4250-8b0c-38b41c21c67c-kube-api-access-hfvs5\") pod \"dnsmasq-dns-5c5cc7c5ff-mfj9r\" (UID: \"9a20155b-6ff9-4250-8b0c-38b41c21c67c\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-mfj9r" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.231141 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a20155b-6ff9-4250-8b0c-38b41c21c67c-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-mfj9r\" (UID: \"9a20155b-6ff9-4250-8b0c-38b41c21c67c\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-mfj9r" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.231231 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a04e9147-27c2-436d-8225-14693475e475-scripts\") pod \"keystone-bootstrap-jh4bv\" (UID: \"a04e9147-27c2-436d-8225-14693475e475\") " pod="openstack/keystone-bootstrap-jh4bv" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.231359 4779 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a04e9147-27c2-436d-8225-14693475e475-credential-keys\") pod \"keystone-bootstrap-jh4bv\" (UID: \"a04e9147-27c2-436d-8225-14693475e475\") " pod="openstack/keystone-bootstrap-jh4bv" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.231535 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a20155b-6ff9-4250-8b0c-38b41c21c67c-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-mfj9r\" (UID: \"9a20155b-6ff9-4250-8b0c-38b41c21c67c\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-mfj9r" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.231615 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a04e9147-27c2-436d-8225-14693475e475-fernet-keys\") pod \"keystone-bootstrap-jh4bv\" (UID: \"a04e9147-27c2-436d-8225-14693475e475\") " pod="openstack/keystone-bootstrap-jh4bv" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.231707 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a20155b-6ff9-4250-8b0c-38b41c21c67c-config\") pod \"dnsmasq-dns-5c5cc7c5ff-mfj9r\" (UID: \"9a20155b-6ff9-4250-8b0c-38b41c21c67c\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-mfj9r" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.231777 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a20155b-6ff9-4250-8b0c-38b41c21c67c-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-mfj9r\" (UID: \"9a20155b-6ff9-4250-8b0c-38b41c21c67c\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-mfj9r" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.231851 4779 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a04e9147-27c2-436d-8225-14693475e475-config-data\") pod \"keystone-bootstrap-jh4bv\" (UID: \"a04e9147-27c2-436d-8225-14693475e475\") " pod="openstack/keystone-bootstrap-jh4bv" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.231928 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a20155b-6ff9-4250-8b0c-38b41c21c67c-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-mfj9r\" (UID: \"9a20155b-6ff9-4250-8b0c-38b41c21c67c\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-mfj9r" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.232004 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a04e9147-27c2-436d-8225-14693475e475-combined-ca-bundle\") pod \"keystone-bootstrap-jh4bv\" (UID: \"a04e9147-27c2-436d-8225-14693475e475\") " pod="openstack/keystone-bootstrap-jh4bv" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.234654 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-mfj9r"] Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.347238 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-wjcm6"] Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.348345 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-wjcm6" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.352138 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a20155b-6ff9-4250-8b0c-38b41c21c67c-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-mfj9r\" (UID: \"9a20155b-6ff9-4250-8b0c-38b41c21c67c\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-mfj9r" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.352176 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a04e9147-27c2-436d-8225-14693475e475-fernet-keys\") pod \"keystone-bootstrap-jh4bv\" (UID: \"a04e9147-27c2-436d-8225-14693475e475\") " pod="openstack/keystone-bootstrap-jh4bv" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.352203 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a20155b-6ff9-4250-8b0c-38b41c21c67c-config\") pod \"dnsmasq-dns-5c5cc7c5ff-mfj9r\" (UID: \"9a20155b-6ff9-4250-8b0c-38b41c21c67c\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-mfj9r" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.352224 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a20155b-6ff9-4250-8b0c-38b41c21c67c-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-mfj9r\" (UID: \"9a20155b-6ff9-4250-8b0c-38b41c21c67c\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-mfj9r" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.352246 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a04e9147-27c2-436d-8225-14693475e475-config-data\") pod \"keystone-bootstrap-jh4bv\" (UID: \"a04e9147-27c2-436d-8225-14693475e475\") " pod="openstack/keystone-bootstrap-jh4bv" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 
15:44:45.352262 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a20155b-6ff9-4250-8b0c-38b41c21c67c-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-mfj9r\" (UID: \"9a20155b-6ff9-4250-8b0c-38b41c21c67c\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-mfj9r" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.352283 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a04e9147-27c2-436d-8225-14693475e475-combined-ca-bundle\") pod \"keystone-bootstrap-jh4bv\" (UID: \"a04e9147-27c2-436d-8225-14693475e475\") " pod="openstack/keystone-bootstrap-jh4bv" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.352308 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44q8p\" (UniqueName: \"kubernetes.io/projected/a04e9147-27c2-436d-8225-14693475e475-kube-api-access-44q8p\") pod \"keystone-bootstrap-jh4bv\" (UID: \"a04e9147-27c2-436d-8225-14693475e475\") " pod="openstack/keystone-bootstrap-jh4bv" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.352333 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfvs5\" (UniqueName: \"kubernetes.io/projected/9a20155b-6ff9-4250-8b0c-38b41c21c67c-kube-api-access-hfvs5\") pod \"dnsmasq-dns-5c5cc7c5ff-mfj9r\" (UID: \"9a20155b-6ff9-4250-8b0c-38b41c21c67c\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-mfj9r" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.352349 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a20155b-6ff9-4250-8b0c-38b41c21c67c-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-mfj9r\" (UID: \"9a20155b-6ff9-4250-8b0c-38b41c21c67c\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-mfj9r" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.352365 4779 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a04e9147-27c2-436d-8225-14693475e475-scripts\") pod \"keystone-bootstrap-jh4bv\" (UID: \"a04e9147-27c2-436d-8225-14693475e475\") " pod="openstack/keystone-bootstrap-jh4bv" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.352394 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a04e9147-27c2-436d-8225-14693475e475-credential-keys\") pod \"keystone-bootstrap-jh4bv\" (UID: \"a04e9147-27c2-436d-8225-14693475e475\") " pod="openstack/keystone-bootstrap-jh4bv" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.356898 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-c9kbg" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.356917 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a20155b-6ff9-4250-8b0c-38b41c21c67c-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-mfj9r\" (UID: \"9a20155b-6ff9-4250-8b0c-38b41c21c67c\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-mfj9r" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.357643 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a20155b-6ff9-4250-8b0c-38b41c21c67c-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-mfj9r\" (UID: \"9a20155b-6ff9-4250-8b0c-38b41c21c67c\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-mfj9r" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.360249 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a20155b-6ff9-4250-8b0c-38b41c21c67c-config\") pod \"dnsmasq-dns-5c5cc7c5ff-mfj9r\" (UID: \"9a20155b-6ff9-4250-8b0c-38b41c21c67c\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-mfj9r" Mar 20 15:44:45 crc 
kubenswrapper[4779]: I0320 15:44:45.364345 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.372326 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.373866 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a20155b-6ff9-4250-8b0c-38b41c21c67c-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-mfj9r\" (UID: \"9a20155b-6ff9-4250-8b0c-38b41c21c67c\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-mfj9r" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.376252 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a20155b-6ff9-4250-8b0c-38b41c21c67c-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-mfj9r\" (UID: \"9a20155b-6ff9-4250-8b0c-38b41c21c67c\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-mfj9r" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.378165 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a04e9147-27c2-436d-8225-14693475e475-combined-ca-bundle\") pod \"keystone-bootstrap-jh4bv\" (UID: \"a04e9147-27c2-436d-8225-14693475e475\") " pod="openstack/keystone-bootstrap-jh4bv" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.388814 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a04e9147-27c2-436d-8225-14693475e475-fernet-keys\") pod \"keystone-bootstrap-jh4bv\" (UID: \"a04e9147-27c2-436d-8225-14693475e475\") " pod="openstack/keystone-bootstrap-jh4bv" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.393635 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/a04e9147-27c2-436d-8225-14693475e475-credential-keys\") pod \"keystone-bootstrap-jh4bv\" (UID: \"a04e9147-27c2-436d-8225-14693475e475\") " pod="openstack/keystone-bootstrap-jh4bv" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.394034 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a04e9147-27c2-436d-8225-14693475e475-scripts\") pod \"keystone-bootstrap-jh4bv\" (UID: \"a04e9147-27c2-436d-8225-14693475e475\") " pod="openstack/keystone-bootstrap-jh4bv" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.407493 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-wjcm6"] Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.419249 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfvs5\" (UniqueName: \"kubernetes.io/projected/9a20155b-6ff9-4250-8b0c-38b41c21c67c-kube-api-access-hfvs5\") pod \"dnsmasq-dns-5c5cc7c5ff-mfj9r\" (UID: \"9a20155b-6ff9-4250-8b0c-38b41c21c67c\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-mfj9r" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.419867 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a04e9147-27c2-436d-8225-14693475e475-config-data\") pod \"keystone-bootstrap-jh4bv\" (UID: \"a04e9147-27c2-436d-8225-14693475e475\") " pod="openstack/keystone-bootstrap-jh4bv" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.424084 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44q8p\" (UniqueName: \"kubernetes.io/projected/a04e9147-27c2-436d-8225-14693475e475-kube-api-access-44q8p\") pod \"keystone-bootstrap-jh4bv\" (UID: \"a04e9147-27c2-436d-8225-14693475e475\") " pod="openstack/keystone-bootstrap-jh4bv" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.459482 4779 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/130a37da-c17e-48dd-8712-f87c67f01852-etc-machine-id\") pod \"cinder-db-sync-wjcm6\" (UID: \"130a37da-c17e-48dd-8712-f87c67f01852\") " pod="openstack/cinder-db-sync-wjcm6" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.459535 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/130a37da-c17e-48dd-8712-f87c67f01852-db-sync-config-data\") pod \"cinder-db-sync-wjcm6\" (UID: \"130a37da-c17e-48dd-8712-f87c67f01852\") " pod="openstack/cinder-db-sync-wjcm6" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.459564 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p64mw\" (UniqueName: \"kubernetes.io/projected/130a37da-c17e-48dd-8712-f87c67f01852-kube-api-access-p64mw\") pod \"cinder-db-sync-wjcm6\" (UID: \"130a37da-c17e-48dd-8712-f87c67f01852\") " pod="openstack/cinder-db-sync-wjcm6" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.459590 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/130a37da-c17e-48dd-8712-f87c67f01852-combined-ca-bundle\") pod \"cinder-db-sync-wjcm6\" (UID: \"130a37da-c17e-48dd-8712-f87c67f01852\") " pod="openstack/cinder-db-sync-wjcm6" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.459632 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/130a37da-c17e-48dd-8712-f87c67f01852-scripts\") pod \"cinder-db-sync-wjcm6\" (UID: \"130a37da-c17e-48dd-8712-f87c67f01852\") " pod="openstack/cinder-db-sync-wjcm6" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.459677 4779 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/130a37da-c17e-48dd-8712-f87c67f01852-config-data\") pod \"cinder-db-sync-wjcm6\" (UID: \"130a37da-c17e-48dd-8712-f87c67f01852\") " pod="openstack/cinder-db-sync-wjcm6" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.473151 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-689d9485bf-d7qg8"] Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.474704 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-689d9485bf-d7qg8" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.481998 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-689d9485bf-d7qg8"] Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.484658 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.484815 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.484919 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.485017 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-g7pfz" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.495490 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jh4bv" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.502382 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.504459 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.517608 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.517836 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.518915 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-gwmgm"] Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.520095 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-gwmgm" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.542315 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.542534 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-ggdr6" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.542774 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.542913 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-gwmgm"] Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.574261 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-mfj9r" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.576163 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/130a37da-c17e-48dd-8712-f87c67f01852-config-data\") pod \"cinder-db-sync-wjcm6\" (UID: \"130a37da-c17e-48dd-8712-f87c67f01852\") " pod="openstack/cinder-db-sync-wjcm6" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.576208 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0b61924e-c7b3-4017-9e77-9a170fdcda23-config-data\") pod \"horizon-689d9485bf-d7qg8\" (UID: \"0b61924e-c7b3-4017-9e77-9a170fdcda23\") " pod="openstack/horizon-689d9485bf-d7qg8" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.576243 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b61924e-c7b3-4017-9e77-9a170fdcda23-logs\") pod \"horizon-689d9485bf-d7qg8\" (UID: \"0b61924e-c7b3-4017-9e77-9a170fdcda23\") " pod="openstack/horizon-689d9485bf-d7qg8" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.576265 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2caab026-6b85-4015-9190-29d8ef94a58f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2caab026-6b85-4015-9190-29d8ef94a58f\") " pod="openstack/ceilometer-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.576296 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2caab026-6b85-4015-9190-29d8ef94a58f-scripts\") pod \"ceilometer-0\" (UID: \"2caab026-6b85-4015-9190-29d8ef94a58f\") " pod="openstack/ceilometer-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 
15:44:45.576326 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b61924e-c7b3-4017-9e77-9a170fdcda23-scripts\") pod \"horizon-689d9485bf-d7qg8\" (UID: \"0b61924e-c7b3-4017-9e77-9a170fdcda23\") " pod="openstack/horizon-689d9485bf-d7qg8" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.576362 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/130a37da-c17e-48dd-8712-f87c67f01852-etc-machine-id\") pod \"cinder-db-sync-wjcm6\" (UID: \"130a37da-c17e-48dd-8712-f87c67f01852\") " pod="openstack/cinder-db-sync-wjcm6" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.576384 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2caab026-6b85-4015-9190-29d8ef94a58f-run-httpd\") pod \"ceilometer-0\" (UID: \"2caab026-6b85-4015-9190-29d8ef94a58f\") " pod="openstack/ceilometer-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.576403 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/130a37da-c17e-48dd-8712-f87c67f01852-db-sync-config-data\") pod \"cinder-db-sync-wjcm6\" (UID: \"130a37da-c17e-48dd-8712-f87c67f01852\") " pod="openstack/cinder-db-sync-wjcm6" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.576419 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2caab026-6b85-4015-9190-29d8ef94a58f-log-httpd\") pod \"ceilometer-0\" (UID: \"2caab026-6b85-4015-9190-29d8ef94a58f\") " pod="openstack/ceilometer-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.576434 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2caab026-6b85-4015-9190-29d8ef94a58f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2caab026-6b85-4015-9190-29d8ef94a58f\") " pod="openstack/ceilometer-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.576455 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d35f1456-b579-484e-bbb4-75ced186fdda-combined-ca-bundle\") pod \"neutron-db-sync-gwmgm\" (UID: \"d35f1456-b579-484e-bbb4-75ced186fdda\") " pod="openstack/neutron-db-sync-gwmgm" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.576479 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p64mw\" (UniqueName: \"kubernetes.io/projected/130a37da-c17e-48dd-8712-f87c67f01852-kube-api-access-p64mw\") pod \"cinder-db-sync-wjcm6\" (UID: \"130a37da-c17e-48dd-8712-f87c67f01852\") " pod="openstack/cinder-db-sync-wjcm6" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.576497 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqjtr\" (UniqueName: \"kubernetes.io/projected/2caab026-6b85-4015-9190-29d8ef94a58f-kube-api-access-qqjtr\") pod \"ceilometer-0\" (UID: \"2caab026-6b85-4015-9190-29d8ef94a58f\") " pod="openstack/ceilometer-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.576515 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0b61924e-c7b3-4017-9e77-9a170fdcda23-horizon-secret-key\") pod \"horizon-689d9485bf-d7qg8\" (UID: \"0b61924e-c7b3-4017-9e77-9a170fdcda23\") " pod="openstack/horizon-689d9485bf-d7qg8" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.576540 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/130a37da-c17e-48dd-8712-f87c67f01852-combined-ca-bundle\") pod \"cinder-db-sync-wjcm6\" (UID: \"130a37da-c17e-48dd-8712-f87c67f01852\") " pod="openstack/cinder-db-sync-wjcm6" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.576559 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv57h\" (UniqueName: \"kubernetes.io/projected/d35f1456-b579-484e-bbb4-75ced186fdda-kube-api-access-vv57h\") pod \"neutron-db-sync-gwmgm\" (UID: \"d35f1456-b579-484e-bbb4-75ced186fdda\") " pod="openstack/neutron-db-sync-gwmgm" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.576593 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qwtp\" (UniqueName: \"kubernetes.io/projected/0b61924e-c7b3-4017-9e77-9a170fdcda23-kube-api-access-6qwtp\") pod \"horizon-689d9485bf-d7qg8\" (UID: \"0b61924e-c7b3-4017-9e77-9a170fdcda23\") " pod="openstack/horizon-689d9485bf-d7qg8" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.576608 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2caab026-6b85-4015-9190-29d8ef94a58f-config-data\") pod \"ceilometer-0\" (UID: \"2caab026-6b85-4015-9190-29d8ef94a58f\") " pod="openstack/ceilometer-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.576624 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d35f1456-b579-484e-bbb4-75ced186fdda-config\") pod \"neutron-db-sync-gwmgm\" (UID: \"d35f1456-b579-484e-bbb4-75ced186fdda\") " pod="openstack/neutron-db-sync-gwmgm" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.576645 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/130a37da-c17e-48dd-8712-f87c67f01852-scripts\") pod \"cinder-db-sync-wjcm6\" (UID: \"130a37da-c17e-48dd-8712-f87c67f01852\") " pod="openstack/cinder-db-sync-wjcm6" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.582606 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/130a37da-c17e-48dd-8712-f87c67f01852-etc-machine-id\") pod \"cinder-db-sync-wjcm6\" (UID: \"130a37da-c17e-48dd-8712-f87c67f01852\") " pod="openstack/cinder-db-sync-wjcm6" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.586006 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.589301 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/130a37da-c17e-48dd-8712-f87c67f01852-db-sync-config-data\") pod \"cinder-db-sync-wjcm6\" (UID: \"130a37da-c17e-48dd-8712-f87c67f01852\") " pod="openstack/cinder-db-sync-wjcm6" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.607190 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/130a37da-c17e-48dd-8712-f87c67f01852-config-data\") pod \"cinder-db-sync-wjcm6\" (UID: \"130a37da-c17e-48dd-8712-f87c67f01852\") " pod="openstack/cinder-db-sync-wjcm6" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.607551 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/130a37da-c17e-48dd-8712-f87c67f01852-scripts\") pod \"cinder-db-sync-wjcm6\" (UID: \"130a37da-c17e-48dd-8712-f87c67f01852\") " pod="openstack/cinder-db-sync-wjcm6" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.630304 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/130a37da-c17e-48dd-8712-f87c67f01852-combined-ca-bundle\") pod \"cinder-db-sync-wjcm6\" (UID: \"130a37da-c17e-48dd-8712-f87c67f01852\") " pod="openstack/cinder-db-sync-wjcm6" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.633862 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p64mw\" (UniqueName: \"kubernetes.io/projected/130a37da-c17e-48dd-8712-f87c67f01852-kube-api-access-p64mw\") pod \"cinder-db-sync-wjcm6\" (UID: \"130a37da-c17e-48dd-8712-f87c67f01852\") " pod="openstack/cinder-db-sync-wjcm6" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.662653 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-xjstn"] Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.663819 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xjstn" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.671526 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.671723 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-s67ld" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.677823 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qwtp\" (UniqueName: \"kubernetes.io/projected/0b61924e-c7b3-4017-9e77-9a170fdcda23-kube-api-access-6qwtp\") pod \"horizon-689d9485bf-d7qg8\" (UID: \"0b61924e-c7b3-4017-9e77-9a170fdcda23\") " pod="openstack/horizon-689d9485bf-d7qg8" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.677878 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2caab026-6b85-4015-9190-29d8ef94a58f-config-data\") pod \"ceilometer-0\" (UID: \"2caab026-6b85-4015-9190-29d8ef94a58f\") " pod="openstack/ceilometer-0" Mar 
20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.677906 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d35f1456-b579-484e-bbb4-75ced186fdda-config\") pod \"neutron-db-sync-gwmgm\" (UID: \"d35f1456-b579-484e-bbb4-75ced186fdda\") " pod="openstack/neutron-db-sync-gwmgm" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.677953 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0b61924e-c7b3-4017-9e77-9a170fdcda23-config-data\") pod \"horizon-689d9485bf-d7qg8\" (UID: \"0b61924e-c7b3-4017-9e77-9a170fdcda23\") " pod="openstack/horizon-689d9485bf-d7qg8" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.677992 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b61924e-c7b3-4017-9e77-9a170fdcda23-logs\") pod \"horizon-689d9485bf-d7qg8\" (UID: \"0b61924e-c7b3-4017-9e77-9a170fdcda23\") " pod="openstack/horizon-689d9485bf-d7qg8" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.678013 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2caab026-6b85-4015-9190-29d8ef94a58f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2caab026-6b85-4015-9190-29d8ef94a58f\") " pod="openstack/ceilometer-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.678039 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2caab026-6b85-4015-9190-29d8ef94a58f-scripts\") pod \"ceilometer-0\" (UID: \"2caab026-6b85-4015-9190-29d8ef94a58f\") " pod="openstack/ceilometer-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.678074 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/0b61924e-c7b3-4017-9e77-9a170fdcda23-scripts\") pod \"horizon-689d9485bf-d7qg8\" (UID: \"0b61924e-c7b3-4017-9e77-9a170fdcda23\") " pod="openstack/horizon-689d9485bf-d7qg8" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.679335 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2caab026-6b85-4015-9190-29d8ef94a58f-run-httpd\") pod \"ceilometer-0\" (UID: \"2caab026-6b85-4015-9190-29d8ef94a58f\") " pod="openstack/ceilometer-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.679365 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2caab026-6b85-4015-9190-29d8ef94a58f-log-httpd\") pod \"ceilometer-0\" (UID: \"2caab026-6b85-4015-9190-29d8ef94a58f\") " pod="openstack/ceilometer-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.679382 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2caab026-6b85-4015-9190-29d8ef94a58f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2caab026-6b85-4015-9190-29d8ef94a58f\") " pod="openstack/ceilometer-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.679406 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d35f1456-b579-484e-bbb4-75ced186fdda-combined-ca-bundle\") pod \"neutron-db-sync-gwmgm\" (UID: \"d35f1456-b579-484e-bbb4-75ced186fdda\") " pod="openstack/neutron-db-sync-gwmgm" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.679445 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqjtr\" (UniqueName: \"kubernetes.io/projected/2caab026-6b85-4015-9190-29d8ef94a58f-kube-api-access-qqjtr\") pod \"ceilometer-0\" (UID: \"2caab026-6b85-4015-9190-29d8ef94a58f\") " 
pod="openstack/ceilometer-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.679464 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0b61924e-c7b3-4017-9e77-9a170fdcda23-horizon-secret-key\") pod \"horizon-689d9485bf-d7qg8\" (UID: \"0b61924e-c7b3-4017-9e77-9a170fdcda23\") " pod="openstack/horizon-689d9485bf-d7qg8" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.679578 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv57h\" (UniqueName: \"kubernetes.io/projected/d35f1456-b579-484e-bbb4-75ced186fdda-kube-api-access-vv57h\") pod \"neutron-db-sync-gwmgm\" (UID: \"d35f1456-b579-484e-bbb4-75ced186fdda\") " pod="openstack/neutron-db-sync-gwmgm" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.680100 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b61924e-c7b3-4017-9e77-9a170fdcda23-logs\") pod \"horizon-689d9485bf-d7qg8\" (UID: \"0b61924e-c7b3-4017-9e77-9a170fdcda23\") " pod="openstack/horizon-689d9485bf-d7qg8" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.680830 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0b61924e-c7b3-4017-9e77-9a170fdcda23-config-data\") pod \"horizon-689d9485bf-d7qg8\" (UID: \"0b61924e-c7b3-4017-9e77-9a170fdcda23\") " pod="openstack/horizon-689d9485bf-d7qg8" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.681215 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2caab026-6b85-4015-9190-29d8ef94a58f-log-httpd\") pod \"ceilometer-0\" (UID: \"2caab026-6b85-4015-9190-29d8ef94a58f\") " pod="openstack/ceilometer-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.681488 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2caab026-6b85-4015-9190-29d8ef94a58f-run-httpd\") pod \"ceilometer-0\" (UID: \"2caab026-6b85-4015-9190-29d8ef94a58f\") " pod="openstack/ceilometer-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.681631 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b61924e-c7b3-4017-9e77-9a170fdcda23-scripts\") pod \"horizon-689d9485bf-d7qg8\" (UID: \"0b61924e-c7b3-4017-9e77-9a170fdcda23\") " pod="openstack/horizon-689d9485bf-d7qg8" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.686183 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2caab026-6b85-4015-9190-29d8ef94a58f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2caab026-6b85-4015-9190-29d8ef94a58f\") " pod="openstack/ceilometer-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.689762 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2caab026-6b85-4015-9190-29d8ef94a58f-config-data\") pod \"ceilometer-0\" (UID: \"2caab026-6b85-4015-9190-29d8ef94a58f\") " pod="openstack/ceilometer-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.695333 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2caab026-6b85-4015-9190-29d8ef94a58f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2caab026-6b85-4015-9190-29d8ef94a58f\") " pod="openstack/ceilometer-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.697736 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d35f1456-b579-484e-bbb4-75ced186fdda-combined-ca-bundle\") pod \"neutron-db-sync-gwmgm\" (UID: \"d35f1456-b579-484e-bbb4-75ced186fdda\") " pod="openstack/neutron-db-sync-gwmgm" Mar 20 15:44:45 crc 
kubenswrapper[4779]: I0320 15:44:45.702688 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d35f1456-b579-484e-bbb4-75ced186fdda-config\") pod \"neutron-db-sync-gwmgm\" (UID: \"d35f1456-b579-484e-bbb4-75ced186fdda\") " pod="openstack/neutron-db-sync-gwmgm" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.712256 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0b61924e-c7b3-4017-9e77-9a170fdcda23-horizon-secret-key\") pod \"horizon-689d9485bf-d7qg8\" (UID: \"0b61924e-c7b3-4017-9e77-9a170fdcda23\") " pod="openstack/horizon-689d9485bf-d7qg8" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.712315 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-mx4bn"] Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.712902 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2caab026-6b85-4015-9190-29d8ef94a58f-scripts\") pod \"ceilometer-0\" (UID: \"2caab026-6b85-4015-9190-29d8ef94a58f\") " pod="openstack/ceilometer-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.716292 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqjtr\" (UniqueName: \"kubernetes.io/projected/2caab026-6b85-4015-9190-29d8ef94a58f-kube-api-access-qqjtr\") pod \"ceilometer-0\" (UID: \"2caab026-6b85-4015-9190-29d8ef94a58f\") " pod="openstack/ceilometer-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.716719 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-mx4bn" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.723425 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-92cdc" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.723700 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.725488 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.730509 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv57h\" (UniqueName: \"kubernetes.io/projected/d35f1456-b579-484e-bbb4-75ced186fdda-kube-api-access-vv57h\") pod \"neutron-db-sync-gwmgm\" (UID: \"d35f1456-b579-484e-bbb4-75ced186fdda\") " pod="openstack/neutron-db-sync-gwmgm" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.730575 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-xjstn"] Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.737838 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qwtp\" (UniqueName: \"kubernetes.io/projected/0b61924e-c7b3-4017-9e77-9a170fdcda23-kube-api-access-6qwtp\") pod \"horizon-689d9485bf-d7qg8\" (UID: \"0b61924e-c7b3-4017-9e77-9a170fdcda23\") " pod="openstack/horizon-689d9485bf-d7qg8" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.773274 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-mx4bn"] Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.784876 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-mfj9r"] Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.787049 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b67fdca5-13d1-4e83-8834-03856ca956b5-db-sync-config-data\") pod \"barbican-db-sync-xjstn\" (UID: \"b67fdca5-13d1-4e83-8834-03856ca956b5\") " pod="openstack/barbican-db-sync-xjstn" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.787151 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b67fdca5-13d1-4e83-8834-03856ca956b5-combined-ca-bundle\") pod \"barbican-db-sync-xjstn\" (UID: \"b67fdca5-13d1-4e83-8834-03856ca956b5\") " pod="openstack/barbican-db-sync-xjstn" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.787187 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbnsv\" (UniqueName: \"kubernetes.io/projected/b67fdca5-13d1-4e83-8834-03856ca956b5-kube-api-access-gbnsv\") pod \"barbican-db-sync-xjstn\" (UID: \"b67fdca5-13d1-4e83-8834-03856ca956b5\") " pod="openstack/barbican-db-sync-xjstn" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.787228 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bkh8\" (UniqueName: \"kubernetes.io/projected/85c5c01b-85a0-4714-81fe-31192c87b2fa-kube-api-access-5bkh8\") pod \"placement-db-sync-mx4bn\" (UID: \"85c5c01b-85a0-4714-81fe-31192c87b2fa\") " pod="openstack/placement-db-sync-mx4bn" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.787338 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85c5c01b-85a0-4714-81fe-31192c87b2fa-combined-ca-bundle\") pod \"placement-db-sync-mx4bn\" (UID: \"85c5c01b-85a0-4714-81fe-31192c87b2fa\") " pod="openstack/placement-db-sync-mx4bn" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.787396 4779 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85c5c01b-85a0-4714-81fe-31192c87b2fa-config-data\") pod \"placement-db-sync-mx4bn\" (UID: \"85c5c01b-85a0-4714-81fe-31192c87b2fa\") " pod="openstack/placement-db-sync-mx4bn" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.787439 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85c5c01b-85a0-4714-81fe-31192c87b2fa-logs\") pod \"placement-db-sync-mx4bn\" (UID: \"85c5c01b-85a0-4714-81fe-31192c87b2fa\") " pod="openstack/placement-db-sync-mx4bn" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.787606 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85c5c01b-85a0-4714-81fe-31192c87b2fa-scripts\") pod \"placement-db-sync-mx4bn\" (UID: \"85c5c01b-85a0-4714-81fe-31192c87b2fa\") " pod="openstack/placement-db-sync-mx4bn" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.810252 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.811828 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.812184 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-wjcm6" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.818675 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-pb8jv" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.818883 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.819025 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.820520 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.838359 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-854bf9d4d7-f9lw8"] Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.839817 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-854bf9d4d7-f9lw8" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.863683 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-854bf9d4d7-f9lw8"] Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.889031 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-logs\") pod \"glance-default-external-api-0\" (UID: \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.889309 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-846nm\" (UniqueName: \"kubernetes.io/projected/74512812-a6c7-40b6-bf73-155ed352ed3d-kube-api-access-846nm\") pod \"horizon-854bf9d4d7-f9lw8\" (UID: \"74512812-a6c7-40b6-bf73-155ed352ed3d\") " pod="openstack/horizon-854bf9d4d7-f9lw8" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.889335 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85c5c01b-85a0-4714-81fe-31192c87b2fa-scripts\") pod \"placement-db-sync-mx4bn\" (UID: \"85c5c01b-85a0-4714-81fe-31192c87b2fa\") " pod="openstack/placement-db-sync-mx4bn" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.889382 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.889401 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/b67fdca5-13d1-4e83-8834-03856ca956b5-db-sync-config-data\") pod \"barbican-db-sync-xjstn\" (UID: \"b67fdca5-13d1-4e83-8834-03856ca956b5\") " pod="openstack/barbican-db-sync-xjstn" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.889421 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74512812-a6c7-40b6-bf73-155ed352ed3d-config-data\") pod \"horizon-854bf9d4d7-f9lw8\" (UID: \"74512812-a6c7-40b6-bf73-155ed352ed3d\") " pod="openstack/horizon-854bf9d4d7-f9lw8" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.889438 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b67fdca5-13d1-4e83-8834-03856ca956b5-combined-ca-bundle\") pod \"barbican-db-sync-xjstn\" (UID: \"b67fdca5-13d1-4e83-8834-03856ca956b5\") " pod="openstack/barbican-db-sync-xjstn" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.889458 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-scripts\") pod \"glance-default-external-api-0\" (UID: \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.889486 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbnsv\" (UniqueName: \"kubernetes.io/projected/b67fdca5-13d1-4e83-8834-03856ca956b5-kube-api-access-gbnsv\") pod \"barbican-db-sync-xjstn\" (UID: \"b67fdca5-13d1-4e83-8834-03856ca956b5\") " pod="openstack/barbican-db-sync-xjstn" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.889505 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bkh8\" (UniqueName: 
\"kubernetes.io/projected/85c5c01b-85a0-4714-81fe-31192c87b2fa-kube-api-access-5bkh8\") pod \"placement-db-sync-mx4bn\" (UID: \"85c5c01b-85a0-4714-81fe-31192c87b2fa\") " pod="openstack/placement-db-sync-mx4bn" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.889652 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74512812-a6c7-40b6-bf73-155ed352ed3d-logs\") pod \"horizon-854bf9d4d7-f9lw8\" (UID: \"74512812-a6c7-40b6-bf73-155ed352ed3d\") " pod="openstack/horizon-854bf9d4d7-f9lw8" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.889677 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-config-data\") pod \"glance-default-external-api-0\" (UID: \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.889707 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pph8g\" (UniqueName: \"kubernetes.io/projected/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-kube-api-access-pph8g\") pod \"glance-default-external-api-0\" (UID: \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.889724 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/74512812-a6c7-40b6-bf73-155ed352ed3d-horizon-secret-key\") pod \"horizon-854bf9d4d7-f9lw8\" (UID: \"74512812-a6c7-40b6-bf73-155ed352ed3d\") " pod="openstack/horizon-854bf9d4d7-f9lw8" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.889743 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.889758 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.889776 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85c5c01b-85a0-4714-81fe-31192c87b2fa-combined-ca-bundle\") pod \"placement-db-sync-mx4bn\" (UID: \"85c5c01b-85a0-4714-81fe-31192c87b2fa\") " pod="openstack/placement-db-sync-mx4bn" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.889794 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85c5c01b-85a0-4714-81fe-31192c87b2fa-config-data\") pod \"placement-db-sync-mx4bn\" (UID: \"85c5c01b-85a0-4714-81fe-31192c87b2fa\") " pod="openstack/placement-db-sync-mx4bn" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.889815 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74512812-a6c7-40b6-bf73-155ed352ed3d-scripts\") pod \"horizon-854bf9d4d7-f9lw8\" (UID: \"74512812-a6c7-40b6-bf73-155ed352ed3d\") " pod="openstack/horizon-854bf9d4d7-f9lw8" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.889841 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.889867 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85c5c01b-85a0-4714-81fe-31192c87b2fa-logs\") pod \"placement-db-sync-mx4bn\" (UID: \"85c5c01b-85a0-4714-81fe-31192c87b2fa\") " pod="openstack/placement-db-sync-mx4bn" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.897634 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85c5c01b-85a0-4714-81fe-31192c87b2fa-logs\") pod \"placement-db-sync-mx4bn\" (UID: \"85c5c01b-85a0-4714-81fe-31192c87b2fa\") " pod="openstack/placement-db-sync-mx4bn" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.905935 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85c5c01b-85a0-4714-81fe-31192c87b2fa-scripts\") pod \"placement-db-sync-mx4bn\" (UID: \"85c5c01b-85a0-4714-81fe-31192c87b2fa\") " pod="openstack/placement-db-sync-mx4bn" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.906550 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b67fdca5-13d1-4e83-8834-03856ca956b5-db-sync-config-data\") pod \"barbican-db-sync-xjstn\" (UID: \"b67fdca5-13d1-4e83-8834-03856ca956b5\") " pod="openstack/barbican-db-sync-xjstn" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.906602 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b67fdca5-13d1-4e83-8834-03856ca956b5-combined-ca-bundle\") pod \"barbican-db-sync-xjstn\" (UID: \"b67fdca5-13d1-4e83-8834-03856ca956b5\") " 
pod="openstack/barbican-db-sync-xjstn" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.906937 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85c5c01b-85a0-4714-81fe-31192c87b2fa-config-data\") pod \"placement-db-sync-mx4bn\" (UID: \"85c5c01b-85a0-4714-81fe-31192c87b2fa\") " pod="openstack/placement-db-sync-mx4bn" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.913850 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85c5c01b-85a0-4714-81fe-31192c87b2fa-combined-ca-bundle\") pod \"placement-db-sync-mx4bn\" (UID: \"85c5c01b-85a0-4714-81fe-31192c87b2fa\") " pod="openstack/placement-db-sync-mx4bn" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.915681 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-k84c5"] Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.917729 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-k84c5" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.924776 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bkh8\" (UniqueName: \"kubernetes.io/projected/85c5c01b-85a0-4714-81fe-31192c87b2fa-kube-api-access-5bkh8\") pod \"placement-db-sync-mx4bn\" (UID: \"85c5c01b-85a0-4714-81fe-31192c87b2fa\") " pod="openstack/placement-db-sync-mx4bn" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.936547 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-689d9485bf-d7qg8" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.948030 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.953801 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbnsv\" (UniqueName: \"kubernetes.io/projected/b67fdca5-13d1-4e83-8834-03856ca956b5-kube-api-access-gbnsv\") pod \"barbican-db-sync-xjstn\" (UID: \"b67fdca5-13d1-4e83-8834-03856ca956b5\") " pod="openstack/barbican-db-sync-xjstn" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.969070 4779 generic.go:334] "Generic (PLEG): container finished" podID="04b39283-3b82-427d-a3bc-a2d64232222b" containerID="f831bc5b22ad79cdfde533fc0cc510acf3b74f6dc399c6ff3d08a15b96d254a5" exitCode=0 Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.969141 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-dnlbg" event={"ID":"04b39283-3b82-427d-a3bc-a2d64232222b","Type":"ContainerDied","Data":"f831bc5b22ad79cdfde533fc0cc510acf3b74f6dc399c6ff3d08a15b96d254a5"} Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.981568 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.991647 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-scripts\") pod \"glance-default-external-api-0\" (UID: \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.991703 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d73c6bfb-5dbf-4f62-8214-0b39d3558cbc-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-k84c5\" (UID: \"d73c6bfb-5dbf-4f62-8214-0b39d3558cbc\") " pod="openstack/dnsmasq-dns-8b5c85b87-k84c5" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.991739 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d73c6bfb-5dbf-4f62-8214-0b39d3558cbc-config\") pod \"dnsmasq-dns-8b5c85b87-k84c5\" (UID: \"d73c6bfb-5dbf-4f62-8214-0b39d3558cbc\") " pod="openstack/dnsmasq-dns-8b5c85b87-k84c5" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.991758 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d73c6bfb-5dbf-4f62-8214-0b39d3558cbc-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-k84c5\" (UID: \"d73c6bfb-5dbf-4f62-8214-0b39d3558cbc\") " pod="openstack/dnsmasq-dns-8b5c85b87-k84c5" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.991810 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74512812-a6c7-40b6-bf73-155ed352ed3d-logs\") pod \"horizon-854bf9d4d7-f9lw8\" (UID: \"74512812-a6c7-40b6-bf73-155ed352ed3d\") " pod="openstack/horizon-854bf9d4d7-f9lw8" Mar 20 15:44:45 crc 
kubenswrapper[4779]: I0320 15:44:45.991838 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-config-data\") pod \"glance-default-external-api-0\" (UID: \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.991866 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pph8g\" (UniqueName: \"kubernetes.io/projected/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-kube-api-access-pph8g\") pod \"glance-default-external-api-0\" (UID: \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.991884 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/74512812-a6c7-40b6-bf73-155ed352ed3d-horizon-secret-key\") pod \"horizon-854bf9d4d7-f9lw8\" (UID: \"74512812-a6c7-40b6-bf73-155ed352ed3d\") " pod="openstack/horizon-854bf9d4d7-f9lw8" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.991901 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.991920 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 
15:44:45.991937 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d73c6bfb-5dbf-4f62-8214-0b39d3558cbc-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-k84c5\" (UID: \"d73c6bfb-5dbf-4f62-8214-0b39d3558cbc\") " pod="openstack/dnsmasq-dns-8b5c85b87-k84c5" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.991953 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d73c6bfb-5dbf-4f62-8214-0b39d3558cbc-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-k84c5\" (UID: \"d73c6bfb-5dbf-4f62-8214-0b39d3558cbc\") " pod="openstack/dnsmasq-dns-8b5c85b87-k84c5" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.991976 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74512812-a6c7-40b6-bf73-155ed352ed3d-scripts\") pod \"horizon-854bf9d4d7-f9lw8\" (UID: \"74512812-a6c7-40b6-bf73-155ed352ed3d\") " pod="openstack/horizon-854bf9d4d7-f9lw8" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.991995 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.992045 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-logs\") pod \"glance-default-external-api-0\" (UID: \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.992064 4779 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-846nm\" (UniqueName: \"kubernetes.io/projected/74512812-a6c7-40b6-bf73-155ed352ed3d-kube-api-access-846nm\") pod \"horizon-854bf9d4d7-f9lw8\" (UID: \"74512812-a6c7-40b6-bf73-155ed352ed3d\") " pod="openstack/horizon-854bf9d4d7-f9lw8" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.992080 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l52j\" (UniqueName: \"kubernetes.io/projected/d73c6bfb-5dbf-4f62-8214-0b39d3558cbc-kube-api-access-8l52j\") pod \"dnsmasq-dns-8b5c85b87-k84c5\" (UID: \"d73c6bfb-5dbf-4f62-8214-0b39d3558cbc\") " pod="openstack/dnsmasq-dns-8b5c85b87-k84c5" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.992161 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.992199 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74512812-a6c7-40b6-bf73-155ed352ed3d-config-data\") pod \"horizon-854bf9d4d7-f9lw8\" (UID: \"74512812-a6c7-40b6-bf73-155ed352ed3d\") " pod="openstack/horizon-854bf9d4d7-f9lw8" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.993374 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74512812-a6c7-40b6-bf73-155ed352ed3d-config-data\") pod \"horizon-854bf9d4d7-f9lw8\" (UID: \"74512812-a6c7-40b6-bf73-155ed352ed3d\") " pod="openstack/horizon-854bf9d4d7-f9lw8" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.994204 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/74512812-a6c7-40b6-bf73-155ed352ed3d-logs\") pod \"horizon-854bf9d4d7-f9lw8\" (UID: \"74512812-a6c7-40b6-bf73-155ed352ed3d\") " pod="openstack/horizon-854bf9d4d7-f9lw8" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.994599 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.994724 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-logs\") pod \"glance-default-external-api-0\" (UID: \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.995013 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74512812-a6c7-40b6-bf73-155ed352ed3d-scripts\") pod \"horizon-854bf9d4d7-f9lw8\" (UID: \"74512812-a6c7-40b6-bf73-155ed352ed3d\") " pod="openstack/horizon-854bf9d4d7-f9lw8" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.995439 4779 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.996415 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-gwmgm" Mar 20 15:44:45 crc kubenswrapper[4779]: I0320 15:44:45.999243 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.000782 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.009494 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/74512812-a6c7-40b6-bf73-155ed352ed3d-horizon-secret-key\") pod \"horizon-854bf9d4d7-f9lw8\" (UID: \"74512812-a6c7-40b6-bf73-155ed352ed3d\") " pod="openstack/horizon-854bf9d4d7-f9lw8" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.011934 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-scripts\") pod \"glance-default-external-api-0\" (UID: \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.014215 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-config-data\") pod \"glance-default-external-api-0\" (UID: \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 
15:44:46.016559 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-846nm\" (UniqueName: \"kubernetes.io/projected/74512812-a6c7-40b6-bf73-155ed352ed3d-kube-api-access-846nm\") pod \"horizon-854bf9d4d7-f9lw8\" (UID: \"74512812-a6c7-40b6-bf73-155ed352ed3d\") " pod="openstack/horizon-854bf9d4d7-f9lw8" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.025249 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xjstn" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.029072 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pph8g\" (UniqueName: \"kubernetes.io/projected/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-kube-api-access-pph8g\") pod \"glance-default-external-api-0\" (UID: \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.063764 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-mx4bn" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.067309 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.069908 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-k84c5"] Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.099852 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d73c6bfb-5dbf-4f62-8214-0b39d3558cbc-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-k84c5\" (UID: \"d73c6bfb-5dbf-4f62-8214-0b39d3558cbc\") " pod="openstack/dnsmasq-dns-8b5c85b87-k84c5" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.099921 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d73c6bfb-5dbf-4f62-8214-0b39d3558cbc-config\") pod \"dnsmasq-dns-8b5c85b87-k84c5\" (UID: \"d73c6bfb-5dbf-4f62-8214-0b39d3558cbc\") " pod="openstack/dnsmasq-dns-8b5c85b87-k84c5" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.099938 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d73c6bfb-5dbf-4f62-8214-0b39d3558cbc-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-k84c5\" (UID: \"d73c6bfb-5dbf-4f62-8214-0b39d3558cbc\") " pod="openstack/dnsmasq-dns-8b5c85b87-k84c5" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.099995 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d73c6bfb-5dbf-4f62-8214-0b39d3558cbc-dns-swift-storage-0\") pod 
\"dnsmasq-dns-8b5c85b87-k84c5\" (UID: \"d73c6bfb-5dbf-4f62-8214-0b39d3558cbc\") " pod="openstack/dnsmasq-dns-8b5c85b87-k84c5" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.100010 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d73c6bfb-5dbf-4f62-8214-0b39d3558cbc-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-k84c5\" (UID: \"d73c6bfb-5dbf-4f62-8214-0b39d3558cbc\") " pod="openstack/dnsmasq-dns-8b5c85b87-k84c5" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.100072 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l52j\" (UniqueName: \"kubernetes.io/projected/d73c6bfb-5dbf-4f62-8214-0b39d3558cbc-kube-api-access-8l52j\") pod \"dnsmasq-dns-8b5c85b87-k84c5\" (UID: \"d73c6bfb-5dbf-4f62-8214-0b39d3558cbc\") " pod="openstack/dnsmasq-dns-8b5c85b87-k84c5" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.101358 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.102795 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d73c6bfb-5dbf-4f62-8214-0b39d3558cbc-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-k84c5\" (UID: \"d73c6bfb-5dbf-4f62-8214-0b39d3558cbc\") " pod="openstack/dnsmasq-dns-8b5c85b87-k84c5" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.102852 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d73c6bfb-5dbf-4f62-8214-0b39d3558cbc-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-k84c5\" (UID: \"d73c6bfb-5dbf-4f62-8214-0b39d3558cbc\") " pod="openstack/dnsmasq-dns-8b5c85b87-k84c5" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.102987 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/d73c6bfb-5dbf-4f62-8214-0b39d3558cbc-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-k84c5\" (UID: \"d73c6bfb-5dbf-4f62-8214-0b39d3558cbc\") " pod="openstack/dnsmasq-dns-8b5c85b87-k84c5" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.103387 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.105962 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d73c6bfb-5dbf-4f62-8214-0b39d3558cbc-config\") pod \"dnsmasq-dns-8b5c85b87-k84c5\" (UID: \"d73c6bfb-5dbf-4f62-8214-0b39d3558cbc\") " pod="openstack/dnsmasq-dns-8b5c85b87-k84c5" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.106747 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.106980 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d73c6bfb-5dbf-4f62-8214-0b39d3558cbc-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-k84c5\" (UID: \"d73c6bfb-5dbf-4f62-8214-0b39d3558cbc\") " pod="openstack/dnsmasq-dns-8b5c85b87-k84c5" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.111608 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.111774 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.132783 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l52j\" (UniqueName: \"kubernetes.io/projected/d73c6bfb-5dbf-4f62-8214-0b39d3558cbc-kube-api-access-8l52j\") pod \"dnsmasq-dns-8b5c85b87-k84c5\" (UID: 
\"d73c6bfb-5dbf-4f62-8214-0b39d3558cbc\") " pod="openstack/dnsmasq-dns-8b5c85b87-k84c5" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.132887 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-dnlbg" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.173318 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.198854 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-854bf9d4d7-f9lw8" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.264349 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-k84c5" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.303566 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04b39283-3b82-427d-a3bc-a2d64232222b-dns-svc\") pod \"04b39283-3b82-427d-a3bc-a2d64232222b\" (UID: \"04b39283-3b82-427d-a3bc-a2d64232222b\") " Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.303744 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04b39283-3b82-427d-a3bc-a2d64232222b-dns-swift-storage-0\") pod \"04b39283-3b82-427d-a3bc-a2d64232222b\" (UID: \"04b39283-3b82-427d-a3bc-a2d64232222b\") " Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.303804 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04b39283-3b82-427d-a3bc-a2d64232222b-ovsdbserver-sb\") pod \"04b39283-3b82-427d-a3bc-a2d64232222b\" (UID: \"04b39283-3b82-427d-a3bc-a2d64232222b\") " Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.303854 4779 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-r2fpl\" (UniqueName: \"kubernetes.io/projected/04b39283-3b82-427d-a3bc-a2d64232222b-kube-api-access-r2fpl\") pod \"04b39283-3b82-427d-a3bc-a2d64232222b\" (UID: \"04b39283-3b82-427d-a3bc-a2d64232222b\") " Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.303963 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04b39283-3b82-427d-a3bc-a2d64232222b-ovsdbserver-nb\") pod \"04b39283-3b82-427d-a3bc-a2d64232222b\" (UID: \"04b39283-3b82-427d-a3bc-a2d64232222b\") " Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.304011 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04b39283-3b82-427d-a3bc-a2d64232222b-config\") pod \"04b39283-3b82-427d-a3bc-a2d64232222b\" (UID: \"04b39283-3b82-427d-a3bc-a2d64232222b\") " Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.304744 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-logs\") pod \"glance-default-internal-api-0\" (UID: \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.304805 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.304871 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.306860 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vplfz\" (UniqueName: \"kubernetes.io/projected/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-kube-api-access-vplfz\") pod \"glance-default-internal-api-0\" (UID: \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.307541 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.307678 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.307794 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.307821 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.329023 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04b39283-3b82-427d-a3bc-a2d64232222b-kube-api-access-r2fpl" (OuterVolumeSpecName: "kube-api-access-r2fpl") pod "04b39283-3b82-427d-a3bc-a2d64232222b" (UID: "04b39283-3b82-427d-a3bc-a2d64232222b"). InnerVolumeSpecName "kube-api-access-r2fpl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.382186 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04b39283-3b82-427d-a3bc-a2d64232222b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "04b39283-3b82-427d-a3bc-a2d64232222b" (UID: "04b39283-3b82-427d-a3bc-a2d64232222b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.382255 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04b39283-3b82-427d-a3bc-a2d64232222b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "04b39283-3b82-427d-a3bc-a2d64232222b" (UID: "04b39283-3b82-427d-a3bc-a2d64232222b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.413498 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.414355 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.414439 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vplfz\" (UniqueName: \"kubernetes.io/projected/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-kube-api-access-vplfz\") pod \"glance-default-internal-api-0\" (UID: \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.414529 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.414666 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\") " 
pod="openstack/glance-default-internal-api-0"
Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.414775 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.414841 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.415019 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-logs\") pod \"glance-default-internal-api-0\" (UID: \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.415195 4779 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04b39283-3b82-427d-a3bc-a2d64232222b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.415261 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2fpl\" (UniqueName: \"kubernetes.io/projected/04b39283-3b82-427d-a3bc-a2d64232222b-kube-api-access-r2fpl\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.415319 4779 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04b39283-3b82-427d-a3bc-a2d64232222b-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.416205 4779 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0"
Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.418206 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-logs\") pod \"glance-default-internal-api-0\" (UID: \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.414084 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.422022 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.427679 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.436791 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.445746 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.462466 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vplfz\" (UniqueName: \"kubernetes.io/projected/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-kube-api-access-vplfz\") pod \"glance-default-internal-api-0\" (UID: \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.475158 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04b39283-3b82-427d-a3bc-a2d64232222b-config" (OuterVolumeSpecName: "config") pod "04b39283-3b82-427d-a3bc-a2d64232222b" (UID: "04b39283-3b82-427d-a3bc-a2d64232222b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.523203 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.523866 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04b39283-3b82-427d-a3bc-a2d64232222b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "04b39283-3b82-427d-a3bc-a2d64232222b" (UID: "04b39283-3b82-427d-a3bc-a2d64232222b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:44:46 crc kubenswrapper[4779]: W0320 15:44:46.528850 4779 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/04b39283-3b82-427d-a3bc-a2d64232222b/volumes/kubernetes.io~configmap/ovsdbserver-nb
Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.528880 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04b39283-3b82-427d-a3bc-a2d64232222b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "04b39283-3b82-427d-a3bc-a2d64232222b" (UID: "04b39283-3b82-427d-a3bc-a2d64232222b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.528677 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04b39283-3b82-427d-a3bc-a2d64232222b-ovsdbserver-nb\") pod \"04b39283-3b82-427d-a3bc-a2d64232222b\" (UID: \"04b39283-3b82-427d-a3bc-a2d64232222b\") "
Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.541469 4779 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04b39283-3b82-427d-a3bc-a2d64232222b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.541504 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04b39283-3b82-427d-a3bc-a2d64232222b-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.546296 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jh4bv"]
Mar 20 15:44:46 crc kubenswrapper[4779]: W0320 15:44:46.612373 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda04e9147_27c2_436d_8225_14693475e475.slice/crio-90cef75bddb9defece4af0ab6e79550b4cbf096189e3d74bc41e78ba7730d8dc WatchSource:0}: Error finding container 90cef75bddb9defece4af0ab6e79550b4cbf096189e3d74bc41e78ba7730d8dc: Status 404 returned error can't find the container with id 90cef75bddb9defece4af0ab6e79550b4cbf096189e3d74bc41e78ba7730d8dc
Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.613007 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04b39283-3b82-427d-a3bc-a2d64232222b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "04b39283-3b82-427d-a3bc-a2d64232222b" (UID: "04b39283-3b82-427d-a3bc-a2d64232222b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.627631 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-mfj9r"]
Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.642668 4779 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04b39283-3b82-427d-a3bc-a2d64232222b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.648631 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-wjcm6"]
Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.762918 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.791777 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-689d9485bf-d7qg8"]
Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.968853 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 15:44:46 crc kubenswrapper[4779]: I0320 15:44:46.999794 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-mfj9r" event={"ID":"9a20155b-6ff9-4250-8b0c-38b41c21c67c","Type":"ContainerStarted","Data":"f8539e521bd80e9533d303d30e49861f81ddb2b803ae27fd1bfca12d1fa85de0"}
Mar 20 15:44:47 crc kubenswrapper[4779]: I0320 15:44:47.006782 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jh4bv" event={"ID":"a04e9147-27c2-436d-8225-14693475e475","Type":"ContainerStarted","Data":"a85465ebf729ed3710c438a3eba2e7951d3eff1d01d3855caf6846980f2c8f14"}
Mar 20 15:44:47 crc kubenswrapper[4779]: I0320 15:44:47.006829 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jh4bv" event={"ID":"a04e9147-27c2-436d-8225-14693475e475","Type":"ContainerStarted","Data":"90cef75bddb9defece4af0ab6e79550b4cbf096189e3d74bc41e78ba7730d8dc"}
Mar 20 15:44:47 crc kubenswrapper[4779]: I0320 15:44:47.012365 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wjcm6" event={"ID":"130a37da-c17e-48dd-8712-f87c67f01852","Type":"ContainerStarted","Data":"cc54b62ea5e44d7f19be551b756bd3f21fa59e99f887e3b037197c9ad779fff9"}
Mar 20 15:44:47 crc kubenswrapper[4779]: I0320 15:44:47.013967 4779 generic.go:334] "Generic (PLEG): container finished" podID="301642e3-d84f-41af-8795-8042fdccdade" containerID="9a06996bcb2c68cf44a7c9d46e3952e119f39c9c7a4b5fb6b9c26ac932c0ad5e" exitCode=0
Mar 20 15:44:47 crc kubenswrapper[4779]: I0320 15:44:47.014016 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"301642e3-d84f-41af-8795-8042fdccdade","Type":"ContainerDied","Data":"9a06996bcb2c68cf44a7c9d46e3952e119f39c9c7a4b5fb6b9c26ac932c0ad5e"}
Mar 20 15:44:47 crc kubenswrapper[4779]: I0320 15:44:47.031813 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2caab026-6b85-4015-9190-29d8ef94a58f","Type":"ContainerStarted","Data":"d652bcb93b77a03aaf8ec73ff8d1dcfaa11075be78c5411f7a73ed46a3c5ec36"}
Mar 20 15:44:47 crc kubenswrapper[4779]: I0320 15:44:47.035888 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-jh4bv" podStartSLOduration=2.035859177 podStartE2EDuration="2.035859177s" podCreationTimestamp="2026-03-20 15:44:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:44:47.028866478 +0000 UTC m=+1303.991382278" watchObservedRunningTime="2026-03-20 15:44:47.035859177 +0000 UTC m=+1303.998374987"
Mar 20 15:44:47 crc kubenswrapper[4779]: I0320 15:44:47.038468 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-dnlbg" event={"ID":"04b39283-3b82-427d-a3bc-a2d64232222b","Type":"ContainerDied","Data":"b0b72ce3c4b82277f97fb3b39608b483d7bd2623ecb8d99a8cd9edf58e4ef9a6"}
Mar 20 15:44:47 crc kubenswrapper[4779]: I0320 15:44:47.038510 4779 scope.go:117] "RemoveContainer" containerID="f831bc5b22ad79cdfde533fc0cc510acf3b74f6dc399c6ff3d08a15b96d254a5"
Mar 20 15:44:47 crc kubenswrapper[4779]: I0320 15:44:47.038642 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-dnlbg"
Mar 20 15:44:47 crc kubenswrapper[4779]: I0320 15:44:47.055832 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-689d9485bf-d7qg8" event={"ID":"0b61924e-c7b3-4017-9e77-9a170fdcda23","Type":"ContainerStarted","Data":"0c1049f2b5ad642cb8de19a123af93af7614b7ddb66db680342c78ef26931fc5"}
Mar 20 15:44:47 crc kubenswrapper[4779]: I0320 15:44:47.109280 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-dnlbg"]
Mar 20 15:44:47 crc kubenswrapper[4779]: I0320 15:44:47.116647 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-dnlbg"]
Mar 20 15:44:47 crc kubenswrapper[4779]: I0320 15:44:47.124354 4779 scope.go:117] "RemoveContainer" containerID="85fff7c1d337408330f6cea731a303662239a71b6a9c4b99277f2c220bd553aa"
Mar 20 15:44:47 crc kubenswrapper[4779]: I0320 15:44:47.160274 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-gwmgm"]
Mar 20 15:44:47 crc kubenswrapper[4779]: I0320 15:44:47.471357 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-xjstn"]
Mar 20 15:44:47 crc kubenswrapper[4779]: I0320 15:44:47.481271 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-k84c5"]
Mar 20 15:44:47 crc kubenswrapper[4779]: I0320 15:44:47.498437 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-854bf9d4d7-f9lw8"]
Mar 20 15:44:47 crc kubenswrapper[4779]: I0320 15:44:47.521427 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-mx4bn"]
Mar 20 15:44:47 crc kubenswrapper[4779]: I0320 15:44:47.732774 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 15:44:47 crc kubenswrapper[4779]: I0320 15:44:47.900148 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04b39283-3b82-427d-a3bc-a2d64232222b" path="/var/lib/kubelet/pods/04b39283-3b82-427d-a3bc-a2d64232222b/volumes"
Mar 20 15:44:47 crc kubenswrapper[4779]: I0320 15:44:47.901033 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.069333 4779 generic.go:334] "Generic (PLEG): container finished" podID="9a20155b-6ff9-4250-8b0c-38b41c21c67c" containerID="61293940300d0b90f8a67f9082389497dedb719f7c51241f5b40480b9f3be79d" exitCode=0
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.069387 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-mfj9r" event={"ID":"9a20155b-6ff9-4250-8b0c-38b41c21c67c","Type":"ContainerDied","Data":"61293940300d0b90f8a67f9082389497dedb719f7c51241f5b40480b9f3be79d"}
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.073786 4779 generic.go:334] "Generic (PLEG): container finished" podID="d73c6bfb-5dbf-4f62-8214-0b39d3558cbc" containerID="30ae351dc4aeb444ed855167bedf714620624ff370ae1485d118ea8aca01b668" exitCode=0
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.073837 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-k84c5" event={"ID":"d73c6bfb-5dbf-4f62-8214-0b39d3558cbc","Type":"ContainerDied","Data":"30ae351dc4aeb444ed855167bedf714620624ff370ae1485d118ea8aca01b668"}
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.073858 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-k84c5" event={"ID":"d73c6bfb-5dbf-4f62-8214-0b39d3558cbc","Type":"ContainerStarted","Data":"8be4690c0c877c1f0990e88ddae184c36e65d429e152b5c1d354c0dd4d2a1812"}
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.077236 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-gwmgm" event={"ID":"d35f1456-b579-484e-bbb4-75ced186fdda","Type":"ContainerStarted","Data":"fcd9f636bf9ef211a785824ec24e9742f165fa40de9f5c35b3b32d8950658468"}
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.077273 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-gwmgm" event={"ID":"d35f1456-b579-484e-bbb4-75ced186fdda","Type":"ContainerStarted","Data":"3b211920ff0d6885e44b95c090b4422e89d3fc3050d95c133a5810dbd4fd0bd0"}
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.078027 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xjstn" event={"ID":"b67fdca5-13d1-4e83-8834-03856ca956b5","Type":"ContainerStarted","Data":"d3c9384254e248e60dfe47156d02895d622f7314c32a7c5a76225cca1d8419c6"}
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.079571 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"301642e3-d84f-41af-8795-8042fdccdade","Type":"ContainerStarted","Data":"cb10afbfc396d8304bf390816f713e5cc6e42813fced6523d1a817a206dfa120"}
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.080603 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mx4bn" event={"ID":"85c5c01b-85a0-4714-81fe-31192c87b2fa","Type":"ContainerStarted","Data":"4c2d2afadba931988575afbb166a9557a6715ebdbc7c9fc6327aef3cdaac6a3b"}
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.089626 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d1f5c56d-a2c4-429c-8eae-5d6b3644b297","Type":"ContainerStarted","Data":"38bb05cb7a17ad0763c921755d8ad3221d4a0c900e6faa2e6fba104e7c4adb35"}
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.094633 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-854bf9d4d7-f9lw8" event={"ID":"74512812-a6c7-40b6-bf73-155ed352ed3d","Type":"ContainerStarted","Data":"a45c1d984f55dc164a5900c8a8cb34b303153b2ffb8ec4c0869cae3cae577ecc"}
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.097995 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112","Type":"ContainerStarted","Data":"c52acf87901d8eb85ec2cb096b528ac1911f2efc9fa965954acf73ef0f92a86a"}
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.123616 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-gwmgm" podStartSLOduration=3.123598604 podStartE2EDuration="3.123598604s" podCreationTimestamp="2026-03-20 15:44:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:44:48.122494639 +0000 UTC m=+1305.085010439" watchObservedRunningTime="2026-03-20 15:44:48.123598604 +0000 UTC m=+1305.086114404"
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.298013 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.334008 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-689d9485bf-d7qg8"]
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.393447 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-55df45f4c7-62whh"]
Mar 20 15:44:48 crc kubenswrapper[4779]: E0320 15:44:48.393880 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04b39283-3b82-427d-a3bc-a2d64232222b" containerName="dnsmasq-dns"
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.393894 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="04b39283-3b82-427d-a3bc-a2d64232222b" containerName="dnsmasq-dns"
Mar 20 15:44:48 crc kubenswrapper[4779]: E0320 15:44:48.393912 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04b39283-3b82-427d-a3bc-a2d64232222b" containerName="init"
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.393920 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="04b39283-3b82-427d-a3bc-a2d64232222b" containerName="init"
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.394178 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="04b39283-3b82-427d-a3bc-a2d64232222b" containerName="dnsmasq-dns"
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.395409 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55df45f4c7-62whh"
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.439121 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.481302 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-55df45f4c7-62whh"]
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.501805 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcc5ad98-f8b2-4245-b08a-bb1dea245cd7-scripts\") pod \"horizon-55df45f4c7-62whh\" (UID: \"dcc5ad98-f8b2-4245-b08a-bb1dea245cd7\") " pod="openstack/horizon-55df45f4c7-62whh"
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.502867 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dcc5ad98-f8b2-4245-b08a-bb1dea245cd7-horizon-secret-key\") pod \"horizon-55df45f4c7-62whh\" (UID: \"dcc5ad98-f8b2-4245-b08a-bb1dea245cd7\") " pod="openstack/horizon-55df45f4c7-62whh"
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.502953 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcc5ad98-f8b2-4245-b08a-bb1dea245cd7-logs\") pod \"horizon-55df45f4c7-62whh\" (UID: \"dcc5ad98-f8b2-4245-b08a-bb1dea245cd7\") " pod="openstack/horizon-55df45f4c7-62whh"
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.503007 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlmgl\" (UniqueName: \"kubernetes.io/projected/dcc5ad98-f8b2-4245-b08a-bb1dea245cd7-kube-api-access-hlmgl\") pod \"horizon-55df45f4c7-62whh\" (UID: \"dcc5ad98-f8b2-4245-b08a-bb1dea245cd7\") " pod="openstack/horizon-55df45f4c7-62whh"
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.503178 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dcc5ad98-f8b2-4245-b08a-bb1dea245cd7-config-data\") pod \"horizon-55df45f4c7-62whh\" (UID: \"dcc5ad98-f8b2-4245-b08a-bb1dea245cd7\") " pod="openstack/horizon-55df45f4c7-62whh"
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.593824 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.609221 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcc5ad98-f8b2-4245-b08a-bb1dea245cd7-scripts\") pod \"horizon-55df45f4c7-62whh\" (UID: \"dcc5ad98-f8b2-4245-b08a-bb1dea245cd7\") " pod="openstack/horizon-55df45f4c7-62whh"
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.609296 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dcc5ad98-f8b2-4245-b08a-bb1dea245cd7-horizon-secret-key\") pod \"horizon-55df45f4c7-62whh\" (UID: \"dcc5ad98-f8b2-4245-b08a-bb1dea245cd7\") " pod="openstack/horizon-55df45f4c7-62whh"
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.609356 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcc5ad98-f8b2-4245-b08a-bb1dea245cd7-logs\") pod \"horizon-55df45f4c7-62whh\" (UID: \"dcc5ad98-f8b2-4245-b08a-bb1dea245cd7\") " pod="openstack/horizon-55df45f4c7-62whh"
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.609399 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlmgl\" (UniqueName: \"kubernetes.io/projected/dcc5ad98-f8b2-4245-b08a-bb1dea245cd7-kube-api-access-hlmgl\") pod \"horizon-55df45f4c7-62whh\" (UID: \"dcc5ad98-f8b2-4245-b08a-bb1dea245cd7\") " pod="openstack/horizon-55df45f4c7-62whh"
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.609480 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dcc5ad98-f8b2-4245-b08a-bb1dea245cd7-config-data\") pod \"horizon-55df45f4c7-62whh\" (UID: \"dcc5ad98-f8b2-4245-b08a-bb1dea245cd7\") " pod="openstack/horizon-55df45f4c7-62whh"
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.610809 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dcc5ad98-f8b2-4245-b08a-bb1dea245cd7-config-data\") pod \"horizon-55df45f4c7-62whh\" (UID: \"dcc5ad98-f8b2-4245-b08a-bb1dea245cd7\") " pod="openstack/horizon-55df45f4c7-62whh"
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.611387 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcc5ad98-f8b2-4245-b08a-bb1dea245cd7-scripts\") pod \"horizon-55df45f4c7-62whh\" (UID: \"dcc5ad98-f8b2-4245-b08a-bb1dea245cd7\") " pod="openstack/horizon-55df45f4c7-62whh"
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.626923 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcc5ad98-f8b2-4245-b08a-bb1dea245cd7-logs\") pod \"horizon-55df45f4c7-62whh\" (UID: \"dcc5ad98-f8b2-4245-b08a-bb1dea245cd7\") " pod="openstack/horizon-55df45f4c7-62whh"
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.643813 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlmgl\" (UniqueName: \"kubernetes.io/projected/dcc5ad98-f8b2-4245-b08a-bb1dea245cd7-kube-api-access-hlmgl\") pod \"horizon-55df45f4c7-62whh\" (UID: \"dcc5ad98-f8b2-4245-b08a-bb1dea245cd7\") " pod="openstack/horizon-55df45f4c7-62whh"
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.718853 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dcc5ad98-f8b2-4245-b08a-bb1dea245cd7-horizon-secret-key\") pod \"horizon-55df45f4c7-62whh\" (UID: \"dcc5ad98-f8b2-4245-b08a-bb1dea245cd7\") " pod="openstack/horizon-55df45f4c7-62whh"
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.744766 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55df45f4c7-62whh"
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.825748 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-mfj9r"
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.922964 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfvs5\" (UniqueName: \"kubernetes.io/projected/9a20155b-6ff9-4250-8b0c-38b41c21c67c-kube-api-access-hfvs5\") pod \"9a20155b-6ff9-4250-8b0c-38b41c21c67c\" (UID: \"9a20155b-6ff9-4250-8b0c-38b41c21c67c\") "
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.923674 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a20155b-6ff9-4250-8b0c-38b41c21c67c-dns-swift-storage-0\") pod \"9a20155b-6ff9-4250-8b0c-38b41c21c67c\" (UID: \"9a20155b-6ff9-4250-8b0c-38b41c21c67c\") "
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.923749 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a20155b-6ff9-4250-8b0c-38b41c21c67c-ovsdbserver-sb\") pod \"9a20155b-6ff9-4250-8b0c-38b41c21c67c\" (UID: \"9a20155b-6ff9-4250-8b0c-38b41c21c67c\") "
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.923829 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a20155b-6ff9-4250-8b0c-38b41c21c67c-ovsdbserver-nb\") pod \"9a20155b-6ff9-4250-8b0c-38b41c21c67c\" (UID: \"9a20155b-6ff9-4250-8b0c-38b41c21c67c\") "
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.923879 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a20155b-6ff9-4250-8b0c-38b41c21c67c-dns-svc\") pod \"9a20155b-6ff9-4250-8b0c-38b41c21c67c\" (UID: \"9a20155b-6ff9-4250-8b0c-38b41c21c67c\") "
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.924007 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a20155b-6ff9-4250-8b0c-38b41c21c67c-config\") pod \"9a20155b-6ff9-4250-8b0c-38b41c21c67c\" (UID: \"9a20155b-6ff9-4250-8b0c-38b41c21c67c\") "
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.954326 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a20155b-6ff9-4250-8b0c-38b41c21c67c-kube-api-access-hfvs5" (OuterVolumeSpecName: "kube-api-access-hfvs5") pod "9a20155b-6ff9-4250-8b0c-38b41c21c67c" (UID: "9a20155b-6ff9-4250-8b0c-38b41c21c67c"). InnerVolumeSpecName "kube-api-access-hfvs5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.954790 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a20155b-6ff9-4250-8b0c-38b41c21c67c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9a20155b-6ff9-4250-8b0c-38b41c21c67c" (UID: "9a20155b-6ff9-4250-8b0c-38b41c21c67c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.961212 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a20155b-6ff9-4250-8b0c-38b41c21c67c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9a20155b-6ff9-4250-8b0c-38b41c21c67c" (UID: "9a20155b-6ff9-4250-8b0c-38b41c21c67c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.964169 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a20155b-6ff9-4250-8b0c-38b41c21c67c-config" (OuterVolumeSpecName: "config") pod "9a20155b-6ff9-4250-8b0c-38b41c21c67c" (UID: "9a20155b-6ff9-4250-8b0c-38b41c21c67c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.973369 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a20155b-6ff9-4250-8b0c-38b41c21c67c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9a20155b-6ff9-4250-8b0c-38b41c21c67c" (UID: "9a20155b-6ff9-4250-8b0c-38b41c21c67c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:44:48 crc kubenswrapper[4779]: I0320 15:44:48.974394 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a20155b-6ff9-4250-8b0c-38b41c21c67c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9a20155b-6ff9-4250-8b0c-38b41c21c67c" (UID: "9a20155b-6ff9-4250-8b0c-38b41c21c67c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:44:49 crc kubenswrapper[4779]: I0320 15:44:49.026577 4779 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a20155b-6ff9-4250-8b0c-38b41c21c67c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:49 crc kubenswrapper[4779]: I0320 15:44:49.026853 4779 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a20155b-6ff9-4250-8b0c-38b41c21c67c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:49 crc kubenswrapper[4779]: I0320 15:44:49.026865 4779 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a20155b-6ff9-4250-8b0c-38b41c21c67c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:49 crc kubenswrapper[4779]: I0320 15:44:49.026877 4779 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a20155b-6ff9-4250-8b0c-38b41c21c67c-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:49 crc kubenswrapper[4779]: I0320 15:44:49.026889 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a20155b-6ff9-4250-8b0c-38b41c21c67c-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:49 crc kubenswrapper[4779]: I0320 15:44:49.026900 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfvs5\" (UniqueName: \"kubernetes.io/projected/9a20155b-6ff9-4250-8b0c-38b41c21c67c-kube-api-access-hfvs5\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:49 crc kubenswrapper[4779]: I0320 15:44:49.122929 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-mfj9r" event={"ID":"9a20155b-6ff9-4250-8b0c-38b41c21c67c","Type":"ContainerDied","Data":"f8539e521bd80e9533d303d30e49861f81ddb2b803ae27fd1bfca12d1fa85de0"}
Mar 20 15:44:49 crc kubenswrapper[4779]: I0320 15:44:49.123006 4779 scope.go:117] "RemoveContainer" containerID="61293940300d0b90f8a67f9082389497dedb719f7c51241f5b40480b9f3be79d"
Mar 20 15:44:49 crc kubenswrapper[4779]: I0320 15:44:49.123342 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-mfj9r"
Mar 20 15:44:49 crc kubenswrapper[4779]: I0320 15:44:49.131362 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-k84c5" event={"ID":"d73c6bfb-5dbf-4f62-8214-0b39d3558cbc","Type":"ContainerStarted","Data":"f3c0c28a55ee01257be1b799d96d4911fcf3ff1e695f3804bfbaa4980b47cb2d"}
Mar 20 15:44:49 crc kubenswrapper[4779]: I0320 15:44:49.131622 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8b5c85b87-k84c5"
Mar 20 15:44:49 crc kubenswrapper[4779]: I0320 15:44:49.163536 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8b5c85b87-k84c5" podStartSLOduration=4.1635131340000004 podStartE2EDuration="4.163513134s" podCreationTimestamp="2026-03-20 15:44:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:44:49.152442782 +0000 UTC m=+1306.114958592" watchObservedRunningTime="2026-03-20 15:44:49.163513134 +0000 UTC m=+1306.126028934"
Mar 20 15:44:49 crc kubenswrapper[4779]: I0320 15:44:49.223309 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-mfj9r"]
Mar 20 15:44:49 crc kubenswrapper[4779]: I0320 15:44:49.255846 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-mfj9r"]
Mar 20 15:44:49 crc kubenswrapper[4779]: I0320 15:44:49.841634 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a20155b-6ff9-4250-8b0c-38b41c21c67c" path="/var/lib/kubelet/pods/9a20155b-6ff9-4250-8b0c-38b41c21c67c/volumes"
Mar 20 15:44:49 crc kubenswrapper[4779]: I0320 15:44:49.884154 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-55df45f4c7-62whh"]
Mar 20 15:44:49 crc kubenswrapper[4779]: W0320 15:44:49.938845 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddcc5ad98_f8b2_4245_b08a_bb1dea245cd7.slice/crio-40fa83dac3e14dbced116dd441aba2ba644976e4f13008dbbb9b0cba73e7621d WatchSource:0}: Error finding container 40fa83dac3e14dbced116dd441aba2ba644976e4f13008dbbb9b0cba73e7621d: Status 404 returned error can't find the container with id 40fa83dac3e14dbced116dd441aba2ba644976e4f13008dbbb9b0cba73e7621d
Mar 20 15:44:50 crc kubenswrapper[4779]: I0320 15:44:50.158489 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55df45f4c7-62whh" event={"ID":"dcc5ad98-f8b2-4245-b08a-bb1dea245cd7","Type":"ContainerStarted","Data":"40fa83dac3e14dbced116dd441aba2ba644976e4f13008dbbb9b0cba73e7621d"}
Mar 20 15:44:50 crc kubenswrapper[4779]: I0320 15:44:50.160354 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112","Type":"ContainerStarted","Data":"15084ae1d98532e7e24a8ca3aac9303eb2ef9c32228ce0a1ca12f044aded409b"}
Mar 20 15:44:50 crc kubenswrapper[4779]: I0320 15:44:50.167672 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d1f5c56d-a2c4-429c-8eae-5d6b3644b297","Type":"ContainerStarted","Data":"f046358f56f5ac4dca48bac3b2b26d48340dd27051acf36351a325d809924079"}
Mar 20 15:44:51 crc kubenswrapper[4779]: I0320 15:44:51.188181 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"301642e3-d84f-41af-8795-8042fdccdade","Type":"ContainerStarted","Data":"cefc812fc8f689b671605fb91effc3fd40540a63141d17df6d8706f08b964470"}
Mar 20 15:44:51 crc kubenswrapper[4779]: I0320 15:44:51.188711 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"301642e3-d84f-41af-8795-8042fdccdade","Type":"ContainerStarted","Data":"9f62ba9deb35721191dd502b1a28133dd9c53cfe4776866d448c42761ba6f063"}
Mar 20 15:44:51 crc kubenswrapper[4779]: I0320 15:44:51.202199 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d1f5c56d-a2c4-429c-8eae-5d6b3644b297" containerName="glance-httpd" containerID="cri-o://dba179e5b6e18034f2e9d233316cade5d56aab7ea692758aa82200a3f333bbec" gracePeriod=30
Mar 20 15:44:51 crc kubenswrapper[4779]: I0320 15:44:51.202187 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d1f5c56d-a2c4-429c-8eae-5d6b3644b297" containerName="glance-log" containerID="cri-o://f046358f56f5ac4dca48bac3b2b26d48340dd27051acf36351a325d809924079" gracePeriod=30
Mar 20 15:44:51 crc kubenswrapper[4779]: I0320 15:44:51.202137 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d1f5c56d-a2c4-429c-8eae-5d6b3644b297","Type":"ContainerStarted","Data":"dba179e5b6e18034f2e9d233316cade5d56aab7ea692758aa82200a3f333bbec"}
Mar 20 15:44:51 crc kubenswrapper[4779]: I0320 15:44:51.210269 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112","Type":"ContainerStarted","Data":"93b0f62abc315372b7cce0a2ed5b861cd7f1d724608d382b6a8cc9c75633c9f7"}
Mar 20 15:44:51 crc kubenswrapper[4779]: I0320 15:44:51.210544 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cbcbfd71-bd10-4cb7-8d1d-50d3419a4112" containerName="glance-log" containerID="cri-o://15084ae1d98532e7e24a8ca3aac9303eb2ef9c32228ce0a1ca12f044aded409b" gracePeriod=30
Mar 20 15:44:51 crc kubenswrapper[4779]: I0320 15:44:51.214358 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cbcbfd71-bd10-4cb7-8d1d-50d3419a4112" containerName="glance-httpd"
containerID="cri-o://93b0f62abc315372b7cce0a2ed5b861cd7f1d724608d382b6a8cc9c75633c9f7" gracePeriod=30 Mar 20 15:44:51 crc kubenswrapper[4779]: I0320 15:44:51.233314 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=28.233291375 podStartE2EDuration="28.233291375s" podCreationTimestamp="2026-03-20 15:44:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:44:51.217497826 +0000 UTC m=+1308.180013626" watchObservedRunningTime="2026-03-20 15:44:51.233291375 +0000 UTC m=+1308.195807175" Mar 20 15:44:51 crc kubenswrapper[4779]: I0320 15:44:51.286547 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.286527484 podStartE2EDuration="6.286527484s" podCreationTimestamp="2026-03-20 15:44:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:44:51.25420674 +0000 UTC m=+1308.216722550" watchObservedRunningTime="2026-03-20 15:44:51.286527484 +0000 UTC m=+1308.249043284" Mar 20 15:44:51 crc kubenswrapper[4779]: I0320 15:44:51.293475 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.293450052 podStartE2EDuration="6.293450052s" podCreationTimestamp="2026-03-20 15:44:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:44:51.277862307 +0000 UTC m=+1308.240378107" watchObservedRunningTime="2026-03-20 15:44:51.293450052 +0000 UTC m=+1308.255965852" Mar 20 15:44:52 crc kubenswrapper[4779]: I0320 15:44:52.226293 4779 generic.go:334] "Generic (PLEG): container finished" podID="d1f5c56d-a2c4-429c-8eae-5d6b3644b297" 
containerID="dba179e5b6e18034f2e9d233316cade5d56aab7ea692758aa82200a3f333bbec" exitCode=0 Mar 20 15:44:52 crc kubenswrapper[4779]: I0320 15:44:52.226399 4779 generic.go:334] "Generic (PLEG): container finished" podID="d1f5c56d-a2c4-429c-8eae-5d6b3644b297" containerID="f046358f56f5ac4dca48bac3b2b26d48340dd27051acf36351a325d809924079" exitCode=143 Mar 20 15:44:52 crc kubenswrapper[4779]: I0320 15:44:52.226355 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d1f5c56d-a2c4-429c-8eae-5d6b3644b297","Type":"ContainerDied","Data":"dba179e5b6e18034f2e9d233316cade5d56aab7ea692758aa82200a3f333bbec"} Mar 20 15:44:52 crc kubenswrapper[4779]: I0320 15:44:52.226743 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d1f5c56d-a2c4-429c-8eae-5d6b3644b297","Type":"ContainerDied","Data":"f046358f56f5ac4dca48bac3b2b26d48340dd27051acf36351a325d809924079"} Mar 20 15:44:52 crc kubenswrapper[4779]: I0320 15:44:52.236680 4779 generic.go:334] "Generic (PLEG): container finished" podID="cbcbfd71-bd10-4cb7-8d1d-50d3419a4112" containerID="93b0f62abc315372b7cce0a2ed5b861cd7f1d724608d382b6a8cc9c75633c9f7" exitCode=143 Mar 20 15:44:52 crc kubenswrapper[4779]: I0320 15:44:52.236718 4779 generic.go:334] "Generic (PLEG): container finished" podID="cbcbfd71-bd10-4cb7-8d1d-50d3419a4112" containerID="15084ae1d98532e7e24a8ca3aac9303eb2ef9c32228ce0a1ca12f044aded409b" exitCode=143 Mar 20 15:44:52 crc kubenswrapper[4779]: I0320 15:44:52.236771 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112","Type":"ContainerDied","Data":"93b0f62abc315372b7cce0a2ed5b861cd7f1d724608d382b6a8cc9c75633c9f7"} Mar 20 15:44:52 crc kubenswrapper[4779]: I0320 15:44:52.236852 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112","Type":"ContainerDied","Data":"15084ae1d98532e7e24a8ca3aac9303eb2ef9c32228ce0a1ca12f044aded409b"} Mar 20 15:44:52 crc kubenswrapper[4779]: I0320 15:44:52.240571 4779 generic.go:334] "Generic (PLEG): container finished" podID="a04e9147-27c2-436d-8225-14693475e475" containerID="a85465ebf729ed3710c438a3eba2e7951d3eff1d01d3855caf6846980f2c8f14" exitCode=0 Mar 20 15:44:52 crc kubenswrapper[4779]: I0320 15:44:52.240834 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jh4bv" event={"ID":"a04e9147-27c2-436d-8225-14693475e475","Type":"ContainerDied","Data":"a85465ebf729ed3710c438a3eba2e7951d3eff1d01d3855caf6846980f2c8f14"} Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.197957 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.198457 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.209776 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.270305 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.571162 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-854bf9d4d7-f9lw8"] Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.639856 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7f86b78896-7tfsm"] Mar 20 15:44:54 crc kubenswrapper[4779]: E0320 15:44:54.640348 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a20155b-6ff9-4250-8b0c-38b41c21c67c" containerName="init" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 
15:44:54.640365 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a20155b-6ff9-4250-8b0c-38b41c21c67c" containerName="init" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.640527 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a20155b-6ff9-4250-8b0c-38b41c21c67c" containerName="init" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.641486 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7f86b78896-7tfsm" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.643965 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.654480 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7f86b78896-7tfsm"] Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.700559 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-55df45f4c7-62whh"] Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.728286 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-dc78779bd-shkrw"] Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.730332 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-dc78779bd-shkrw" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.772595 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-dc78779bd-shkrw"] Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.778175 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6de60e1-9be1-4bb6-9d26-0b496117ed20-horizon-tls-certs\") pod \"horizon-7f86b78896-7tfsm\" (UID: \"f6de60e1-9be1-4bb6-9d26-0b496117ed20\") " pod="openstack/horizon-7f86b78896-7tfsm" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.778239 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6de60e1-9be1-4bb6-9d26-0b496117ed20-config-data\") pod \"horizon-7f86b78896-7tfsm\" (UID: \"f6de60e1-9be1-4bb6-9d26-0b496117ed20\") " pod="openstack/horizon-7f86b78896-7tfsm" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.778265 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f6de60e1-9be1-4bb6-9d26-0b496117ed20-horizon-secret-key\") pod \"horizon-7f86b78896-7tfsm\" (UID: \"f6de60e1-9be1-4bb6-9d26-0b496117ed20\") " pod="openstack/horizon-7f86b78896-7tfsm" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.778305 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6de60e1-9be1-4bb6-9d26-0b496117ed20-combined-ca-bundle\") pod \"horizon-7f86b78896-7tfsm\" (UID: \"f6de60e1-9be1-4bb6-9d26-0b496117ed20\") " pod="openstack/horizon-7f86b78896-7tfsm" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.778335 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6de60e1-9be1-4bb6-9d26-0b496117ed20-logs\") pod \"horizon-7f86b78896-7tfsm\" (UID: \"f6de60e1-9be1-4bb6-9d26-0b496117ed20\") " pod="openstack/horizon-7f86b78896-7tfsm" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.778414 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt5mm\" (UniqueName: \"kubernetes.io/projected/f6de60e1-9be1-4bb6-9d26-0b496117ed20-kube-api-access-bt5mm\") pod \"horizon-7f86b78896-7tfsm\" (UID: \"f6de60e1-9be1-4bb6-9d26-0b496117ed20\") " pod="openstack/horizon-7f86b78896-7tfsm" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.778439 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6de60e1-9be1-4bb6-9d26-0b496117ed20-scripts\") pod \"horizon-7f86b78896-7tfsm\" (UID: \"f6de60e1-9be1-4bb6-9d26-0b496117ed20\") " pod="openstack/horizon-7f86b78896-7tfsm" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.880229 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6de60e1-9be1-4bb6-9d26-0b496117ed20-horizon-tls-certs\") pod \"horizon-7f86b78896-7tfsm\" (UID: \"f6de60e1-9be1-4bb6-9d26-0b496117ed20\") " pod="openstack/horizon-7f86b78896-7tfsm" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.880299 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6de60e1-9be1-4bb6-9d26-0b496117ed20-config-data\") pod \"horizon-7f86b78896-7tfsm\" (UID: \"f6de60e1-9be1-4bb6-9d26-0b496117ed20\") " pod="openstack/horizon-7f86b78896-7tfsm" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.880328 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcfcz\" (UniqueName: 
\"kubernetes.io/projected/dddea4aa-aba6-49f1-a8bc-6ce9850da26d-kube-api-access-bcfcz\") pod \"horizon-dc78779bd-shkrw\" (UID: \"dddea4aa-aba6-49f1-a8bc-6ce9850da26d\") " pod="openstack/horizon-dc78779bd-shkrw" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.880354 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f6de60e1-9be1-4bb6-9d26-0b496117ed20-horizon-secret-key\") pod \"horizon-7f86b78896-7tfsm\" (UID: \"f6de60e1-9be1-4bb6-9d26-0b496117ed20\") " pod="openstack/horizon-7f86b78896-7tfsm" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.880393 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dddea4aa-aba6-49f1-a8bc-6ce9850da26d-combined-ca-bundle\") pod \"horizon-dc78779bd-shkrw\" (UID: \"dddea4aa-aba6-49f1-a8bc-6ce9850da26d\") " pod="openstack/horizon-dc78779bd-shkrw" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.880423 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6de60e1-9be1-4bb6-9d26-0b496117ed20-combined-ca-bundle\") pod \"horizon-7f86b78896-7tfsm\" (UID: \"f6de60e1-9be1-4bb6-9d26-0b496117ed20\") " pod="openstack/horizon-7f86b78896-7tfsm" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.880548 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dddea4aa-aba6-49f1-a8bc-6ce9850da26d-horizon-secret-key\") pod \"horizon-dc78779bd-shkrw\" (UID: \"dddea4aa-aba6-49f1-a8bc-6ce9850da26d\") " pod="openstack/horizon-dc78779bd-shkrw" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.880595 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f6de60e1-9be1-4bb6-9d26-0b496117ed20-logs\") pod \"horizon-7f86b78896-7tfsm\" (UID: \"f6de60e1-9be1-4bb6-9d26-0b496117ed20\") " pod="openstack/horizon-7f86b78896-7tfsm" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.880643 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dddea4aa-aba6-49f1-a8bc-6ce9850da26d-logs\") pod \"horizon-dc78779bd-shkrw\" (UID: \"dddea4aa-aba6-49f1-a8bc-6ce9850da26d\") " pod="openstack/horizon-dc78779bd-shkrw" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.880874 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dddea4aa-aba6-49f1-a8bc-6ce9850da26d-config-data\") pod \"horizon-dc78779bd-shkrw\" (UID: \"dddea4aa-aba6-49f1-a8bc-6ce9850da26d\") " pod="openstack/horizon-dc78779bd-shkrw" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.880939 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt5mm\" (UniqueName: \"kubernetes.io/projected/f6de60e1-9be1-4bb6-9d26-0b496117ed20-kube-api-access-bt5mm\") pod \"horizon-7f86b78896-7tfsm\" (UID: \"f6de60e1-9be1-4bb6-9d26-0b496117ed20\") " pod="openstack/horizon-7f86b78896-7tfsm" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.880971 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/dddea4aa-aba6-49f1-a8bc-6ce9850da26d-horizon-tls-certs\") pod \"horizon-dc78779bd-shkrw\" (UID: \"dddea4aa-aba6-49f1-a8bc-6ce9850da26d\") " pod="openstack/horizon-dc78779bd-shkrw" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.881002 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/dddea4aa-aba6-49f1-a8bc-6ce9850da26d-scripts\") pod \"horizon-dc78779bd-shkrw\" (UID: \"dddea4aa-aba6-49f1-a8bc-6ce9850da26d\") " pod="openstack/horizon-dc78779bd-shkrw" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.881021 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6de60e1-9be1-4bb6-9d26-0b496117ed20-scripts\") pod \"horizon-7f86b78896-7tfsm\" (UID: \"f6de60e1-9be1-4bb6-9d26-0b496117ed20\") " pod="openstack/horizon-7f86b78896-7tfsm" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.881095 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6de60e1-9be1-4bb6-9d26-0b496117ed20-logs\") pod \"horizon-7f86b78896-7tfsm\" (UID: \"f6de60e1-9be1-4bb6-9d26-0b496117ed20\") " pod="openstack/horizon-7f86b78896-7tfsm" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.881797 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6de60e1-9be1-4bb6-9d26-0b496117ed20-scripts\") pod \"horizon-7f86b78896-7tfsm\" (UID: \"f6de60e1-9be1-4bb6-9d26-0b496117ed20\") " pod="openstack/horizon-7f86b78896-7tfsm" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.881989 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6de60e1-9be1-4bb6-9d26-0b496117ed20-config-data\") pod \"horizon-7f86b78896-7tfsm\" (UID: \"f6de60e1-9be1-4bb6-9d26-0b496117ed20\") " pod="openstack/horizon-7f86b78896-7tfsm" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.888099 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6de60e1-9be1-4bb6-9d26-0b496117ed20-combined-ca-bundle\") pod \"horizon-7f86b78896-7tfsm\" (UID: \"f6de60e1-9be1-4bb6-9d26-0b496117ed20\") " 
pod="openstack/horizon-7f86b78896-7tfsm" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.888488 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f6de60e1-9be1-4bb6-9d26-0b496117ed20-horizon-secret-key\") pod \"horizon-7f86b78896-7tfsm\" (UID: \"f6de60e1-9be1-4bb6-9d26-0b496117ed20\") " pod="openstack/horizon-7f86b78896-7tfsm" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.899915 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt5mm\" (UniqueName: \"kubernetes.io/projected/f6de60e1-9be1-4bb6-9d26-0b496117ed20-kube-api-access-bt5mm\") pod \"horizon-7f86b78896-7tfsm\" (UID: \"f6de60e1-9be1-4bb6-9d26-0b496117ed20\") " pod="openstack/horizon-7f86b78896-7tfsm" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.911979 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6de60e1-9be1-4bb6-9d26-0b496117ed20-horizon-tls-certs\") pod \"horizon-7f86b78896-7tfsm\" (UID: \"f6de60e1-9be1-4bb6-9d26-0b496117ed20\") " pod="openstack/horizon-7f86b78896-7tfsm" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.982325 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dddea4aa-aba6-49f1-a8bc-6ce9850da26d-config-data\") pod \"horizon-dc78779bd-shkrw\" (UID: \"dddea4aa-aba6-49f1-a8bc-6ce9850da26d\") " pod="openstack/horizon-dc78779bd-shkrw" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.982667 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/dddea4aa-aba6-49f1-a8bc-6ce9850da26d-horizon-tls-certs\") pod \"horizon-dc78779bd-shkrw\" (UID: \"dddea4aa-aba6-49f1-a8bc-6ce9850da26d\") " pod="openstack/horizon-dc78779bd-shkrw" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.982696 
4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dddea4aa-aba6-49f1-a8bc-6ce9850da26d-scripts\") pod \"horizon-dc78779bd-shkrw\" (UID: \"dddea4aa-aba6-49f1-a8bc-6ce9850da26d\") " pod="openstack/horizon-dc78779bd-shkrw" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.982756 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcfcz\" (UniqueName: \"kubernetes.io/projected/dddea4aa-aba6-49f1-a8bc-6ce9850da26d-kube-api-access-bcfcz\") pod \"horizon-dc78779bd-shkrw\" (UID: \"dddea4aa-aba6-49f1-a8bc-6ce9850da26d\") " pod="openstack/horizon-dc78779bd-shkrw" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.982790 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dddea4aa-aba6-49f1-a8bc-6ce9850da26d-combined-ca-bundle\") pod \"horizon-dc78779bd-shkrw\" (UID: \"dddea4aa-aba6-49f1-a8bc-6ce9850da26d\") " pod="openstack/horizon-dc78779bd-shkrw" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.982828 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dddea4aa-aba6-49f1-a8bc-6ce9850da26d-horizon-secret-key\") pod \"horizon-dc78779bd-shkrw\" (UID: \"dddea4aa-aba6-49f1-a8bc-6ce9850da26d\") " pod="openstack/horizon-dc78779bd-shkrw" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.982851 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dddea4aa-aba6-49f1-a8bc-6ce9850da26d-logs\") pod \"horizon-dc78779bd-shkrw\" (UID: \"dddea4aa-aba6-49f1-a8bc-6ce9850da26d\") " pod="openstack/horizon-dc78779bd-shkrw" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.983987 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/dddea4aa-aba6-49f1-a8bc-6ce9850da26d-scripts\") pod \"horizon-dc78779bd-shkrw\" (UID: \"dddea4aa-aba6-49f1-a8bc-6ce9850da26d\") " pod="openstack/horizon-dc78779bd-shkrw" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.984436 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dddea4aa-aba6-49f1-a8bc-6ce9850da26d-config-data\") pod \"horizon-dc78779bd-shkrw\" (UID: \"dddea4aa-aba6-49f1-a8bc-6ce9850da26d\") " pod="openstack/horizon-dc78779bd-shkrw" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.985007 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dddea4aa-aba6-49f1-a8bc-6ce9850da26d-logs\") pod \"horizon-dc78779bd-shkrw\" (UID: \"dddea4aa-aba6-49f1-a8bc-6ce9850da26d\") " pod="openstack/horizon-dc78779bd-shkrw" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.989005 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dddea4aa-aba6-49f1-a8bc-6ce9850da26d-combined-ca-bundle\") pod \"horizon-dc78779bd-shkrw\" (UID: \"dddea4aa-aba6-49f1-a8bc-6ce9850da26d\") " pod="openstack/horizon-dc78779bd-shkrw" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.989355 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7f86b78896-7tfsm" Mar 20 15:44:54 crc kubenswrapper[4779]: I0320 15:44:54.990661 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dddea4aa-aba6-49f1-a8bc-6ce9850da26d-horizon-secret-key\") pod \"horizon-dc78779bd-shkrw\" (UID: \"dddea4aa-aba6-49f1-a8bc-6ce9850da26d\") " pod="openstack/horizon-dc78779bd-shkrw" Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.000796 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/dddea4aa-aba6-49f1-a8bc-6ce9850da26d-horizon-tls-certs\") pod \"horizon-dc78779bd-shkrw\" (UID: \"dddea4aa-aba6-49f1-a8bc-6ce9850da26d\") " pod="openstack/horizon-dc78779bd-shkrw" Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.002733 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcfcz\" (UniqueName: \"kubernetes.io/projected/dddea4aa-aba6-49f1-a8bc-6ce9850da26d-kube-api-access-bcfcz\") pod \"horizon-dc78779bd-shkrw\" (UID: \"dddea4aa-aba6-49f1-a8bc-6ce9850da26d\") " pod="openstack/horizon-dc78779bd-shkrw" Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.087095 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-dc78779bd-shkrw" Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.692197 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.702825 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-jh4bv" Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.811883 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\" (UID: \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\") " Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.811932 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a04e9147-27c2-436d-8225-14693475e475-combined-ca-bundle\") pod \"a04e9147-27c2-436d-8225-14693475e475\" (UID: \"a04e9147-27c2-436d-8225-14693475e475\") " Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.811959 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-httpd-run\") pod \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\" (UID: \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\") " Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.811978 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-logs\") pod \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\" (UID: \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\") " Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.811999 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a04e9147-27c2-436d-8225-14693475e475-scripts\") pod \"a04e9147-27c2-436d-8225-14693475e475\" (UID: \"a04e9147-27c2-436d-8225-14693475e475\") " Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.812064 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-scripts\") 
pod \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\" (UID: \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\") " Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.812129 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-combined-ca-bundle\") pod \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\" (UID: \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\") " Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.812199 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-config-data\") pod \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\" (UID: \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\") " Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.812220 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44q8p\" (UniqueName: \"kubernetes.io/projected/a04e9147-27c2-436d-8225-14693475e475-kube-api-access-44q8p\") pod \"a04e9147-27c2-436d-8225-14693475e475\" (UID: \"a04e9147-27c2-436d-8225-14693475e475\") " Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.812248 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a04e9147-27c2-436d-8225-14693475e475-credential-keys\") pod \"a04e9147-27c2-436d-8225-14693475e475\" (UID: \"a04e9147-27c2-436d-8225-14693475e475\") " Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.812279 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a04e9147-27c2-436d-8225-14693475e475-fernet-keys\") pod \"a04e9147-27c2-436d-8225-14693475e475\" (UID: \"a04e9147-27c2-436d-8225-14693475e475\") " Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.812317 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-public-tls-certs\") pod \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\" (UID: \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\") " Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.812333 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pph8g\" (UniqueName: \"kubernetes.io/projected/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-kube-api-access-pph8g\") pod \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\" (UID: \"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112\") " Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.812403 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a04e9147-27c2-436d-8225-14693475e475-config-data\") pod \"a04e9147-27c2-436d-8225-14693475e475\" (UID: \"a04e9147-27c2-436d-8225-14693475e475\") " Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.812788 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cbcbfd71-bd10-4cb7-8d1d-50d3419a4112" (UID: "cbcbfd71-bd10-4cb7-8d1d-50d3419a4112"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.817430 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a04e9147-27c2-436d-8225-14693475e475-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a04e9147-27c2-436d-8225-14693475e475" (UID: "a04e9147-27c2-436d-8225-14693475e475"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.818333 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-logs" (OuterVolumeSpecName: "logs") pod "cbcbfd71-bd10-4cb7-8d1d-50d3419a4112" (UID: "cbcbfd71-bd10-4cb7-8d1d-50d3419a4112"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.819478 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "cbcbfd71-bd10-4cb7-8d1d-50d3419a4112" (UID: "cbcbfd71-bd10-4cb7-8d1d-50d3419a4112"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.821140 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a04e9147-27c2-436d-8225-14693475e475-kube-api-access-44q8p" (OuterVolumeSpecName: "kube-api-access-44q8p") pod "a04e9147-27c2-436d-8225-14693475e475" (UID: "a04e9147-27c2-436d-8225-14693475e475"). InnerVolumeSpecName "kube-api-access-44q8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.821611 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a04e9147-27c2-436d-8225-14693475e475-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a04e9147-27c2-436d-8225-14693475e475" (UID: "a04e9147-27c2-436d-8225-14693475e475"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.823318 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a04e9147-27c2-436d-8225-14693475e475-scripts" (OuterVolumeSpecName: "scripts") pod "a04e9147-27c2-436d-8225-14693475e475" (UID: "a04e9147-27c2-436d-8225-14693475e475"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.823321 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-kube-api-access-pph8g" (OuterVolumeSpecName: "kube-api-access-pph8g") pod "cbcbfd71-bd10-4cb7-8d1d-50d3419a4112" (UID: "cbcbfd71-bd10-4cb7-8d1d-50d3419a4112"). InnerVolumeSpecName "kube-api-access-pph8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.825273 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-scripts" (OuterVolumeSpecName: "scripts") pod "cbcbfd71-bd10-4cb7-8d1d-50d3419a4112" (UID: "cbcbfd71-bd10-4cb7-8d1d-50d3419a4112"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.850213 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a04e9147-27c2-436d-8225-14693475e475-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a04e9147-27c2-436d-8225-14693475e475" (UID: "a04e9147-27c2-436d-8225-14693475e475"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.866779 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a04e9147-27c2-436d-8225-14693475e475-config-data" (OuterVolumeSpecName: "config-data") pod "a04e9147-27c2-436d-8225-14693475e475" (UID: "a04e9147-27c2-436d-8225-14693475e475"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.870713 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cbcbfd71-bd10-4cb7-8d1d-50d3419a4112" (UID: "cbcbfd71-bd10-4cb7-8d1d-50d3419a4112"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.884963 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-config-data" (OuterVolumeSpecName: "config-data") pod "cbcbfd71-bd10-4cb7-8d1d-50d3419a4112" (UID: "cbcbfd71-bd10-4cb7-8d1d-50d3419a4112"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.886901 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cbcbfd71-bd10-4cb7-8d1d-50d3419a4112" (UID: "cbcbfd71-bd10-4cb7-8d1d-50d3419a4112"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.916000 4779 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-logs\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.916029 4779 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a04e9147-27c2-436d-8225-14693475e475-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.916832 4779 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.916842 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.916853 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.916862 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44q8p\" (UniqueName: \"kubernetes.io/projected/a04e9147-27c2-436d-8225-14693475e475-kube-api-access-44q8p\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.916870 4779 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a04e9147-27c2-436d-8225-14693475e475-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.916878 4779 reconciler_common.go:293] "Volume detached for 
volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a04e9147-27c2-436d-8225-14693475e475-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.916886 4779 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.916894 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pph8g\" (UniqueName: \"kubernetes.io/projected/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-kube-api-access-pph8g\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.916902 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a04e9147-27c2-436d-8225-14693475e475-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.916924 4779 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.916932 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a04e9147-27c2-436d-8225-14693475e475-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.916941 4779 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:55 crc kubenswrapper[4779]: I0320 15:44:55.939482 4779 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 20 15:44:56 crc 
kubenswrapper[4779]: I0320 15:44:56.019016 4779 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.266256 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8b5c85b87-k84c5" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.347275 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-cxhjs"] Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.348549 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77585f5f8c-cxhjs" podUID="efac30ce-be74-429c-a7dc-25c46cfbc88e" containerName="dnsmasq-dns" containerID="cri-o://a86073d489d2cc92b4899bdc62d5e012f582ac9a3c5bd9bec6e9f396d3100381" gracePeriod=10 Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.365860 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-jh4bv" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.365863 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jh4bv" event={"ID":"a04e9147-27c2-436d-8225-14693475e475","Type":"ContainerDied","Data":"90cef75bddb9defece4af0ab6e79550b4cbf096189e3d74bc41e78ba7730d8dc"} Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.366005 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90cef75bddb9defece4af0ab6e79550b4cbf096189e3d74bc41e78ba7730d8dc" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.370418 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cbcbfd71-bd10-4cb7-8d1d-50d3419a4112","Type":"ContainerDied","Data":"c52acf87901d8eb85ec2cb096b528ac1911f2efc9fa965954acf73ef0f92a86a"} Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.370474 4779 scope.go:117] "RemoveContainer" containerID="93b0f62abc315372b7cce0a2ed5b861cd7f1d724608d382b6a8cc9c75633c9f7" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.370585 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.472299 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.494858 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.515184 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 15:44:56 crc kubenswrapper[4779]: E0320 15:44:56.515625 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbcbfd71-bd10-4cb7-8d1d-50d3419a4112" containerName="glance-httpd" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.515644 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbcbfd71-bd10-4cb7-8d1d-50d3419a4112" containerName="glance-httpd" Mar 20 15:44:56 crc kubenswrapper[4779]: E0320 15:44:56.515685 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbcbfd71-bd10-4cb7-8d1d-50d3419a4112" containerName="glance-log" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.515694 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbcbfd71-bd10-4cb7-8d1d-50d3419a4112" containerName="glance-log" Mar 20 15:44:56 crc kubenswrapper[4779]: E0320 15:44:56.515706 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a04e9147-27c2-436d-8225-14693475e475" containerName="keystone-bootstrap" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.515715 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="a04e9147-27c2-436d-8225-14693475e475" containerName="keystone-bootstrap" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.515909 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbcbfd71-bd10-4cb7-8d1d-50d3419a4112" containerName="glance-httpd" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 
15:44:56.515924 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="a04e9147-27c2-436d-8225-14693475e475" containerName="keystone-bootstrap" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.515936 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbcbfd71-bd10-4cb7-8d1d-50d3419a4112" containerName="glance-log" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.516897 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.519583 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.519816 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.526696 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.630542 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/148e12da-09c4-454f-ac34-76059964f559-scripts\") pod \"glance-default-external-api-0\" (UID: \"148e12da-09c4-454f-ac34-76059964f559\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.630623 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/148e12da-09c4-454f-ac34-76059964f559-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"148e12da-09c4-454f-ac34-76059964f559\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.630655 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-9t264\" (UniqueName: \"kubernetes.io/projected/148e12da-09c4-454f-ac34-76059964f559-kube-api-access-9t264\") pod \"glance-default-external-api-0\" (UID: \"148e12da-09c4-454f-ac34-76059964f559\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.630749 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/148e12da-09c4-454f-ac34-76059964f559-logs\") pod \"glance-default-external-api-0\" (UID: \"148e12da-09c4-454f-ac34-76059964f559\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.630828 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/148e12da-09c4-454f-ac34-76059964f559-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"148e12da-09c4-454f-ac34-76059964f559\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.630854 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/148e12da-09c4-454f-ac34-76059964f559-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"148e12da-09c4-454f-ac34-76059964f559\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.630947 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"148e12da-09c4-454f-ac34-76059964f559\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.630985 4779 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/148e12da-09c4-454f-ac34-76059964f559-config-data\") pod \"glance-default-external-api-0\" (UID: \"148e12da-09c4-454f-ac34-76059964f559\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.732804 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/148e12da-09c4-454f-ac34-76059964f559-scripts\") pod \"glance-default-external-api-0\" (UID: \"148e12da-09c4-454f-ac34-76059964f559\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.732870 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/148e12da-09c4-454f-ac34-76059964f559-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"148e12da-09c4-454f-ac34-76059964f559\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.732888 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t264\" (UniqueName: \"kubernetes.io/projected/148e12da-09c4-454f-ac34-76059964f559-kube-api-access-9t264\") pod \"glance-default-external-api-0\" (UID: \"148e12da-09c4-454f-ac34-76059964f559\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.732929 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/148e12da-09c4-454f-ac34-76059964f559-logs\") pod \"glance-default-external-api-0\" (UID: \"148e12da-09c4-454f-ac34-76059964f559\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.732983 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/148e12da-09c4-454f-ac34-76059964f559-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"148e12da-09c4-454f-ac34-76059964f559\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.733002 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/148e12da-09c4-454f-ac34-76059964f559-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"148e12da-09c4-454f-ac34-76059964f559\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.733023 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"148e12da-09c4-454f-ac34-76059964f559\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.733045 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/148e12da-09c4-454f-ac34-76059964f559-config-data\") pod \"glance-default-external-api-0\" (UID: \"148e12da-09c4-454f-ac34-76059964f559\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.733393 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/148e12da-09c4-454f-ac34-76059964f559-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"148e12da-09c4-454f-ac34-76059964f559\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.735183 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/148e12da-09c4-454f-ac34-76059964f559-logs\") pod 
\"glance-default-external-api-0\" (UID: \"148e12da-09c4-454f-ac34-76059964f559\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.735734 4779 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"148e12da-09c4-454f-ac34-76059964f559\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.746453 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/148e12da-09c4-454f-ac34-76059964f559-config-data\") pod \"glance-default-external-api-0\" (UID: \"148e12da-09c4-454f-ac34-76059964f559\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.747526 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/148e12da-09c4-454f-ac34-76059964f559-scripts\") pod \"glance-default-external-api-0\" (UID: \"148e12da-09c4-454f-ac34-76059964f559\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.761797 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/148e12da-09c4-454f-ac34-76059964f559-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"148e12da-09c4-454f-ac34-76059964f559\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.763963 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/148e12da-09c4-454f-ac34-76059964f559-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"148e12da-09c4-454f-ac34-76059964f559\") 
" pod="openstack/glance-default-external-api-0" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.776245 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t264\" (UniqueName: \"kubernetes.io/projected/148e12da-09c4-454f-ac34-76059964f559-kube-api-access-9t264\") pod \"glance-default-external-api-0\" (UID: \"148e12da-09c4-454f-ac34-76059964f559\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.796502 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"148e12da-09c4-454f-ac34-76059964f559\") " pod="openstack/glance-default-external-api-0" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.841179 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.887333 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-jh4bv"] Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.896552 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-jh4bv"] Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.981313 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-rh9gx"] Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.982605 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-rh9gx" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.984351 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.986700 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.986877 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.987006 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.987140 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lrbdn" Mar 20 15:44:56 crc kubenswrapper[4779]: I0320 15:44:56.988915 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rh9gx"] Mar 20 15:44:57 crc kubenswrapper[4779]: I0320 15:44:57.044531 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e815c919-fbc1-4307-912e-13c74f996abb-config-data\") pod \"keystone-bootstrap-rh9gx\" (UID: \"e815c919-fbc1-4307-912e-13c74f996abb\") " pod="openstack/keystone-bootstrap-rh9gx" Mar 20 15:44:57 crc kubenswrapper[4779]: I0320 15:44:57.044578 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e815c919-fbc1-4307-912e-13c74f996abb-credential-keys\") pod \"keystone-bootstrap-rh9gx\" (UID: \"e815c919-fbc1-4307-912e-13c74f996abb\") " pod="openstack/keystone-bootstrap-rh9gx" Mar 20 15:44:57 crc kubenswrapper[4779]: I0320 15:44:57.044625 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-bt9pm\" (UniqueName: \"kubernetes.io/projected/e815c919-fbc1-4307-912e-13c74f996abb-kube-api-access-bt9pm\") pod \"keystone-bootstrap-rh9gx\" (UID: \"e815c919-fbc1-4307-912e-13c74f996abb\") " pod="openstack/keystone-bootstrap-rh9gx" Mar 20 15:44:57 crc kubenswrapper[4779]: I0320 15:44:57.044650 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e815c919-fbc1-4307-912e-13c74f996abb-scripts\") pod \"keystone-bootstrap-rh9gx\" (UID: \"e815c919-fbc1-4307-912e-13c74f996abb\") " pod="openstack/keystone-bootstrap-rh9gx" Mar 20 15:44:57 crc kubenswrapper[4779]: I0320 15:44:57.044694 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e815c919-fbc1-4307-912e-13c74f996abb-combined-ca-bundle\") pod \"keystone-bootstrap-rh9gx\" (UID: \"e815c919-fbc1-4307-912e-13c74f996abb\") " pod="openstack/keystone-bootstrap-rh9gx" Mar 20 15:44:57 crc kubenswrapper[4779]: I0320 15:44:57.044757 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e815c919-fbc1-4307-912e-13c74f996abb-fernet-keys\") pod \"keystone-bootstrap-rh9gx\" (UID: \"e815c919-fbc1-4307-912e-13c74f996abb\") " pod="openstack/keystone-bootstrap-rh9gx" Mar 20 15:44:57 crc kubenswrapper[4779]: I0320 15:44:57.146540 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e815c919-fbc1-4307-912e-13c74f996abb-fernet-keys\") pod \"keystone-bootstrap-rh9gx\" (UID: \"e815c919-fbc1-4307-912e-13c74f996abb\") " pod="openstack/keystone-bootstrap-rh9gx" Mar 20 15:44:57 crc kubenswrapper[4779]: I0320 15:44:57.146668 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e815c919-fbc1-4307-912e-13c74f996abb-config-data\") pod \"keystone-bootstrap-rh9gx\" (UID: \"e815c919-fbc1-4307-912e-13c74f996abb\") " pod="openstack/keystone-bootstrap-rh9gx" Mar 20 15:44:57 crc kubenswrapper[4779]: I0320 15:44:57.146693 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e815c919-fbc1-4307-912e-13c74f996abb-credential-keys\") pod \"keystone-bootstrap-rh9gx\" (UID: \"e815c919-fbc1-4307-912e-13c74f996abb\") " pod="openstack/keystone-bootstrap-rh9gx" Mar 20 15:44:57 crc kubenswrapper[4779]: I0320 15:44:57.146739 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt9pm\" (UniqueName: \"kubernetes.io/projected/e815c919-fbc1-4307-912e-13c74f996abb-kube-api-access-bt9pm\") pod \"keystone-bootstrap-rh9gx\" (UID: \"e815c919-fbc1-4307-912e-13c74f996abb\") " pod="openstack/keystone-bootstrap-rh9gx" Mar 20 15:44:57 crc kubenswrapper[4779]: I0320 15:44:57.146768 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e815c919-fbc1-4307-912e-13c74f996abb-scripts\") pod \"keystone-bootstrap-rh9gx\" (UID: \"e815c919-fbc1-4307-912e-13c74f996abb\") " pod="openstack/keystone-bootstrap-rh9gx" Mar 20 15:44:57 crc kubenswrapper[4779]: I0320 15:44:57.146810 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e815c919-fbc1-4307-912e-13c74f996abb-combined-ca-bundle\") pod \"keystone-bootstrap-rh9gx\" (UID: \"e815c919-fbc1-4307-912e-13c74f996abb\") " pod="openstack/keystone-bootstrap-rh9gx" Mar 20 15:44:57 crc kubenswrapper[4779]: I0320 15:44:57.150869 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e815c919-fbc1-4307-912e-13c74f996abb-combined-ca-bundle\") pod 
\"keystone-bootstrap-rh9gx\" (UID: \"e815c919-fbc1-4307-912e-13c74f996abb\") " pod="openstack/keystone-bootstrap-rh9gx" Mar 20 15:44:57 crc kubenswrapper[4779]: I0320 15:44:57.151699 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e815c919-fbc1-4307-912e-13c74f996abb-config-data\") pod \"keystone-bootstrap-rh9gx\" (UID: \"e815c919-fbc1-4307-912e-13c74f996abb\") " pod="openstack/keystone-bootstrap-rh9gx" Mar 20 15:44:57 crc kubenswrapper[4779]: I0320 15:44:57.152135 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e815c919-fbc1-4307-912e-13c74f996abb-fernet-keys\") pod \"keystone-bootstrap-rh9gx\" (UID: \"e815c919-fbc1-4307-912e-13c74f996abb\") " pod="openstack/keystone-bootstrap-rh9gx" Mar 20 15:44:57 crc kubenswrapper[4779]: I0320 15:44:57.160539 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e815c919-fbc1-4307-912e-13c74f996abb-scripts\") pod \"keystone-bootstrap-rh9gx\" (UID: \"e815c919-fbc1-4307-912e-13c74f996abb\") " pod="openstack/keystone-bootstrap-rh9gx" Mar 20 15:44:57 crc kubenswrapper[4779]: I0320 15:44:57.161832 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e815c919-fbc1-4307-912e-13c74f996abb-credential-keys\") pod \"keystone-bootstrap-rh9gx\" (UID: \"e815c919-fbc1-4307-912e-13c74f996abb\") " pod="openstack/keystone-bootstrap-rh9gx" Mar 20 15:44:57 crc kubenswrapper[4779]: I0320 15:44:57.165766 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt9pm\" (UniqueName: \"kubernetes.io/projected/e815c919-fbc1-4307-912e-13c74f996abb-kube-api-access-bt9pm\") pod \"keystone-bootstrap-rh9gx\" (UID: \"e815c919-fbc1-4307-912e-13c74f996abb\") " pod="openstack/keystone-bootstrap-rh9gx" Mar 20 15:44:57 crc 
kubenswrapper[4779]: I0320 15:44:57.342142 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rh9gx" Mar 20 15:44:57 crc kubenswrapper[4779]: I0320 15:44:57.384095 4779 generic.go:334] "Generic (PLEG): container finished" podID="efac30ce-be74-429c-a7dc-25c46cfbc88e" containerID="a86073d489d2cc92b4899bdc62d5e012f582ac9a3c5bd9bec6e9f396d3100381" exitCode=0 Mar 20 15:44:57 crc kubenswrapper[4779]: I0320 15:44:57.384161 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-cxhjs" event={"ID":"efac30ce-be74-429c-a7dc-25c46cfbc88e","Type":"ContainerDied","Data":"a86073d489d2cc92b4899bdc62d5e012f582ac9a3c5bd9bec6e9f396d3100381"} Mar 20 15:44:57 crc kubenswrapper[4779]: I0320 15:44:57.820267 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a04e9147-27c2-436d-8225-14693475e475" path="/var/lib/kubelet/pods/a04e9147-27c2-436d-8225-14693475e475/volumes" Mar 20 15:44:57 crc kubenswrapper[4779]: I0320 15:44:57.821351 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbcbfd71-bd10-4cb7-8d1d-50d3419a4112" path="/var/lib/kubelet/pods/cbcbfd71-bd10-4cb7-8d1d-50d3419a4112/volumes" Mar 20 15:44:58 crc kubenswrapper[4779]: I0320 15:44:58.201563 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-cxhjs" podUID="efac30ce-be74-429c-a7dc-25c46cfbc88e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.137:5353: connect: connection refused" Mar 20 15:45:00 crc kubenswrapper[4779]: I0320 15:45:00.134027 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567025-g6d8s"] Mar 20 15:45:00 crc kubenswrapper[4779]: I0320 15:45:00.135814 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-g6d8s" Mar 20 15:45:00 crc kubenswrapper[4779]: I0320 15:45:00.139444 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 15:45:00 crc kubenswrapper[4779]: I0320 15:45:00.139686 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 15:45:00 crc kubenswrapper[4779]: I0320 15:45:00.142750 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567025-g6d8s"] Mar 20 15:45:00 crc kubenswrapper[4779]: I0320 15:45:00.200732 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp866\" (UniqueName: \"kubernetes.io/projected/1a850ec0-7410-472b-b4da-19f990f2187f-kube-api-access-jp866\") pod \"collect-profiles-29567025-g6d8s\" (UID: \"1a850ec0-7410-472b-b4da-19f990f2187f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-g6d8s" Mar 20 15:45:00 crc kubenswrapper[4779]: I0320 15:45:00.200820 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a850ec0-7410-472b-b4da-19f990f2187f-config-volume\") pod \"collect-profiles-29567025-g6d8s\" (UID: \"1a850ec0-7410-472b-b4da-19f990f2187f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-g6d8s" Mar 20 15:45:00 crc kubenswrapper[4779]: I0320 15:45:00.200891 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a850ec0-7410-472b-b4da-19f990f2187f-secret-volume\") pod \"collect-profiles-29567025-g6d8s\" (UID: \"1a850ec0-7410-472b-b4da-19f990f2187f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-g6d8s" Mar 20 15:45:00 crc kubenswrapper[4779]: I0320 15:45:00.302203 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jp866\" (UniqueName: \"kubernetes.io/projected/1a850ec0-7410-472b-b4da-19f990f2187f-kube-api-access-jp866\") pod \"collect-profiles-29567025-g6d8s\" (UID: \"1a850ec0-7410-472b-b4da-19f990f2187f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-g6d8s" Mar 20 15:45:00 crc kubenswrapper[4779]: I0320 15:45:00.302293 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a850ec0-7410-472b-b4da-19f990f2187f-config-volume\") pod \"collect-profiles-29567025-g6d8s\" (UID: \"1a850ec0-7410-472b-b4da-19f990f2187f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-g6d8s" Mar 20 15:45:00 crc kubenswrapper[4779]: I0320 15:45:00.302633 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a850ec0-7410-472b-b4da-19f990f2187f-secret-volume\") pod \"collect-profiles-29567025-g6d8s\" (UID: \"1a850ec0-7410-472b-b4da-19f990f2187f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-g6d8s" Mar 20 15:45:00 crc kubenswrapper[4779]: I0320 15:45:00.303490 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a850ec0-7410-472b-b4da-19f990f2187f-config-volume\") pod \"collect-profiles-29567025-g6d8s\" (UID: \"1a850ec0-7410-472b-b4da-19f990f2187f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-g6d8s" Mar 20 15:45:00 crc kubenswrapper[4779]: I0320 15:45:00.309882 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/1a850ec0-7410-472b-b4da-19f990f2187f-secret-volume\") pod \"collect-profiles-29567025-g6d8s\" (UID: \"1a850ec0-7410-472b-b4da-19f990f2187f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-g6d8s" Mar 20 15:45:00 crc kubenswrapper[4779]: I0320 15:45:00.320804 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp866\" (UniqueName: \"kubernetes.io/projected/1a850ec0-7410-472b-b4da-19f990f2187f-kube-api-access-jp866\") pod \"collect-profiles-29567025-g6d8s\" (UID: \"1a850ec0-7410-472b-b4da-19f990f2187f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-g6d8s" Mar 20 15:45:00 crc kubenswrapper[4779]: I0320 15:45:00.464663 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-g6d8s" Mar 20 15:45:03 crc kubenswrapper[4779]: I0320 15:45:03.201050 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-cxhjs" podUID="efac30ce-be74-429c-a7dc-25c46cfbc88e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.137:5353: connect: connection refused" Mar 20 15:45:05 crc kubenswrapper[4779]: E0320 15:45:05.820455 4779 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 20 15:45:05 crc kubenswrapper[4779]: E0320 15:45:05.820902 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n98hcch64bh5cbh555hd5h66ch8fh66ch558h686h5dch89h5fh9bh59fh5d8h74hf9h685h5fch85h575h64ch545h686h55ch664h549h5f7hcch55fq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6qwtp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-689d9485bf-d7qg8_openstack(0b61924e-c7b3-4017-9e77-9a170fdcda23): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 15:45:05 crc kubenswrapper[4779]: E0320 
15:45:05.852099 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-689d9485bf-d7qg8" podUID="0b61924e-c7b3-4017-9e77-9a170fdcda23" Mar 20 15:45:05 crc kubenswrapper[4779]: E0320 15:45:05.879376 4779 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 20 15:45:05 crc kubenswrapper[4779]: E0320 15:45:05.879556 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5dchfdhf9h686h57dhf4h58ch96h5f7hcfhb6h595h8dh694hb6h596h5f6h58fh77h98hc8hf6hf6h64bh5dfh68bh66fh5cfh687h5bh57ch665q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-846nm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-854bf9d4d7-f9lw8_openstack(74512812-a6c7-40b6-bf73-155ed352ed3d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 15:45:05 crc kubenswrapper[4779]: E0320 
15:45:05.887610 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-854bf9d4d7-f9lw8" podUID="74512812-a6c7-40b6-bf73-155ed352ed3d" Mar 20 15:45:05 crc kubenswrapper[4779]: E0320 15:45:05.900346 4779 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 20 15:45:05 crc kubenswrapper[4779]: E0320 15:45:05.900585 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5fh696hb8h5dch656h8bh66fh5cbh5b7h569h54bh5fch95h85h599h67dh8ch5f5h66ch5fdhcfh5d5hdfh55h56ch657h576h5b4h9fhcch99h94q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hlmgl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-55df45f4c7-62whh_openstack(dcc5ad98-f8b2-4245-b08a-bb1dea245cd7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 15:45:05 crc kubenswrapper[4779]: E0320 
15:45:05.903251 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-55df45f4c7-62whh" podUID="dcc5ad98-f8b2-4245-b08a-bb1dea245cd7" Mar 20 15:45:05 crc kubenswrapper[4779]: I0320 15:45:05.958080 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.008205 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-scripts\") pod \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\" (UID: \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\") " Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.008352 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-combined-ca-bundle\") pod \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\" (UID: \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\") " Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.008380 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-httpd-run\") pod \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\" (UID: \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\") " Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.008420 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vplfz\" (UniqueName: \"kubernetes.io/projected/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-kube-api-access-vplfz\") pod 
\"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\" (UID: \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\") " Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.008454 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\" (UID: \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\") " Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.008529 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-logs\") pod \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\" (UID: \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\") " Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.008543 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-config-data\") pod \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\" (UID: \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\") " Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.008576 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-internal-tls-certs\") pod \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\" (UID: \"d1f5c56d-a2c4-429c-8eae-5d6b3644b297\") " Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.009247 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-logs" (OuterVolumeSpecName: "logs") pod "d1f5c56d-a2c4-429c-8eae-5d6b3644b297" (UID: "d1f5c56d-a2c4-429c-8eae-5d6b3644b297"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.009327 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d1f5c56d-a2c4-429c-8eae-5d6b3644b297" (UID: "d1f5c56d-a2c4-429c-8eae-5d6b3644b297"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.022524 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-scripts" (OuterVolumeSpecName: "scripts") pod "d1f5c56d-a2c4-429c-8eae-5d6b3644b297" (UID: "d1f5c56d-a2c4-429c-8eae-5d6b3644b297"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.022899 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-kube-api-access-vplfz" (OuterVolumeSpecName: "kube-api-access-vplfz") pod "d1f5c56d-a2c4-429c-8eae-5d6b3644b297" (UID: "d1f5c56d-a2c4-429c-8eae-5d6b3644b297"). InnerVolumeSpecName "kube-api-access-vplfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.023080 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "d1f5c56d-a2c4-429c-8eae-5d6b3644b297" (UID: "d1f5c56d-a2c4-429c-8eae-5d6b3644b297"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.049016 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1f5c56d-a2c4-429c-8eae-5d6b3644b297" (UID: "d1f5c56d-a2c4-429c-8eae-5d6b3644b297"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.073121 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d1f5c56d-a2c4-429c-8eae-5d6b3644b297" (UID: "d1f5c56d-a2c4-429c-8eae-5d6b3644b297"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.087822 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-config-data" (OuterVolumeSpecName: "config-data") pod "d1f5c56d-a2c4-429c-8eae-5d6b3644b297" (UID: "d1f5c56d-a2c4-429c-8eae-5d6b3644b297"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.110666 4779 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-logs\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.110698 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.110708 4779 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.110720 4779 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.110730 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.110767 4779 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.110777 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vplfz\" (UniqueName: \"kubernetes.io/projected/d1f5c56d-a2c4-429c-8eae-5d6b3644b297-kube-api-access-vplfz\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.110808 4779 reconciler_common.go:286] 
"operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.127746 4779 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.212733 4779 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.459629 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.459685 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d1f5c56d-a2c4-429c-8eae-5d6b3644b297","Type":"ContainerDied","Data":"38bb05cb7a17ad0763c921755d8ad3221d4a0c900e6faa2e6fba104e7c4adb35"} Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.517634 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.524961 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.558698 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 15:45:06 crc kubenswrapper[4779]: E0320 15:45:06.559125 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1f5c56d-a2c4-429c-8eae-5d6b3644b297" containerName="glance-httpd" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.559140 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1f5c56d-a2c4-429c-8eae-5d6b3644b297" 
containerName="glance-httpd" Mar 20 15:45:06 crc kubenswrapper[4779]: E0320 15:45:06.559168 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1f5c56d-a2c4-429c-8eae-5d6b3644b297" containerName="glance-log" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.559176 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1f5c56d-a2c4-429c-8eae-5d6b3644b297" containerName="glance-log" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.559410 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1f5c56d-a2c4-429c-8eae-5d6b3644b297" containerName="glance-log" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.559437 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1f5c56d-a2c4-429c-8eae-5d6b3644b297" containerName="glance-httpd" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.560645 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.563641 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.564180 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.577675 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.620894 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsfcc\" (UniqueName: \"kubernetes.io/projected/205b1474-e402-4bd4-8ef7-4907c61f11bb-kube-api-access-xsfcc\") pod \"glance-default-internal-api-0\" (UID: \"205b1474-e402-4bd4-8ef7-4907c61f11bb\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 
15:45:06.620969 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/205b1474-e402-4bd4-8ef7-4907c61f11bb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"205b1474-e402-4bd4-8ef7-4907c61f11bb\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.621128 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/205b1474-e402-4bd4-8ef7-4907c61f11bb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"205b1474-e402-4bd4-8ef7-4907c61f11bb\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.621164 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/205b1474-e402-4bd4-8ef7-4907c61f11bb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"205b1474-e402-4bd4-8ef7-4907c61f11bb\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.621190 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/205b1474-e402-4bd4-8ef7-4907c61f11bb-logs\") pod \"glance-default-internal-api-0\" (UID: \"205b1474-e402-4bd4-8ef7-4907c61f11bb\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.621236 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"205b1474-e402-4bd4-8ef7-4907c61f11bb\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 
15:45:06.621273 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/205b1474-e402-4bd4-8ef7-4907c61f11bb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"205b1474-e402-4bd4-8ef7-4907c61f11bb\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.621325 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/205b1474-e402-4bd4-8ef7-4907c61f11bb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"205b1474-e402-4bd4-8ef7-4907c61f11bb\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.723474 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsfcc\" (UniqueName: \"kubernetes.io/projected/205b1474-e402-4bd4-8ef7-4907c61f11bb-kube-api-access-xsfcc\") pod \"glance-default-internal-api-0\" (UID: \"205b1474-e402-4bd4-8ef7-4907c61f11bb\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.724041 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/205b1474-e402-4bd4-8ef7-4907c61f11bb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"205b1474-e402-4bd4-8ef7-4907c61f11bb\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.724145 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/205b1474-e402-4bd4-8ef7-4907c61f11bb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"205b1474-e402-4bd4-8ef7-4907c61f11bb\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 
15:45:06.724179 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/205b1474-e402-4bd4-8ef7-4907c61f11bb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"205b1474-e402-4bd4-8ef7-4907c61f11bb\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.724208 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/205b1474-e402-4bd4-8ef7-4907c61f11bb-logs\") pod \"glance-default-internal-api-0\" (UID: \"205b1474-e402-4bd4-8ef7-4907c61f11bb\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.724235 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"205b1474-e402-4bd4-8ef7-4907c61f11bb\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.724308 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/205b1474-e402-4bd4-8ef7-4907c61f11bb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"205b1474-e402-4bd4-8ef7-4907c61f11bb\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.724410 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/205b1474-e402-4bd4-8ef7-4907c61f11bb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"205b1474-e402-4bd4-8ef7-4907c61f11bb\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.724979 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/205b1474-e402-4bd4-8ef7-4907c61f11bb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"205b1474-e402-4bd4-8ef7-4907c61f11bb\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.725669 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/205b1474-e402-4bd4-8ef7-4907c61f11bb-logs\") pod \"glance-default-internal-api-0\" (UID: \"205b1474-e402-4bd4-8ef7-4907c61f11bb\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.725880 4779 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"205b1474-e402-4bd4-8ef7-4907c61f11bb\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.729462 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/205b1474-e402-4bd4-8ef7-4907c61f11bb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"205b1474-e402-4bd4-8ef7-4907c61f11bb\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.729552 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/205b1474-e402-4bd4-8ef7-4907c61f11bb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"205b1474-e402-4bd4-8ef7-4907c61f11bb\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.730438 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/205b1474-e402-4bd4-8ef7-4907c61f11bb-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"205b1474-e402-4bd4-8ef7-4907c61f11bb\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.734256 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/205b1474-e402-4bd4-8ef7-4907c61f11bb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"205b1474-e402-4bd4-8ef7-4907c61f11bb\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.747270 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsfcc\" (UniqueName: \"kubernetes.io/projected/205b1474-e402-4bd4-8ef7-4907c61f11bb-kube-api-access-xsfcc\") pod \"glance-default-internal-api-0\" (UID: \"205b1474-e402-4bd4-8ef7-4907c61f11bb\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.759962 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"205b1474-e402-4bd4-8ef7-4907c61f11bb\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:45:06 crc kubenswrapper[4779]: I0320 15:45:06.882815 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 15:45:06 crc kubenswrapper[4779]: E0320 15:45:06.916729 4779 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 20 15:45:06 crc kubenswrapper[4779]: E0320 15:45:06.916895 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gbnsv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,Volu
meDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-xjstn_openstack(b67fdca5-13d1-4e83-8834-03856ca956b5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 15:45:06 crc kubenswrapper[4779]: E0320 15:45:06.918133 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-xjstn" podUID="b67fdca5-13d1-4e83-8834-03856ca956b5" Mar 20 15:45:07 crc kubenswrapper[4779]: E0320 15:45:07.471290 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-xjstn" podUID="b67fdca5-13d1-4e83-8834-03856ca956b5" Mar 20 15:45:07 crc kubenswrapper[4779]: I0320 15:45:07.838363 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1f5c56d-a2c4-429c-8eae-5d6b3644b297" path="/var/lib/kubelet/pods/d1f5c56d-a2c4-429c-8eae-5d6b3644b297/volumes" Mar 20 15:45:09 crc kubenswrapper[4779]: I0320 15:45:09.486499 4779 generic.go:334] "Generic (PLEG): container finished" podID="d35f1456-b579-484e-bbb4-75ced186fdda" containerID="fcd9f636bf9ef211a785824ec24e9742f165fa40de9f5c35b3b32d8950658468" exitCode=0 Mar 20 15:45:09 crc kubenswrapper[4779]: I0320 15:45:09.486549 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-gwmgm" event={"ID":"d35f1456-b579-484e-bbb4-75ced186fdda","Type":"ContainerDied","Data":"fcd9f636bf9ef211a785824ec24e9742f165fa40de9f5c35b3b32d8950658468"} Mar 20 15:45:13 crc kubenswrapper[4779]: I0320 15:45:13.201000 4779 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/dnsmasq-dns-77585f5f8c-cxhjs" podUID="efac30ce-be74-429c-a7dc-25c46cfbc88e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.137:5353: i/o timeout" Mar 20 15:45:13 crc kubenswrapper[4779]: I0320 15:45:13.201749 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-cxhjs" Mar 20 15:45:13 crc kubenswrapper[4779]: I0320 15:45:13.982369 4779 scope.go:117] "RemoveContainer" containerID="15084ae1d98532e7e24a8ca3aac9303eb2ef9c32228ce0a1ca12f044aded409b" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.142916 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-cxhjs" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.154713 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55df45f4c7-62whh" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.167936 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-689d9485bf-d7qg8" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.191479 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-854bf9d4d7-f9lw8" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.192657 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-gwmgm" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.276151 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dcc5ad98-f8b2-4245-b08a-bb1dea245cd7-config-data\") pod \"dcc5ad98-f8b2-4245-b08a-bb1dea245cd7\" (UID: \"dcc5ad98-f8b2-4245-b08a-bb1dea245cd7\") " Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.277447 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlmgl\" (UniqueName: \"kubernetes.io/projected/dcc5ad98-f8b2-4245-b08a-bb1dea245cd7-kube-api-access-hlmgl\") pod \"dcc5ad98-f8b2-4245-b08a-bb1dea245cd7\" (UID: \"dcc5ad98-f8b2-4245-b08a-bb1dea245cd7\") " Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.277503 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efac30ce-be74-429c-a7dc-25c46cfbc88e-config\") pod \"efac30ce-be74-429c-a7dc-25c46cfbc88e\" (UID: \"efac30ce-be74-429c-a7dc-25c46cfbc88e\") " Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.277568 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b61924e-c7b3-4017-9e77-9a170fdcda23-scripts\") pod \"0b61924e-c7b3-4017-9e77-9a170fdcda23\" (UID: \"0b61924e-c7b3-4017-9e77-9a170fdcda23\") " Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.277589 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcc5ad98-f8b2-4245-b08a-bb1dea245cd7-logs\") pod \"dcc5ad98-f8b2-4245-b08a-bb1dea245cd7\" (UID: \"dcc5ad98-f8b2-4245-b08a-bb1dea245cd7\") " Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.277647 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/0b61924e-c7b3-4017-9e77-9a170fdcda23-horizon-secret-key\") pod \"0b61924e-c7b3-4017-9e77-9a170fdcda23\" (UID: \"0b61924e-c7b3-4017-9e77-9a170fdcda23\") " Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.277692 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0b61924e-c7b3-4017-9e77-9a170fdcda23-config-data\") pod \"0b61924e-c7b3-4017-9e77-9a170fdcda23\" (UID: \"0b61924e-c7b3-4017-9e77-9a170fdcda23\") " Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.277729 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dcc5ad98-f8b2-4245-b08a-bb1dea245cd7-horizon-secret-key\") pod \"dcc5ad98-f8b2-4245-b08a-bb1dea245cd7\" (UID: \"dcc5ad98-f8b2-4245-b08a-bb1dea245cd7\") " Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.277756 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qwtp\" (UniqueName: \"kubernetes.io/projected/0b61924e-c7b3-4017-9e77-9a170fdcda23-kube-api-access-6qwtp\") pod \"0b61924e-c7b3-4017-9e77-9a170fdcda23\" (UID: \"0b61924e-c7b3-4017-9e77-9a170fdcda23\") " Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.277793 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/efac30ce-be74-429c-a7dc-25c46cfbc88e-dns-svc\") pod \"efac30ce-be74-429c-a7dc-25c46cfbc88e\" (UID: \"efac30ce-be74-429c-a7dc-25c46cfbc88e\") " Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.277922 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/efac30ce-be74-429c-a7dc-25c46cfbc88e-dns-swift-storage-0\") pod \"efac30ce-be74-429c-a7dc-25c46cfbc88e\" (UID: \"efac30ce-be74-429c-a7dc-25c46cfbc88e\") " Mar 20 15:45:14 crc kubenswrapper[4779]: 
I0320 15:45:14.277963 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/efac30ce-be74-429c-a7dc-25c46cfbc88e-ovsdbserver-sb\") pod \"efac30ce-be74-429c-a7dc-25c46cfbc88e\" (UID: \"efac30ce-be74-429c-a7dc-25c46cfbc88e\") " Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.278011 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vr7n\" (UniqueName: \"kubernetes.io/projected/efac30ce-be74-429c-a7dc-25c46cfbc88e-kube-api-access-4vr7n\") pod \"efac30ce-be74-429c-a7dc-25c46cfbc88e\" (UID: \"efac30ce-be74-429c-a7dc-25c46cfbc88e\") " Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.278031 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcc5ad98-f8b2-4245-b08a-bb1dea245cd7-scripts\") pod \"dcc5ad98-f8b2-4245-b08a-bb1dea245cd7\" (UID: \"dcc5ad98-f8b2-4245-b08a-bb1dea245cd7\") " Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.278072 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/efac30ce-be74-429c-a7dc-25c46cfbc88e-ovsdbserver-nb\") pod \"efac30ce-be74-429c-a7dc-25c46cfbc88e\" (UID: \"efac30ce-be74-429c-a7dc-25c46cfbc88e\") " Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.278095 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b61924e-c7b3-4017-9e77-9a170fdcda23-logs\") pod \"0b61924e-c7b3-4017-9e77-9a170fdcda23\" (UID: \"0b61924e-c7b3-4017-9e77-9a170fdcda23\") " Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.278205 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcc5ad98-f8b2-4245-b08a-bb1dea245cd7-logs" (OuterVolumeSpecName: "logs") pod "dcc5ad98-f8b2-4245-b08a-bb1dea245cd7" (UID: 
"dcc5ad98-f8b2-4245-b08a-bb1dea245cd7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.278219 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b61924e-c7b3-4017-9e77-9a170fdcda23-scripts" (OuterVolumeSpecName: "scripts") pod "0b61924e-c7b3-4017-9e77-9a170fdcda23" (UID: "0b61924e-c7b3-4017-9e77-9a170fdcda23"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.278573 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcc5ad98-f8b2-4245-b08a-bb1dea245cd7-scripts" (OuterVolumeSpecName: "scripts") pod "dcc5ad98-f8b2-4245-b08a-bb1dea245cd7" (UID: "dcc5ad98-f8b2-4245-b08a-bb1dea245cd7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.278723 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b61924e-c7b3-4017-9e77-9a170fdcda23-config-data" (OuterVolumeSpecName: "config-data") pod "0b61924e-c7b3-4017-9e77-9a170fdcda23" (UID: "0b61924e-c7b3-4017-9e77-9a170fdcda23"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.278833 4779 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b61924e-c7b3-4017-9e77-9a170fdcda23-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.278855 4779 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcc5ad98-f8b2-4245-b08a-bb1dea245cd7-logs\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.278866 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0b61924e-c7b3-4017-9e77-9a170fdcda23-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.278875 4779 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcc5ad98-f8b2-4245-b08a-bb1dea245cd7-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.279003 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b61924e-c7b3-4017-9e77-9a170fdcda23-logs" (OuterVolumeSpecName: "logs") pod "0b61924e-c7b3-4017-9e77-9a170fdcda23" (UID: "0b61924e-c7b3-4017-9e77-9a170fdcda23"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.279241 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcc5ad98-f8b2-4245-b08a-bb1dea245cd7-config-data" (OuterVolumeSpecName: "config-data") pod "dcc5ad98-f8b2-4245-b08a-bb1dea245cd7" (UID: "dcc5ad98-f8b2-4245-b08a-bb1dea245cd7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.282175 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcc5ad98-f8b2-4245-b08a-bb1dea245cd7-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "dcc5ad98-f8b2-4245-b08a-bb1dea245cd7" (UID: "dcc5ad98-f8b2-4245-b08a-bb1dea245cd7"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.282186 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b61924e-c7b3-4017-9e77-9a170fdcda23-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "0b61924e-c7b3-4017-9e77-9a170fdcda23" (UID: "0b61924e-c7b3-4017-9e77-9a170fdcda23"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.283603 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efac30ce-be74-429c-a7dc-25c46cfbc88e-kube-api-access-4vr7n" (OuterVolumeSpecName: "kube-api-access-4vr7n") pod "efac30ce-be74-429c-a7dc-25c46cfbc88e" (UID: "efac30ce-be74-429c-a7dc-25c46cfbc88e"). InnerVolumeSpecName "kube-api-access-4vr7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.283645 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b61924e-c7b3-4017-9e77-9a170fdcda23-kube-api-access-6qwtp" (OuterVolumeSpecName: "kube-api-access-6qwtp") pod "0b61924e-c7b3-4017-9e77-9a170fdcda23" (UID: "0b61924e-c7b3-4017-9e77-9a170fdcda23"). InnerVolumeSpecName "kube-api-access-6qwtp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.284348 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcc5ad98-f8b2-4245-b08a-bb1dea245cd7-kube-api-access-hlmgl" (OuterVolumeSpecName: "kube-api-access-hlmgl") pod "dcc5ad98-f8b2-4245-b08a-bb1dea245cd7" (UID: "dcc5ad98-f8b2-4245-b08a-bb1dea245cd7"). InnerVolumeSpecName "kube-api-access-hlmgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.324173 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efac30ce-be74-429c-a7dc-25c46cfbc88e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "efac30ce-be74-429c-a7dc-25c46cfbc88e" (UID: "efac30ce-be74-429c-a7dc-25c46cfbc88e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.327456 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efac30ce-be74-429c-a7dc-25c46cfbc88e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "efac30ce-be74-429c-a7dc-25c46cfbc88e" (UID: "efac30ce-be74-429c-a7dc-25c46cfbc88e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.328743 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efac30ce-be74-429c-a7dc-25c46cfbc88e-config" (OuterVolumeSpecName: "config") pod "efac30ce-be74-429c-a7dc-25c46cfbc88e" (UID: "efac30ce-be74-429c-a7dc-25c46cfbc88e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.332193 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efac30ce-be74-429c-a7dc-25c46cfbc88e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "efac30ce-be74-429c-a7dc-25c46cfbc88e" (UID: "efac30ce-be74-429c-a7dc-25c46cfbc88e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.334885 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efac30ce-be74-429c-a7dc-25c46cfbc88e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "efac30ce-be74-429c-a7dc-25c46cfbc88e" (UID: "efac30ce-be74-429c-a7dc-25c46cfbc88e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.379880 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d35f1456-b579-484e-bbb4-75ced186fdda-config\") pod \"d35f1456-b579-484e-bbb4-75ced186fdda\" (UID: \"d35f1456-b579-484e-bbb4-75ced186fdda\") " Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.379983 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vv57h\" (UniqueName: \"kubernetes.io/projected/d35f1456-b579-484e-bbb4-75ced186fdda-kube-api-access-vv57h\") pod \"d35f1456-b579-484e-bbb4-75ced186fdda\" (UID: \"d35f1456-b579-484e-bbb4-75ced186fdda\") " Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.380000 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-846nm\" (UniqueName: \"kubernetes.io/projected/74512812-a6c7-40b6-bf73-155ed352ed3d-kube-api-access-846nm\") pod \"74512812-a6c7-40b6-bf73-155ed352ed3d\" (UID: \"74512812-a6c7-40b6-bf73-155ed352ed3d\") " Mar 20 
15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.380056 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74512812-a6c7-40b6-bf73-155ed352ed3d-logs\") pod \"74512812-a6c7-40b6-bf73-155ed352ed3d\" (UID: \"74512812-a6c7-40b6-bf73-155ed352ed3d\") " Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.380133 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/74512812-a6c7-40b6-bf73-155ed352ed3d-horizon-secret-key\") pod \"74512812-a6c7-40b6-bf73-155ed352ed3d\" (UID: \"74512812-a6c7-40b6-bf73-155ed352ed3d\") " Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.380164 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d35f1456-b579-484e-bbb4-75ced186fdda-combined-ca-bundle\") pod \"d35f1456-b579-484e-bbb4-75ced186fdda\" (UID: \"d35f1456-b579-484e-bbb4-75ced186fdda\") " Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.380361 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74512812-a6c7-40b6-bf73-155ed352ed3d-logs" (OuterVolumeSpecName: "logs") pod "74512812-a6c7-40b6-bf73-155ed352ed3d" (UID: "74512812-a6c7-40b6-bf73-155ed352ed3d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.380803 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74512812-a6c7-40b6-bf73-155ed352ed3d-scripts\") pod \"74512812-a6c7-40b6-bf73-155ed352ed3d\" (UID: \"74512812-a6c7-40b6-bf73-155ed352ed3d\") " Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.380846 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74512812-a6c7-40b6-bf73-155ed352ed3d-config-data\") pod \"74512812-a6c7-40b6-bf73-155ed352ed3d\" (UID: \"74512812-a6c7-40b6-bf73-155ed352ed3d\") " Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.381252 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74512812-a6c7-40b6-bf73-155ed352ed3d-scripts" (OuterVolumeSpecName: "scripts") pod "74512812-a6c7-40b6-bf73-155ed352ed3d" (UID: "74512812-a6c7-40b6-bf73-155ed352ed3d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.381392 4779 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/efac30ce-be74-429c-a7dc-25c46cfbc88e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.381409 4779 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/efac30ce-be74-429c-a7dc-25c46cfbc88e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.381422 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vr7n\" (UniqueName: \"kubernetes.io/projected/efac30ce-be74-429c-a7dc-25c46cfbc88e-kube-api-access-4vr7n\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.381433 4779 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/efac30ce-be74-429c-a7dc-25c46cfbc88e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.381443 4779 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b61924e-c7b3-4017-9e77-9a170fdcda23-logs\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.381426 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74512812-a6c7-40b6-bf73-155ed352ed3d-config-data" (OuterVolumeSpecName: "config-data") pod "74512812-a6c7-40b6-bf73-155ed352ed3d" (UID: "74512812-a6c7-40b6-bf73-155ed352ed3d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.381454 4779 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74512812-a6c7-40b6-bf73-155ed352ed3d-logs\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.381510 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dcc5ad98-f8b2-4245-b08a-bb1dea245cd7-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.381523 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efac30ce-be74-429c-a7dc-25c46cfbc88e-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.381534 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlmgl\" (UniqueName: \"kubernetes.io/projected/dcc5ad98-f8b2-4245-b08a-bb1dea245cd7-kube-api-access-hlmgl\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.381544 4779 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74512812-a6c7-40b6-bf73-155ed352ed3d-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.381555 4779 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0b61924e-c7b3-4017-9e77-9a170fdcda23-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.381564 4779 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dcc5ad98-f8b2-4245-b08a-bb1dea245cd7-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.381573 4779 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-6qwtp\" (UniqueName: \"kubernetes.io/projected/0b61924e-c7b3-4017-9e77-9a170fdcda23-kube-api-access-6qwtp\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.381582 4779 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/efac30ce-be74-429c-a7dc-25c46cfbc88e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.383386 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74512812-a6c7-40b6-bf73-155ed352ed3d-kube-api-access-846nm" (OuterVolumeSpecName: "kube-api-access-846nm") pod "74512812-a6c7-40b6-bf73-155ed352ed3d" (UID: "74512812-a6c7-40b6-bf73-155ed352ed3d"). InnerVolumeSpecName "kube-api-access-846nm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.385059 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74512812-a6c7-40b6-bf73-155ed352ed3d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "74512812-a6c7-40b6-bf73-155ed352ed3d" (UID: "74512812-a6c7-40b6-bf73-155ed352ed3d"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.385410 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d35f1456-b579-484e-bbb4-75ced186fdda-kube-api-access-vv57h" (OuterVolumeSpecName: "kube-api-access-vv57h") pod "d35f1456-b579-484e-bbb4-75ced186fdda" (UID: "d35f1456-b579-484e-bbb4-75ced186fdda"). InnerVolumeSpecName "kube-api-access-vv57h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.404907 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d35f1456-b579-484e-bbb4-75ced186fdda-config" (OuterVolumeSpecName: "config") pod "d35f1456-b579-484e-bbb4-75ced186fdda" (UID: "d35f1456-b579-484e-bbb4-75ced186fdda"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.406011 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d35f1456-b579-484e-bbb4-75ced186fdda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d35f1456-b579-484e-bbb4-75ced186fdda" (UID: "d35f1456-b579-484e-bbb4-75ced186fdda"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.482827 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vv57h\" (UniqueName: \"kubernetes.io/projected/d35f1456-b579-484e-bbb4-75ced186fdda-kube-api-access-vv57h\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.482865 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-846nm\" (UniqueName: \"kubernetes.io/projected/74512812-a6c7-40b6-bf73-155ed352ed3d-kube-api-access-846nm\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.482876 4779 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/74512812-a6c7-40b6-bf73-155ed352ed3d-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.482887 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d35f1456-b579-484e-bbb4-75ced186fdda-combined-ca-bundle\") on 
node \"crc\" DevicePath \"\"" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.482898 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74512812-a6c7-40b6-bf73-155ed352ed3d-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.482908 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d35f1456-b579-484e-bbb4-75ced186fdda-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.542780 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-854bf9d4d7-f9lw8" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.542802 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-854bf9d4d7-f9lw8" event={"ID":"74512812-a6c7-40b6-bf73-155ed352ed3d","Type":"ContainerDied","Data":"a45c1d984f55dc164a5900c8a8cb34b303153b2ffb8ec4c0869cae3cae577ecc"} Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.544001 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55df45f4c7-62whh" event={"ID":"dcc5ad98-f8b2-4245-b08a-bb1dea245cd7","Type":"ContainerDied","Data":"40fa83dac3e14dbced116dd441aba2ba644976e4f13008dbbb9b0cba73e7621d"} Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.544067 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55df45f4c7-62whh" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.560492 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-689d9485bf-d7qg8" event={"ID":"0b61924e-c7b3-4017-9e77-9a170fdcda23","Type":"ContainerDied","Data":"0c1049f2b5ad642cb8de19a123af93af7614b7ddb66db680342c78ef26931fc5"} Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.560513 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-689d9485bf-d7qg8" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.567011 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-cxhjs" event={"ID":"efac30ce-be74-429c-a7dc-25c46cfbc88e","Type":"ContainerDied","Data":"111b425a733eed56c8a13c0a826fd063cf19d6731be5e6737ab3edb2de72d7ca"} Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.567086 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-cxhjs" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.574386 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-gwmgm" event={"ID":"d35f1456-b579-484e-bbb4-75ced186fdda","Type":"ContainerDied","Data":"3b211920ff0d6885e44b95c090b4422e89d3fc3050d95c133a5810dbd4fd0bd0"} Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.574416 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b211920ff0d6885e44b95c090b4422e89d3fc3050d95c133a5810dbd4fd0bd0" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.574466 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-gwmgm" Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.660965 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-55df45f4c7-62whh"] Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.692325 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-55df45f4c7-62whh"] Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.791811 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-689d9485bf-d7qg8"] Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.834172 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-689d9485bf-d7qg8"] Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.867288 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-854bf9d4d7-f9lw8"] Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.876469 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-854bf9d4d7-f9lw8"] Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.884189 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-cxhjs"] Mar 20 15:45:14 crc kubenswrapper[4779]: I0320 15:45:14.893412 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-cxhjs"] Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.454994 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-gmd9z"] Mar 20 15:45:15 crc kubenswrapper[4779]: E0320 15:45:15.455420 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efac30ce-be74-429c-a7dc-25c46cfbc88e" containerName="dnsmasq-dns" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.455435 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="efac30ce-be74-429c-a7dc-25c46cfbc88e" containerName="dnsmasq-dns" Mar 20 15:45:15 crc kubenswrapper[4779]: E0320 15:45:15.455457 4779 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="efac30ce-be74-429c-a7dc-25c46cfbc88e" containerName="init" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.455463 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="efac30ce-be74-429c-a7dc-25c46cfbc88e" containerName="init" Mar 20 15:45:15 crc kubenswrapper[4779]: E0320 15:45:15.455493 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d35f1456-b579-484e-bbb4-75ced186fdda" containerName="neutron-db-sync" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.455499 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="d35f1456-b579-484e-bbb4-75ced186fdda" containerName="neutron-db-sync" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.455708 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="d35f1456-b579-484e-bbb4-75ced186fdda" containerName="neutron-db-sync" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.455726 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="efac30ce-be74-429c-a7dc-25c46cfbc88e" containerName="dnsmasq-dns" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.456652 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-gmd9z" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.478230 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-gmd9z"] Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.604712 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/629705fe-cbdc-4366-96ef-045d9d3c44dd-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-gmd9z\" (UID: \"629705fe-cbdc-4366-96ef-045d9d3c44dd\") " pod="openstack/dnsmasq-dns-84b966f6c9-gmd9z" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.604838 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/629705fe-cbdc-4366-96ef-045d9d3c44dd-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-gmd9z\" (UID: \"629705fe-cbdc-4366-96ef-045d9d3c44dd\") " pod="openstack/dnsmasq-dns-84b966f6c9-gmd9z" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.604892 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/629705fe-cbdc-4366-96ef-045d9d3c44dd-config\") pod \"dnsmasq-dns-84b966f6c9-gmd9z\" (UID: \"629705fe-cbdc-4366-96ef-045d9d3c44dd\") " pod="openstack/dnsmasq-dns-84b966f6c9-gmd9z" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.604931 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/629705fe-cbdc-4366-96ef-045d9d3c44dd-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-gmd9z\" (UID: \"629705fe-cbdc-4366-96ef-045d9d3c44dd\") " pod="openstack/dnsmasq-dns-84b966f6c9-gmd9z" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.605008 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-7ln4r\" (UniqueName: \"kubernetes.io/projected/629705fe-cbdc-4366-96ef-045d9d3c44dd-kube-api-access-7ln4r\") pod \"dnsmasq-dns-84b966f6c9-gmd9z\" (UID: \"629705fe-cbdc-4366-96ef-045d9d3c44dd\") " pod="openstack/dnsmasq-dns-84b966f6c9-gmd9z" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.605048 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/629705fe-cbdc-4366-96ef-045d9d3c44dd-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-gmd9z\" (UID: \"629705fe-cbdc-4366-96ef-045d9d3c44dd\") " pod="openstack/dnsmasq-dns-84b966f6c9-gmd9z" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.622266 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7c5d7688b4-46tq6"] Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.623694 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7c5d7688b4-46tq6" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.626535 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-ggdr6" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.627018 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.627527 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.627819 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.636010 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7c5d7688b4-46tq6"] Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.707000 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7ln4r\" (UniqueName: \"kubernetes.io/projected/629705fe-cbdc-4366-96ef-045d9d3c44dd-kube-api-access-7ln4r\") pod \"dnsmasq-dns-84b966f6c9-gmd9z\" (UID: \"629705fe-cbdc-4366-96ef-045d9d3c44dd\") " pod="openstack/dnsmasq-dns-84b966f6c9-gmd9z" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.707073 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/629705fe-cbdc-4366-96ef-045d9d3c44dd-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-gmd9z\" (UID: \"629705fe-cbdc-4366-96ef-045d9d3c44dd\") " pod="openstack/dnsmasq-dns-84b966f6c9-gmd9z" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.707239 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/629705fe-cbdc-4366-96ef-045d9d3c44dd-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-gmd9z\" (UID: \"629705fe-cbdc-4366-96ef-045d9d3c44dd\") " pod="openstack/dnsmasq-dns-84b966f6c9-gmd9z" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.707291 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/629705fe-cbdc-4366-96ef-045d9d3c44dd-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-gmd9z\" (UID: \"629705fe-cbdc-4366-96ef-045d9d3c44dd\") " pod="openstack/dnsmasq-dns-84b966f6c9-gmd9z" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.707315 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/629705fe-cbdc-4366-96ef-045d9d3c44dd-config\") pod \"dnsmasq-dns-84b966f6c9-gmd9z\" (UID: \"629705fe-cbdc-4366-96ef-045d9d3c44dd\") " pod="openstack/dnsmasq-dns-84b966f6c9-gmd9z" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.707337 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/629705fe-cbdc-4366-96ef-045d9d3c44dd-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-gmd9z\" (UID: \"629705fe-cbdc-4366-96ef-045d9d3c44dd\") " pod="openstack/dnsmasq-dns-84b966f6c9-gmd9z" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.708348 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/629705fe-cbdc-4366-96ef-045d9d3c44dd-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-gmd9z\" (UID: \"629705fe-cbdc-4366-96ef-045d9d3c44dd\") " pod="openstack/dnsmasq-dns-84b966f6c9-gmd9z" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.708795 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/629705fe-cbdc-4366-96ef-045d9d3c44dd-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-gmd9z\" (UID: \"629705fe-cbdc-4366-96ef-045d9d3c44dd\") " pod="openstack/dnsmasq-dns-84b966f6c9-gmd9z" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.709047 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/629705fe-cbdc-4366-96ef-045d9d3c44dd-config\") pod \"dnsmasq-dns-84b966f6c9-gmd9z\" (UID: \"629705fe-cbdc-4366-96ef-045d9d3c44dd\") " pod="openstack/dnsmasq-dns-84b966f6c9-gmd9z" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.709753 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/629705fe-cbdc-4366-96ef-045d9d3c44dd-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-gmd9z\" (UID: \"629705fe-cbdc-4366-96ef-045d9d3c44dd\") " pod="openstack/dnsmasq-dns-84b966f6c9-gmd9z" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.711698 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/629705fe-cbdc-4366-96ef-045d9d3c44dd-ovsdbserver-sb\") pod 
\"dnsmasq-dns-84b966f6c9-gmd9z\" (UID: \"629705fe-cbdc-4366-96ef-045d9d3c44dd\") " pod="openstack/dnsmasq-dns-84b966f6c9-gmd9z" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.735331 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ln4r\" (UniqueName: \"kubernetes.io/projected/629705fe-cbdc-4366-96ef-045d9d3c44dd-kube-api-access-7ln4r\") pod \"dnsmasq-dns-84b966f6c9-gmd9z\" (UID: \"629705fe-cbdc-4366-96ef-045d9d3c44dd\") " pod="openstack/dnsmasq-dns-84b966f6c9-gmd9z" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.773965 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-gmd9z" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.808971 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f3be8d3-51e7-4a9b-9409-9aa999876aa3-ovndb-tls-certs\") pod \"neutron-7c5d7688b4-46tq6\" (UID: \"1f3be8d3-51e7-4a9b-9409-9aa999876aa3\") " pod="openstack/neutron-7c5d7688b4-46tq6" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.809020 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrd2q\" (UniqueName: \"kubernetes.io/projected/1f3be8d3-51e7-4a9b-9409-9aa999876aa3-kube-api-access-hrd2q\") pod \"neutron-7c5d7688b4-46tq6\" (UID: \"1f3be8d3-51e7-4a9b-9409-9aa999876aa3\") " pod="openstack/neutron-7c5d7688b4-46tq6" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.809070 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f3be8d3-51e7-4a9b-9409-9aa999876aa3-config\") pod \"neutron-7c5d7688b4-46tq6\" (UID: \"1f3be8d3-51e7-4a9b-9409-9aa999876aa3\") " pod="openstack/neutron-7c5d7688b4-46tq6" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.809134 4779 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1f3be8d3-51e7-4a9b-9409-9aa999876aa3-httpd-config\") pod \"neutron-7c5d7688b4-46tq6\" (UID: \"1f3be8d3-51e7-4a9b-9409-9aa999876aa3\") " pod="openstack/neutron-7c5d7688b4-46tq6" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.809219 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f3be8d3-51e7-4a9b-9409-9aa999876aa3-combined-ca-bundle\") pod \"neutron-7c5d7688b4-46tq6\" (UID: \"1f3be8d3-51e7-4a9b-9409-9aa999876aa3\") " pod="openstack/neutron-7c5d7688b4-46tq6" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.828727 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b61924e-c7b3-4017-9e77-9a170fdcda23" path="/var/lib/kubelet/pods/0b61924e-c7b3-4017-9e77-9a170fdcda23/volumes" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.829557 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74512812-a6c7-40b6-bf73-155ed352ed3d" path="/var/lib/kubelet/pods/74512812-a6c7-40b6-bf73-155ed352ed3d/volumes" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.830435 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcc5ad98-f8b2-4245-b08a-bb1dea245cd7" path="/var/lib/kubelet/pods/dcc5ad98-f8b2-4245-b08a-bb1dea245cd7/volumes" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.831194 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efac30ce-be74-429c-a7dc-25c46cfbc88e" path="/var/lib/kubelet/pods/efac30ce-be74-429c-a7dc-25c46cfbc88e/volumes" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.912266 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f3be8d3-51e7-4a9b-9409-9aa999876aa3-config\") pod \"neutron-7c5d7688b4-46tq6\" (UID: 
\"1f3be8d3-51e7-4a9b-9409-9aa999876aa3\") " pod="openstack/neutron-7c5d7688b4-46tq6" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.912404 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1f3be8d3-51e7-4a9b-9409-9aa999876aa3-httpd-config\") pod \"neutron-7c5d7688b4-46tq6\" (UID: \"1f3be8d3-51e7-4a9b-9409-9aa999876aa3\") " pod="openstack/neutron-7c5d7688b4-46tq6" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.912517 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f3be8d3-51e7-4a9b-9409-9aa999876aa3-combined-ca-bundle\") pod \"neutron-7c5d7688b4-46tq6\" (UID: \"1f3be8d3-51e7-4a9b-9409-9aa999876aa3\") " pod="openstack/neutron-7c5d7688b4-46tq6" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.912608 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f3be8d3-51e7-4a9b-9409-9aa999876aa3-ovndb-tls-certs\") pod \"neutron-7c5d7688b4-46tq6\" (UID: \"1f3be8d3-51e7-4a9b-9409-9aa999876aa3\") " pod="openstack/neutron-7c5d7688b4-46tq6" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.912641 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrd2q\" (UniqueName: \"kubernetes.io/projected/1f3be8d3-51e7-4a9b-9409-9aa999876aa3-kube-api-access-hrd2q\") pod \"neutron-7c5d7688b4-46tq6\" (UID: \"1f3be8d3-51e7-4a9b-9409-9aa999876aa3\") " pod="openstack/neutron-7c5d7688b4-46tq6" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.917890 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1f3be8d3-51e7-4a9b-9409-9aa999876aa3-httpd-config\") pod \"neutron-7c5d7688b4-46tq6\" (UID: \"1f3be8d3-51e7-4a9b-9409-9aa999876aa3\") " pod="openstack/neutron-7c5d7688b4-46tq6" Mar 20 
15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.920348 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f3be8d3-51e7-4a9b-9409-9aa999876aa3-config\") pod \"neutron-7c5d7688b4-46tq6\" (UID: \"1f3be8d3-51e7-4a9b-9409-9aa999876aa3\") " pod="openstack/neutron-7c5d7688b4-46tq6" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.922989 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f3be8d3-51e7-4a9b-9409-9aa999876aa3-ovndb-tls-certs\") pod \"neutron-7c5d7688b4-46tq6\" (UID: \"1f3be8d3-51e7-4a9b-9409-9aa999876aa3\") " pod="openstack/neutron-7c5d7688b4-46tq6" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.927254 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f3be8d3-51e7-4a9b-9409-9aa999876aa3-combined-ca-bundle\") pod \"neutron-7c5d7688b4-46tq6\" (UID: \"1f3be8d3-51e7-4a9b-9409-9aa999876aa3\") " pod="openstack/neutron-7c5d7688b4-46tq6" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.931030 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrd2q\" (UniqueName: \"kubernetes.io/projected/1f3be8d3-51e7-4a9b-9409-9aa999876aa3-kube-api-access-hrd2q\") pod \"neutron-7c5d7688b4-46tq6\" (UID: \"1f3be8d3-51e7-4a9b-9409-9aa999876aa3\") " pod="openstack/neutron-7c5d7688b4-46tq6" Mar 20 15:45:15 crc kubenswrapper[4779]: I0320 15:45:15.941237 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7c5d7688b4-46tq6" Mar 20 15:45:16 crc kubenswrapper[4779]: E0320 15:45:16.066694 4779 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 20 15:45:16 crc kubenswrapper[4779]: E0320 15:45:16.067186 4779 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tl
s-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p64mw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-wjcm6_openstack(130a37da-c17e-48dd-8712-f87c67f01852): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 15:45:16 crc kubenswrapper[4779]: E0320 15:45:16.068355 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-wjcm6" podUID="130a37da-c17e-48dd-8712-f87c67f01852" Mar 20 15:45:16 crc kubenswrapper[4779]: I0320 15:45:16.159512 4779 scope.go:117] "RemoveContainer" containerID="dba179e5b6e18034f2e9d233316cade5d56aab7ea692758aa82200a3f333bbec" Mar 20 15:45:16 crc kubenswrapper[4779]: I0320 15:45:16.363397 4779 scope.go:117] "RemoveContainer" containerID="f046358f56f5ac4dca48bac3b2b26d48340dd27051acf36351a325d809924079" Mar 20 15:45:16 crc kubenswrapper[4779]: I0320 15:45:16.431536 4779 scope.go:117] "RemoveContainer" containerID="a86073d489d2cc92b4899bdc62d5e012f582ac9a3c5bd9bec6e9f396d3100381" Mar 20 15:45:16 crc kubenswrapper[4779]: I0320 15:45:16.473849 4779 
scope.go:117] "RemoveContainer" containerID="66c16e5a900e30619b326ef4614898c8d0611c6e4424f793d8f2a7b45617b533" Mar 20 15:45:16 crc kubenswrapper[4779]: I0320 15:45:16.609490 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2caab026-6b85-4015-9190-29d8ef94a58f","Type":"ContainerStarted","Data":"b9138cce36f647241a32181c1827909361881d304cb66a73b0568ffc81e78207"} Mar 20 15:45:16 crc kubenswrapper[4779]: I0320 15:45:16.621707 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mx4bn" event={"ID":"85c5c01b-85a0-4714-81fe-31192c87b2fa","Type":"ContainerStarted","Data":"423c6d89080e00ce6b78d4e124a45d4b8c633d650eaea6de28b851a550f00412"} Mar 20 15:45:16 crc kubenswrapper[4779]: I0320 15:45:16.640454 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-4qv4b" event={"ID":"8d89bfb9-3cdc-4e93-899a-361f5d5cf408","Type":"ContainerStarted","Data":"7a94844cceb07beea29a43faeb9b773344756eeb64c078c6cc8b1e72db3107e8"} Mar 20 15:45:16 crc kubenswrapper[4779]: I0320 15:45:16.663438 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-mx4bn" podStartSLOduration=5.219539419 podStartE2EDuration="31.663404434s" podCreationTimestamp="2026-03-20 15:44:45 +0000 UTC" firstStartedPulling="2026-03-20 15:44:47.564671804 +0000 UTC m=+1304.527187604" lastFinishedPulling="2026-03-20 15:45:14.008536819 +0000 UTC m=+1330.971052619" observedRunningTime="2026-03-20 15:45:16.648793833 +0000 UTC m=+1333.611309633" watchObservedRunningTime="2026-03-20 15:45:16.663404434 +0000 UTC m=+1333.625920234" Mar 20 15:45:16 crc kubenswrapper[4779]: E0320 15:45:16.667811 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-wjcm6" 
podUID="130a37da-c17e-48dd-8712-f87c67f01852" Mar 20 15:45:16 crc kubenswrapper[4779]: I0320 15:45:16.680531 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-sync-4qv4b" podStartSLOduration=6.483322963 podStartE2EDuration="58.680509054s" podCreationTimestamp="2026-03-20 15:44:18 +0000 UTC" firstStartedPulling="2026-03-20 15:44:23.816638673 +0000 UTC m=+1280.779154473" lastFinishedPulling="2026-03-20 15:45:16.013824764 +0000 UTC m=+1332.976340564" observedRunningTime="2026-03-20 15:45:16.665617065 +0000 UTC m=+1333.628132865" watchObservedRunningTime="2026-03-20 15:45:16.680509054 +0000 UTC m=+1333.643024854" Mar 20 15:45:16 crc kubenswrapper[4779]: W0320 15:45:16.704003 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddddea4aa_aba6_49f1_a8bc_6ce9850da26d.slice/crio-9b69c8ee7fafea50d9c4d0408c734a17387d5bfe543e404db40b301792ff942d WatchSource:0}: Error finding container 9b69c8ee7fafea50d9c4d0408c734a17387d5bfe543e404db40b301792ff942d: Status 404 returned error can't find the container with id 9b69c8ee7fafea50d9c4d0408c734a17387d5bfe543e404db40b301792ff942d Mar 20 15:45:16 crc kubenswrapper[4779]: I0320 15:45:16.707203 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-dc78779bd-shkrw"] Mar 20 15:45:16 crc kubenswrapper[4779]: I0320 15:45:16.872879 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7f86b78896-7tfsm"] Mar 20 15:45:16 crc kubenswrapper[4779]: I0320 15:45:16.891549 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rh9gx"] Mar 20 15:45:16 crc kubenswrapper[4779]: I0320 15:45:16.901062 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 15:45:17 crc kubenswrapper[4779]: I0320 15:45:17.035785 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-default-external-api-0"] Mar 20 15:45:17 crc kubenswrapper[4779]: I0320 15:45:17.082708 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567025-g6d8s"] Mar 20 15:45:17 crc kubenswrapper[4779]: I0320 15:45:17.248215 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7c5d7688b4-46tq6"] Mar 20 15:45:17 crc kubenswrapper[4779]: I0320 15:45:17.266969 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-gmd9z"] Mar 20 15:45:17 crc kubenswrapper[4779]: I0320 15:45:17.717548 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c5d7688b4-46tq6" event={"ID":"1f3be8d3-51e7-4a9b-9409-9aa999876aa3","Type":"ContainerStarted","Data":"70c09986161a5c03082e9fbc1128733b54c8ac13ec5de237bf517a1b5110f3a4"} Mar 20 15:45:17 crc kubenswrapper[4779]: I0320 15:45:17.717909 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c5d7688b4-46tq6" event={"ID":"1f3be8d3-51e7-4a9b-9409-9aa999876aa3","Type":"ContainerStarted","Data":"18dbd068df7524fb7305ca8210419fa7572d23eff517a83d98fb524d99a692e2"} Mar 20 15:45:17 crc kubenswrapper[4779]: I0320 15:45:17.749325 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"148e12da-09c4-454f-ac34-76059964f559","Type":"ContainerStarted","Data":"d207f62e11fbc0829c0495eb9879b3f555145ffd6da2c77252099e1548a362ec"} Mar 20 15:45:17 crc kubenswrapper[4779]: I0320 15:45:17.831823 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rh9gx" event={"ID":"e815c919-fbc1-4307-912e-13c74f996abb","Type":"ContainerStarted","Data":"4f7bd0a4b4334182e687bc8ec8bf581d58b9f2a63b8cd261ed653ee89e3f6ba3"} Mar 20 15:45:17 crc kubenswrapper[4779]: I0320 15:45:17.831864 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rh9gx" 
event={"ID":"e815c919-fbc1-4307-912e-13c74f996abb","Type":"ContainerStarted","Data":"f83bd0651c19197fca12c4741ab388f4724d6e71d4d12f17267820301ac8f1a9"} Mar 20 15:45:17 crc kubenswrapper[4779]: I0320 15:45:17.836745 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dc78779bd-shkrw" event={"ID":"dddea4aa-aba6-49f1-a8bc-6ce9850da26d","Type":"ContainerStarted","Data":"6de0caecf2efb20ed469e804bf6dfac0573da44a599e9a92afbfc7b8dcb620ee"} Mar 20 15:45:17 crc kubenswrapper[4779]: I0320 15:45:17.836788 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dc78779bd-shkrw" event={"ID":"dddea4aa-aba6-49f1-a8bc-6ce9850da26d","Type":"ContainerStarted","Data":"9b69c8ee7fafea50d9c4d0408c734a17387d5bfe543e404db40b301792ff942d"} Mar 20 15:45:17 crc kubenswrapper[4779]: I0320 15:45:17.841693 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-gmd9z" event={"ID":"629705fe-cbdc-4366-96ef-045d9d3c44dd","Type":"ContainerStarted","Data":"3460d7a9322b651e4389627e8272335d39b481e9c2c5b03e0c7f999f7df2333d"} Mar 20 15:45:17 crc kubenswrapper[4779]: I0320 15:45:17.868429 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-g6d8s" event={"ID":"1a850ec0-7410-472b-b4da-19f990f2187f","Type":"ContainerStarted","Data":"7b2c3b08711e4854503ae839e1d3b2747112170ced23fc5279870d5de81a3953"} Mar 20 15:45:17 crc kubenswrapper[4779]: I0320 15:45:17.868471 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-g6d8s" event={"ID":"1a850ec0-7410-472b-b4da-19f990f2187f","Type":"ContainerStarted","Data":"b03b90c733734e3133dc32a25b8922079132acfc620f4ccf3f4494fc1f562936"} Mar 20 15:45:17 crc kubenswrapper[4779]: I0320 15:45:17.870744 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-rh9gx" podStartSLOduration=21.870725909 
podStartE2EDuration="21.870725909s" podCreationTimestamp="2026-03-20 15:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:45:17.858982961 +0000 UTC m=+1334.821498761" watchObservedRunningTime="2026-03-20 15:45:17.870725909 +0000 UTC m=+1334.833241709" Mar 20 15:45:17 crc kubenswrapper[4779]: I0320 15:45:17.884193 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f86b78896-7tfsm" event={"ID":"f6de60e1-9be1-4bb6-9d26-0b496117ed20","Type":"ContainerStarted","Data":"8fc84f413e5c90fa5207c13168c14ef390b461f2680891e04427e0d0de9c6b04"} Mar 20 15:45:17 crc kubenswrapper[4779]: I0320 15:45:17.946161 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.201997 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-cxhjs" podUID="efac30ce-be74-429c-a7dc-25c46cfbc88e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.137:5353: i/o timeout" Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.283194 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-86f74cdcdf-tmrx6"] Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.284933 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-86f74cdcdf-tmrx6" Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.290571 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.290822 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.335799 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-86f74cdcdf-tmrx6"] Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.397015 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggwcl\" (UniqueName: \"kubernetes.io/projected/9752eda4-2b46-4251-bec7-5cbca2f6fd57-kube-api-access-ggwcl\") pod \"neutron-86f74cdcdf-tmrx6\" (UID: \"9752eda4-2b46-4251-bec7-5cbca2f6fd57\") " pod="openstack/neutron-86f74cdcdf-tmrx6" Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.397080 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9752eda4-2b46-4251-bec7-5cbca2f6fd57-httpd-config\") pod \"neutron-86f74cdcdf-tmrx6\" (UID: \"9752eda4-2b46-4251-bec7-5cbca2f6fd57\") " pod="openstack/neutron-86f74cdcdf-tmrx6" Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.400471 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9752eda4-2b46-4251-bec7-5cbca2f6fd57-ovndb-tls-certs\") pod \"neutron-86f74cdcdf-tmrx6\" (UID: \"9752eda4-2b46-4251-bec7-5cbca2f6fd57\") " pod="openstack/neutron-86f74cdcdf-tmrx6" Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.400597 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9752eda4-2b46-4251-bec7-5cbca2f6fd57-internal-tls-certs\") pod \"neutron-86f74cdcdf-tmrx6\" (UID: \"9752eda4-2b46-4251-bec7-5cbca2f6fd57\") " pod="openstack/neutron-86f74cdcdf-tmrx6" Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.400690 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9752eda4-2b46-4251-bec7-5cbca2f6fd57-config\") pod \"neutron-86f74cdcdf-tmrx6\" (UID: \"9752eda4-2b46-4251-bec7-5cbca2f6fd57\") " pod="openstack/neutron-86f74cdcdf-tmrx6" Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.400720 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9752eda4-2b46-4251-bec7-5cbca2f6fd57-public-tls-certs\") pod \"neutron-86f74cdcdf-tmrx6\" (UID: \"9752eda4-2b46-4251-bec7-5cbca2f6fd57\") " pod="openstack/neutron-86f74cdcdf-tmrx6" Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.400752 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9752eda4-2b46-4251-bec7-5cbca2f6fd57-combined-ca-bundle\") pod \"neutron-86f74cdcdf-tmrx6\" (UID: \"9752eda4-2b46-4251-bec7-5cbca2f6fd57\") " pod="openstack/neutron-86f74cdcdf-tmrx6" Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.502923 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9752eda4-2b46-4251-bec7-5cbca2f6fd57-ovndb-tls-certs\") pod \"neutron-86f74cdcdf-tmrx6\" (UID: \"9752eda4-2b46-4251-bec7-5cbca2f6fd57\") " pod="openstack/neutron-86f74cdcdf-tmrx6" Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.502999 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9752eda4-2b46-4251-bec7-5cbca2f6fd57-internal-tls-certs\") pod \"neutron-86f74cdcdf-tmrx6\" (UID: \"9752eda4-2b46-4251-bec7-5cbca2f6fd57\") " pod="openstack/neutron-86f74cdcdf-tmrx6" Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.503042 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9752eda4-2b46-4251-bec7-5cbca2f6fd57-config\") pod \"neutron-86f74cdcdf-tmrx6\" (UID: \"9752eda4-2b46-4251-bec7-5cbca2f6fd57\") " pod="openstack/neutron-86f74cdcdf-tmrx6" Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.503060 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9752eda4-2b46-4251-bec7-5cbca2f6fd57-public-tls-certs\") pod \"neutron-86f74cdcdf-tmrx6\" (UID: \"9752eda4-2b46-4251-bec7-5cbca2f6fd57\") " pod="openstack/neutron-86f74cdcdf-tmrx6" Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.503081 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9752eda4-2b46-4251-bec7-5cbca2f6fd57-combined-ca-bundle\") pod \"neutron-86f74cdcdf-tmrx6\" (UID: \"9752eda4-2b46-4251-bec7-5cbca2f6fd57\") " pod="openstack/neutron-86f74cdcdf-tmrx6" Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.503455 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggwcl\" (UniqueName: \"kubernetes.io/projected/9752eda4-2b46-4251-bec7-5cbca2f6fd57-kube-api-access-ggwcl\") pod \"neutron-86f74cdcdf-tmrx6\" (UID: \"9752eda4-2b46-4251-bec7-5cbca2f6fd57\") " pod="openstack/neutron-86f74cdcdf-tmrx6" Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.503495 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9752eda4-2b46-4251-bec7-5cbca2f6fd57-httpd-config\") pod 
\"neutron-86f74cdcdf-tmrx6\" (UID: \"9752eda4-2b46-4251-bec7-5cbca2f6fd57\") " pod="openstack/neutron-86f74cdcdf-tmrx6" Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.510229 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9752eda4-2b46-4251-bec7-5cbca2f6fd57-ovndb-tls-certs\") pod \"neutron-86f74cdcdf-tmrx6\" (UID: \"9752eda4-2b46-4251-bec7-5cbca2f6fd57\") " pod="openstack/neutron-86f74cdcdf-tmrx6" Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.523895 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9752eda4-2b46-4251-bec7-5cbca2f6fd57-public-tls-certs\") pod \"neutron-86f74cdcdf-tmrx6\" (UID: \"9752eda4-2b46-4251-bec7-5cbca2f6fd57\") " pod="openstack/neutron-86f74cdcdf-tmrx6" Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.524632 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9752eda4-2b46-4251-bec7-5cbca2f6fd57-httpd-config\") pod \"neutron-86f74cdcdf-tmrx6\" (UID: \"9752eda4-2b46-4251-bec7-5cbca2f6fd57\") " pod="openstack/neutron-86f74cdcdf-tmrx6" Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.525496 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9752eda4-2b46-4251-bec7-5cbca2f6fd57-config\") pod \"neutron-86f74cdcdf-tmrx6\" (UID: \"9752eda4-2b46-4251-bec7-5cbca2f6fd57\") " pod="openstack/neutron-86f74cdcdf-tmrx6" Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.525937 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggwcl\" (UniqueName: \"kubernetes.io/projected/9752eda4-2b46-4251-bec7-5cbca2f6fd57-kube-api-access-ggwcl\") pod \"neutron-86f74cdcdf-tmrx6\" (UID: \"9752eda4-2b46-4251-bec7-5cbca2f6fd57\") " pod="openstack/neutron-86f74cdcdf-tmrx6" Mar 20 15:45:18 crc 
kubenswrapper[4779]: I0320 15:45:18.528988 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9752eda4-2b46-4251-bec7-5cbca2f6fd57-combined-ca-bundle\") pod \"neutron-86f74cdcdf-tmrx6\" (UID: \"9752eda4-2b46-4251-bec7-5cbca2f6fd57\") " pod="openstack/neutron-86f74cdcdf-tmrx6" Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.529825 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9752eda4-2b46-4251-bec7-5cbca2f6fd57-internal-tls-certs\") pod \"neutron-86f74cdcdf-tmrx6\" (UID: \"9752eda4-2b46-4251-bec7-5cbca2f6fd57\") " pod="openstack/neutron-86f74cdcdf-tmrx6" Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.552300 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-86f74cdcdf-tmrx6" Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.924582 4779 generic.go:334] "Generic (PLEG): container finished" podID="1a850ec0-7410-472b-b4da-19f990f2187f" containerID="7b2c3b08711e4854503ae839e1d3b2747112170ced23fc5279870d5de81a3953" exitCode=0 Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.925076 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-g6d8s" event={"ID":"1a850ec0-7410-472b-b4da-19f990f2187f","Type":"ContainerDied","Data":"7b2c3b08711e4854503ae839e1d3b2747112170ced23fc5279870d5de81a3953"} Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.935708 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f86b78896-7tfsm" event={"ID":"f6de60e1-9be1-4bb6-9d26-0b496117ed20","Type":"ContainerStarted","Data":"7ff747c74330698ccefe1a731244c171b039473c2090a2067e21822cd9b07277"} Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.935752 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f86b78896-7tfsm" 
event={"ID":"f6de60e1-9be1-4bb6-9d26-0b496117ed20","Type":"ContainerStarted","Data":"d7fe244e02765ceb3be959118ca1a70a4ac93f217d62f041833169123707116a"} Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.947566 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c5d7688b4-46tq6" event={"ID":"1f3be8d3-51e7-4a9b-9409-9aa999876aa3","Type":"ContainerStarted","Data":"42c82203c403e6671f0875f5254ea8223ba8f882f95bbb3e6395e2190134b8dc"} Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.948292 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7c5d7688b4-46tq6" Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.962764 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"148e12da-09c4-454f-ac34-76059964f559","Type":"ContainerStarted","Data":"5276609715e8f5a0ee9a9cc97ad2905cfbe1495a360b9d4275b70f13104e41ae"} Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.974559 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7f86b78896-7tfsm" podStartSLOduration=24.162111259 podStartE2EDuration="24.97453458s" podCreationTimestamp="2026-03-20 15:44:54 +0000 UTC" firstStartedPulling="2026-03-20 15:45:16.881180513 +0000 UTC m=+1333.843696313" lastFinishedPulling="2026-03-20 15:45:17.693603824 +0000 UTC m=+1334.656119634" observedRunningTime="2026-03-20 15:45:18.964287707 +0000 UTC m=+1335.926803507" watchObservedRunningTime="2026-03-20 15:45:18.97453458 +0000 UTC m=+1335.937050380" Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.978558 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"205b1474-e402-4bd4-8ef7-4907c61f11bb","Type":"ContainerStarted","Data":"466971ea2fbd03d11f71d5cf66a01fec57a922cb325194c3f2f0cef1307a690f"} Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.987857 4779 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/horizon-dc78779bd-shkrw" event={"ID":"dddea4aa-aba6-49f1-a8bc-6ce9850da26d","Type":"ContainerStarted","Data":"39a9fad48d0e4c0dd04bf2d355eab966c31621277f6d4350396c7996d505307e"} Mar 20 15:45:18 crc kubenswrapper[4779]: I0320 15:45:18.991704 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7c5d7688b4-46tq6" podStartSLOduration=3.991691089 podStartE2EDuration="3.991691089s" podCreationTimestamp="2026-03-20 15:45:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:45:18.990317208 +0000 UTC m=+1335.952833028" watchObservedRunningTime="2026-03-20 15:45:18.991691089 +0000 UTC m=+1335.954206889" Mar 20 15:45:19 crc kubenswrapper[4779]: I0320 15:45:19.005467 4779 generic.go:334] "Generic (PLEG): container finished" podID="629705fe-cbdc-4366-96ef-045d9d3c44dd" containerID="8e76cf7536100978085cfe580511576f462163726e8b322aafc23fc27b43cbb5" exitCode=0 Mar 20 15:45:19 crc kubenswrapper[4779]: I0320 15:45:19.005555 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-gmd9z" event={"ID":"629705fe-cbdc-4366-96ef-045d9d3c44dd","Type":"ContainerDied","Data":"8e76cf7536100978085cfe580511576f462163726e8b322aafc23fc27b43cbb5"} Mar 20 15:45:19 crc kubenswrapper[4779]: I0320 15:45:19.040612 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-dc78779bd-shkrw" podStartSLOduration=24.534089023 podStartE2EDuration="25.040593051s" podCreationTimestamp="2026-03-20 15:44:54 +0000 UTC" firstStartedPulling="2026-03-20 15:45:16.705831489 +0000 UTC m=+1333.668347279" lastFinishedPulling="2026-03-20 15:45:17.212335507 +0000 UTC m=+1334.174851307" observedRunningTime="2026-03-20 15:45:19.029465028 +0000 UTC m=+1335.991980828" watchObservedRunningTime="2026-03-20 15:45:19.040593051 +0000 UTC m=+1336.003108851" Mar 20 15:45:19 crc kubenswrapper[4779]: 
I0320 15:45:19.291215 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-86f74cdcdf-tmrx6"] Mar 20 15:45:20 crc kubenswrapper[4779]: I0320 15:45:20.032433 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-g6d8s" event={"ID":"1a850ec0-7410-472b-b4da-19f990f2187f","Type":"ContainerDied","Data":"b03b90c733734e3133dc32a25b8922079132acfc620f4ccf3f4494fc1f562936"} Mar 20 15:45:20 crc kubenswrapper[4779]: I0320 15:45:20.032682 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b03b90c733734e3133dc32a25b8922079132acfc620f4ccf3f4494fc1f562936" Mar 20 15:45:20 crc kubenswrapper[4779]: I0320 15:45:20.040364 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"205b1474-e402-4bd4-8ef7-4907c61f11bb","Type":"ContainerStarted","Data":"bb13bf5662540c3961b2fa319989459c2db8658e2ce6842ec39a55c18acc62e7"} Mar 20 15:45:20 crc kubenswrapper[4779]: I0320 15:45:20.053135 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86f74cdcdf-tmrx6" event={"ID":"9752eda4-2b46-4251-bec7-5cbca2f6fd57","Type":"ContainerStarted","Data":"69c87555657377b80c29de56354670bb12217637843c097d42fa2fda2851052e"} Mar 20 15:45:20 crc kubenswrapper[4779]: I0320 15:45:20.173287 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-g6d8s" Mar 20 15:45:20 crc kubenswrapper[4779]: I0320 15:45:20.263682 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a850ec0-7410-472b-b4da-19f990f2187f-config-volume\") pod \"1a850ec0-7410-472b-b4da-19f990f2187f\" (UID: \"1a850ec0-7410-472b-b4da-19f990f2187f\") " Mar 20 15:45:20 crc kubenswrapper[4779]: I0320 15:45:20.263749 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jp866\" (UniqueName: \"kubernetes.io/projected/1a850ec0-7410-472b-b4da-19f990f2187f-kube-api-access-jp866\") pod \"1a850ec0-7410-472b-b4da-19f990f2187f\" (UID: \"1a850ec0-7410-472b-b4da-19f990f2187f\") " Mar 20 15:45:20 crc kubenswrapper[4779]: I0320 15:45:20.263798 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a850ec0-7410-472b-b4da-19f990f2187f-secret-volume\") pod \"1a850ec0-7410-472b-b4da-19f990f2187f\" (UID: \"1a850ec0-7410-472b-b4da-19f990f2187f\") " Mar 20 15:45:20 crc kubenswrapper[4779]: I0320 15:45:20.265323 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a850ec0-7410-472b-b4da-19f990f2187f-config-volume" (OuterVolumeSpecName: "config-volume") pod "1a850ec0-7410-472b-b4da-19f990f2187f" (UID: "1a850ec0-7410-472b-b4da-19f990f2187f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:45:20 crc kubenswrapper[4779]: I0320 15:45:20.274434 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a850ec0-7410-472b-b4da-19f990f2187f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1a850ec0-7410-472b-b4da-19f990f2187f" (UID: "1a850ec0-7410-472b-b4da-19f990f2187f"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:45:20 crc kubenswrapper[4779]: I0320 15:45:20.278681 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a850ec0-7410-472b-b4da-19f990f2187f-kube-api-access-jp866" (OuterVolumeSpecName: "kube-api-access-jp866") pod "1a850ec0-7410-472b-b4da-19f990f2187f" (UID: "1a850ec0-7410-472b-b4da-19f990f2187f"). InnerVolumeSpecName "kube-api-access-jp866". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:45:20 crc kubenswrapper[4779]: I0320 15:45:20.365991 4779 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a850ec0-7410-472b-b4da-19f990f2187f-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:20 crc kubenswrapper[4779]: I0320 15:45:20.366020 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jp866\" (UniqueName: \"kubernetes.io/projected/1a850ec0-7410-472b-b4da-19f990f2187f-kube-api-access-jp866\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:20 crc kubenswrapper[4779]: I0320 15:45:20.366033 4779 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a850ec0-7410-472b-b4da-19f990f2187f-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:21 crc kubenswrapper[4779]: I0320 15:45:21.062098 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"148e12da-09c4-454f-ac34-76059964f559","Type":"ContainerStarted","Data":"0e53b2635bb3a8b9a6f6494d47b73c178cc104db6eaabdaab0ebb6fec6fe8b7d"} Mar 20 15:45:21 crc kubenswrapper[4779]: I0320 15:45:21.063684 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"205b1474-e402-4bd4-8ef7-4907c61f11bb","Type":"ContainerStarted","Data":"4763085ccae1e2c32b3675d86358bd241fdc6c967c646f3f98f455a3e2baf089"} Mar 20 15:45:21 crc kubenswrapper[4779]: 
I0320 15:45:21.064986 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86f74cdcdf-tmrx6" event={"ID":"9752eda4-2b46-4251-bec7-5cbca2f6fd57","Type":"ContainerStarted","Data":"8e8c1bfb539dfe38c7d7a0504be76c3ab9f7a02428095eac04289b172bc61d31"} Mar 20 15:45:21 crc kubenswrapper[4779]: I0320 15:45:21.066424 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-g6d8s" Mar 20 15:45:21 crc kubenswrapper[4779]: I0320 15:45:21.066486 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-gmd9z" event={"ID":"629705fe-cbdc-4366-96ef-045d9d3c44dd","Type":"ContainerStarted","Data":"4edde08223c92751762c930b9ef9f300f906c3d84cb049370eaa0b6231e0bbe6"} Mar 20 15:45:21 crc kubenswrapper[4779]: I0320 15:45:21.220015 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84b966f6c9-gmd9z" podStartSLOduration=6.219993053 podStartE2EDuration="6.219993053s" podCreationTimestamp="2026-03-20 15:45:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:45:21.094478011 +0000 UTC m=+1338.056993811" watchObservedRunningTime="2026-03-20 15:45:21.219993053 +0000 UTC m=+1338.182508853" Mar 20 15:45:22 crc kubenswrapper[4779]: I0320 15:45:22.082609 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2caab026-6b85-4015-9190-29d8ef94a58f","Type":"ContainerStarted","Data":"b7438c8703eed8876715f4b09a1a91fc80747e4a1d1da11173a1de34cd6894e0"} Mar 20 15:45:22 crc kubenswrapper[4779]: I0320 15:45:22.086829 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86f74cdcdf-tmrx6" event={"ID":"9752eda4-2b46-4251-bec7-5cbca2f6fd57","Type":"ContainerStarted","Data":"e1bd8b8b7b0f4aa151ce9453673b2139df9175d3b89e268569546efe778b1ca3"} Mar 20 
15:45:22 crc kubenswrapper[4779]: I0320 15:45:22.086869 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84b966f6c9-gmd9z" Mar 20 15:45:22 crc kubenswrapper[4779]: I0320 15:45:22.090426 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-86f74cdcdf-tmrx6" Mar 20 15:45:22 crc kubenswrapper[4779]: I0320 15:45:22.117868 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-86f74cdcdf-tmrx6" podStartSLOduration=4.117835414 podStartE2EDuration="4.117835414s" podCreationTimestamp="2026-03-20 15:45:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:45:22.108292177 +0000 UTC m=+1339.070807977" watchObservedRunningTime="2026-03-20 15:45:22.117835414 +0000 UTC m=+1339.080351204" Mar 20 15:45:22 crc kubenswrapper[4779]: I0320 15:45:22.136972 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=26.136953769 podStartE2EDuration="26.136953769s" podCreationTimestamp="2026-03-20 15:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:45:22.132914597 +0000 UTC m=+1339.095430417" watchObservedRunningTime="2026-03-20 15:45:22.136953769 +0000 UTC m=+1339.099469569" Mar 20 15:45:22 crc kubenswrapper[4779]: I0320 15:45:22.161921 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=16.161902296 podStartE2EDuration="16.161902296s" podCreationTimestamp="2026-03-20 15:45:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:45:22.152814109 +0000 UTC m=+1339.115329909" 
watchObservedRunningTime="2026-03-20 15:45:22.161902296 +0000 UTC m=+1339.124418096" Mar 20 15:45:24 crc kubenswrapper[4779]: I0320 15:45:24.111208 4779 generic.go:334] "Generic (PLEG): container finished" podID="85c5c01b-85a0-4714-81fe-31192c87b2fa" containerID="423c6d89080e00ce6b78d4e124a45d4b8c633d650eaea6de28b851a550f00412" exitCode=0 Mar 20 15:45:24 crc kubenswrapper[4779]: I0320 15:45:24.111299 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mx4bn" event={"ID":"85c5c01b-85a0-4714-81fe-31192c87b2fa","Type":"ContainerDied","Data":"423c6d89080e00ce6b78d4e124a45d4b8c633d650eaea6de28b851a550f00412"} Mar 20 15:45:24 crc kubenswrapper[4779]: I0320 15:45:24.113101 4779 generic.go:334] "Generic (PLEG): container finished" podID="e815c919-fbc1-4307-912e-13c74f996abb" containerID="4f7bd0a4b4334182e687bc8ec8bf581d58b9f2a63b8cd261ed653ee89e3f6ba3" exitCode=0 Mar 20 15:45:24 crc kubenswrapper[4779]: I0320 15:45:24.113159 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rh9gx" event={"ID":"e815c919-fbc1-4307-912e-13c74f996abb","Type":"ContainerDied","Data":"4f7bd0a4b4334182e687bc8ec8bf581d58b9f2a63b8cd261ed653ee89e3f6ba3"} Mar 20 15:45:24 crc kubenswrapper[4779]: I0320 15:45:24.116802 4779 generic.go:334] "Generic (PLEG): container finished" podID="8d89bfb9-3cdc-4e93-899a-361f5d5cf408" containerID="7a94844cceb07beea29a43faeb9b773344756eeb64c078c6cc8b1e72db3107e8" exitCode=0 Mar 20 15:45:24 crc kubenswrapper[4779]: I0320 15:45:24.116877 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-4qv4b" event={"ID":"8d89bfb9-3cdc-4e93-899a-361f5d5cf408","Type":"ContainerDied","Data":"7a94844cceb07beea29a43faeb9b773344756eeb64c078c6cc8b1e72db3107e8"} Mar 20 15:45:24 crc kubenswrapper[4779]: I0320 15:45:24.118686 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xjstn" 
event={"ID":"b67fdca5-13d1-4e83-8834-03856ca956b5","Type":"ContainerStarted","Data":"3816792d7f4a520eb194df5b57f19724f9da971e22f64b3a69c80ac94940dc7d"} Mar 20 15:45:24 crc kubenswrapper[4779]: I0320 15:45:24.171525 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-xjstn" podStartSLOduration=3.743142912 podStartE2EDuration="39.17150502s" podCreationTimestamp="2026-03-20 15:44:45 +0000 UTC" firstStartedPulling="2026-03-20 15:44:47.480955591 +0000 UTC m=+1304.443471391" lastFinishedPulling="2026-03-20 15:45:22.909317699 +0000 UTC m=+1339.871833499" observedRunningTime="2026-03-20 15:45:24.165992144 +0000 UTC m=+1341.128507944" watchObservedRunningTime="2026-03-20 15:45:24.17150502 +0000 UTC m=+1341.134020810" Mar 20 15:45:24 crc kubenswrapper[4779]: I0320 15:45:24.990703 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7f86b78896-7tfsm" Mar 20 15:45:24 crc kubenswrapper[4779]: I0320 15:45:24.990757 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7f86b78896-7tfsm" Mar 20 15:45:25 crc kubenswrapper[4779]: I0320 15:45:25.088126 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-dc78779bd-shkrw" Mar 20 15:45:25 crc kubenswrapper[4779]: I0320 15:45:25.089049 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-dc78779bd-shkrw" Mar 20 15:45:25 crc kubenswrapper[4779]: I0320 15:45:25.775735 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84b966f6c9-gmd9z" Mar 20 15:45:25 crc kubenswrapper[4779]: I0320 15:45:25.846573 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-k84c5"] Mar 20 15:45:25 crc kubenswrapper[4779]: I0320 15:45:25.846797 4779 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-8b5c85b87-k84c5" podUID="d73c6bfb-5dbf-4f62-8214-0b39d3558cbc" containerName="dnsmasq-dns" containerID="cri-o://f3c0c28a55ee01257be1b799d96d4911fcf3ff1e695f3804bfbaa4980b47cb2d" gracePeriod=10 Mar 20 15:45:26 crc kubenswrapper[4779]: I0320 15:45:26.147926 4779 generic.go:334] "Generic (PLEG): container finished" podID="d73c6bfb-5dbf-4f62-8214-0b39d3558cbc" containerID="f3c0c28a55ee01257be1b799d96d4911fcf3ff1e695f3804bfbaa4980b47cb2d" exitCode=0 Mar 20 15:45:26 crc kubenswrapper[4779]: I0320 15:45:26.147965 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-k84c5" event={"ID":"d73c6bfb-5dbf-4f62-8214-0b39d3558cbc","Type":"ContainerDied","Data":"f3c0c28a55ee01257be1b799d96d4911fcf3ff1e695f3804bfbaa4980b47cb2d"} Mar 20 15:45:26 crc kubenswrapper[4779]: I0320 15:45:26.267087 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8b5c85b87-k84c5" podUID="d73c6bfb-5dbf-4f62-8214-0b39d3558cbc" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.158:5353: connect: connection refused" Mar 20 15:45:26 crc kubenswrapper[4779]: I0320 15:45:26.842312 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 15:45:26 crc kubenswrapper[4779]: I0320 15:45:26.842681 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 15:45:26 crc kubenswrapper[4779]: I0320 15:45:26.842696 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 15:45:26 crc kubenswrapper[4779]: I0320 15:45:26.842707 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 15:45:26 crc kubenswrapper[4779]: I0320 15:45:26.884376 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-internal-api-0" Mar 20 15:45:26 crc kubenswrapper[4779]: I0320 15:45:26.884424 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 15:45:26 crc kubenswrapper[4779]: I0320 15:45:26.922366 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 15:45:26 crc kubenswrapper[4779]: I0320 15:45:26.926878 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 15:45:26 crc kubenswrapper[4779]: I0320 15:45:26.936729 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 15:45:26 crc kubenswrapper[4779]: I0320 15:45:26.942734 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.167388 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.167435 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.601993 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-mx4bn" Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.610569 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rh9gx" Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.616288 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-4qv4b" Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.649818 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85c5c01b-85a0-4714-81fe-31192c87b2fa-combined-ca-bundle\") pod \"85c5c01b-85a0-4714-81fe-31192c87b2fa\" (UID: \"85c5c01b-85a0-4714-81fe-31192c87b2fa\") " Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.649890 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85c5c01b-85a0-4714-81fe-31192c87b2fa-scripts\") pod \"85c5c01b-85a0-4714-81fe-31192c87b2fa\" (UID: \"85c5c01b-85a0-4714-81fe-31192c87b2fa\") " Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.649924 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85c5c01b-85a0-4714-81fe-31192c87b2fa-config-data\") pod \"85c5c01b-85a0-4714-81fe-31192c87b2fa\" (UID: \"85c5c01b-85a0-4714-81fe-31192c87b2fa\") " Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.650017 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bkh8\" (UniqueName: \"kubernetes.io/projected/85c5c01b-85a0-4714-81fe-31192c87b2fa-kube-api-access-5bkh8\") pod \"85c5c01b-85a0-4714-81fe-31192c87b2fa\" (UID: \"85c5c01b-85a0-4714-81fe-31192c87b2fa\") " Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.650145 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85c5c01b-85a0-4714-81fe-31192c87b2fa-logs\") pod \"85c5c01b-85a0-4714-81fe-31192c87b2fa\" (UID: \"85c5c01b-85a0-4714-81fe-31192c87b2fa\") " Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.650913 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/85c5c01b-85a0-4714-81fe-31192c87b2fa-logs" (OuterVolumeSpecName: "logs") pod "85c5c01b-85a0-4714-81fe-31192c87b2fa" (UID: "85c5c01b-85a0-4714-81fe-31192c87b2fa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.676271 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85c5c01b-85a0-4714-81fe-31192c87b2fa-kube-api-access-5bkh8" (OuterVolumeSpecName: "kube-api-access-5bkh8") pod "85c5c01b-85a0-4714-81fe-31192c87b2fa" (UID: "85c5c01b-85a0-4714-81fe-31192c87b2fa"). InnerVolumeSpecName "kube-api-access-5bkh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.678988 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85c5c01b-85a0-4714-81fe-31192c87b2fa-scripts" (OuterVolumeSpecName: "scripts") pod "85c5c01b-85a0-4714-81fe-31192c87b2fa" (UID: "85c5c01b-85a0-4714-81fe-31192c87b2fa"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.755638 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vj8c6\" (UniqueName: \"kubernetes.io/projected/8d89bfb9-3cdc-4e93-899a-361f5d5cf408-kube-api-access-vj8c6\") pod \"8d89bfb9-3cdc-4e93-899a-361f5d5cf408\" (UID: \"8d89bfb9-3cdc-4e93-899a-361f5d5cf408\") " Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.755729 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d89bfb9-3cdc-4e93-899a-361f5d5cf408-combined-ca-bundle\") pod \"8d89bfb9-3cdc-4e93-899a-361f5d5cf408\" (UID: \"8d89bfb9-3cdc-4e93-899a-361f5d5cf408\") " Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.755795 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e815c919-fbc1-4307-912e-13c74f996abb-scripts\") pod \"e815c919-fbc1-4307-912e-13c74f996abb\" (UID: \"e815c919-fbc1-4307-912e-13c74f996abb\") " Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.755833 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d89bfb9-3cdc-4e93-899a-361f5d5cf408-config-data\") pod \"8d89bfb9-3cdc-4e93-899a-361f5d5cf408\" (UID: \"8d89bfb9-3cdc-4e93-899a-361f5d5cf408\") " Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.755852 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e815c919-fbc1-4307-912e-13c74f996abb-combined-ca-bundle\") pod \"e815c919-fbc1-4307-912e-13c74f996abb\" (UID: \"e815c919-fbc1-4307-912e-13c74f996abb\") " Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.755930 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/e815c919-fbc1-4307-912e-13c74f996abb-fernet-keys\") pod \"e815c919-fbc1-4307-912e-13c74f996abb\" (UID: \"e815c919-fbc1-4307-912e-13c74f996abb\") " Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.755977 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bt9pm\" (UniqueName: \"kubernetes.io/projected/e815c919-fbc1-4307-912e-13c74f996abb-kube-api-access-bt9pm\") pod \"e815c919-fbc1-4307-912e-13c74f996abb\" (UID: \"e815c919-fbc1-4307-912e-13c74f996abb\") " Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.756013 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8d89bfb9-3cdc-4e93-899a-361f5d5cf408-db-sync-config-data\") pod \"8d89bfb9-3cdc-4e93-899a-361f5d5cf408\" (UID: \"8d89bfb9-3cdc-4e93-899a-361f5d5cf408\") " Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.756055 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e815c919-fbc1-4307-912e-13c74f996abb-config-data\") pod \"e815c919-fbc1-4307-912e-13c74f996abb\" (UID: \"e815c919-fbc1-4307-912e-13c74f996abb\") " Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.756092 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e815c919-fbc1-4307-912e-13c74f996abb-credential-keys\") pod \"e815c919-fbc1-4307-912e-13c74f996abb\" (UID: \"e815c919-fbc1-4307-912e-13c74f996abb\") " Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.756776 4779 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85c5c01b-85a0-4714-81fe-31192c87b2fa-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.756803 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bkh8\" 
(UniqueName: \"kubernetes.io/projected/85c5c01b-85a0-4714-81fe-31192c87b2fa-kube-api-access-5bkh8\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.756817 4779 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85c5c01b-85a0-4714-81fe-31192c87b2fa-logs\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.767401 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d89bfb9-3cdc-4e93-899a-361f5d5cf408-kube-api-access-vj8c6" (OuterVolumeSpecName: "kube-api-access-vj8c6") pod "8d89bfb9-3cdc-4e93-899a-361f5d5cf408" (UID: "8d89bfb9-3cdc-4e93-899a-361f5d5cf408"). InnerVolumeSpecName "kube-api-access-vj8c6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.769735 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e815c919-fbc1-4307-912e-13c74f996abb-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e815c919-fbc1-4307-912e-13c74f996abb" (UID: "e815c919-fbc1-4307-912e-13c74f996abb"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.769995 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e815c919-fbc1-4307-912e-13c74f996abb-kube-api-access-bt9pm" (OuterVolumeSpecName: "kube-api-access-bt9pm") pod "e815c919-fbc1-4307-912e-13c74f996abb" (UID: "e815c919-fbc1-4307-912e-13c74f996abb"). InnerVolumeSpecName "kube-api-access-bt9pm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.781393 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e815c919-fbc1-4307-912e-13c74f996abb-scripts" (OuterVolumeSpecName: "scripts") pod "e815c919-fbc1-4307-912e-13c74f996abb" (UID: "e815c919-fbc1-4307-912e-13c74f996abb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.792660 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e815c919-fbc1-4307-912e-13c74f996abb-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e815c919-fbc1-4307-912e-13c74f996abb" (UID: "e815c919-fbc1-4307-912e-13c74f996abb"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.817445 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d89bfb9-3cdc-4e93-899a-361f5d5cf408-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8d89bfb9-3cdc-4e93-899a-361f5d5cf408" (UID: "8d89bfb9-3cdc-4e93-899a-361f5d5cf408"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.837324 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85c5c01b-85a0-4714-81fe-31192c87b2fa-config-data" (OuterVolumeSpecName: "config-data") pod "85c5c01b-85a0-4714-81fe-31192c87b2fa" (UID: "85c5c01b-85a0-4714-81fe-31192c87b2fa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.851416 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e815c919-fbc1-4307-912e-13c74f996abb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e815c919-fbc1-4307-912e-13c74f996abb" (UID: "e815c919-fbc1-4307-912e-13c74f996abb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.867896 4779 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e815c919-fbc1-4307-912e-13c74f996abb-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.867924 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e815c919-fbc1-4307-912e-13c74f996abb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.867937 4779 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e815c919-fbc1-4307-912e-13c74f996abb-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.867949 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bt9pm\" (UniqueName: \"kubernetes.io/projected/e815c919-fbc1-4307-912e-13c74f996abb-kube-api-access-bt9pm\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.867960 4779 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8d89bfb9-3cdc-4e93-899a-361f5d5cf408-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.867971 4779 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/e815c919-fbc1-4307-912e-13c74f996abb-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.867982 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vj8c6\" (UniqueName: \"kubernetes.io/projected/8d89bfb9-3cdc-4e93-899a-361f5d5cf408-kube-api-access-vj8c6\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.867995 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85c5c01b-85a0-4714-81fe-31192c87b2fa-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.907031 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85c5c01b-85a0-4714-81fe-31192c87b2fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85c5c01b-85a0-4714-81fe-31192c87b2fa" (UID: "85c5c01b-85a0-4714-81fe-31192c87b2fa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.934542 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d89bfb9-3cdc-4e93-899a-361f5d5cf408-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d89bfb9-3cdc-4e93-899a-361f5d5cf408" (UID: "8d89bfb9-3cdc-4e93-899a-361f5d5cf408"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.953384 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e815c919-fbc1-4307-912e-13c74f996abb-config-data" (OuterVolumeSpecName: "config-data") pod "e815c919-fbc1-4307-912e-13c74f996abb" (UID: "e815c919-fbc1-4307-912e-13c74f996abb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.964313 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d89bfb9-3cdc-4e93-899a-361f5d5cf408-config-data" (OuterVolumeSpecName: "config-data") pod "8d89bfb9-3cdc-4e93-899a-361f5d5cf408" (UID: "8d89bfb9-3cdc-4e93-899a-361f5d5cf408"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.969906 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85c5c01b-85a0-4714-81fe-31192c87b2fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.969948 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d89bfb9-3cdc-4e93-899a-361f5d5cf408-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.969961 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d89bfb9-3cdc-4e93-899a-361f5d5cf408-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:27 crc kubenswrapper[4779]: I0320 15:45:27.969972 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e815c919-fbc1-4307-912e-13c74f996abb-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:28 crc kubenswrapper[4779]: I0320 15:45:28.184323 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mx4bn" event={"ID":"85c5c01b-85a0-4714-81fe-31192c87b2fa","Type":"ContainerDied","Data":"4c2d2afadba931988575afbb166a9557a6715ebdbc7c9fc6327aef3cdaac6a3b"} Mar 20 15:45:28 crc kubenswrapper[4779]: I0320 15:45:28.184379 4779 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="4c2d2afadba931988575afbb166a9557a6715ebdbc7c9fc6327aef3cdaac6a3b" Mar 20 15:45:28 crc kubenswrapper[4779]: I0320 15:45:28.184341 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-mx4bn" Mar 20 15:45:28 crc kubenswrapper[4779]: I0320 15:45:28.207369 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rh9gx" event={"ID":"e815c919-fbc1-4307-912e-13c74f996abb","Type":"ContainerDied","Data":"f83bd0651c19197fca12c4741ab388f4724d6e71d4d12f17267820301ac8f1a9"} Mar 20 15:45:28 crc kubenswrapper[4779]: I0320 15:45:28.207407 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f83bd0651c19197fca12c4741ab388f4724d6e71d4d12f17267820301ac8f1a9" Mar 20 15:45:28 crc kubenswrapper[4779]: I0320 15:45:28.207383 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rh9gx" Mar 20 15:45:28 crc kubenswrapper[4779]: I0320 15:45:28.222261 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-4qv4b" event={"ID":"8d89bfb9-3cdc-4e93-899a-361f5d5cf408","Type":"ContainerDied","Data":"8ce7ce2b52de27f567f1db2d0a5ed4eb4bd230d2691261adb46f17dad603abfb"} Mar 20 15:45:28 crc kubenswrapper[4779]: I0320 15:45:28.222305 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ce7ce2b52de27f567f1db2d0a5ed4eb4bd230d2691261adb46f17dad603abfb" Mar 20 15:45:28 crc kubenswrapper[4779]: I0320 15:45:28.222361 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-4qv4b" Mar 20 15:45:28 crc kubenswrapper[4779]: I0320 15:45:28.872663 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6878c9db6b-r5g2t"] Mar 20 15:45:28 crc kubenswrapper[4779]: E0320 15:45:28.873322 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e815c919-fbc1-4307-912e-13c74f996abb" containerName="keystone-bootstrap" Mar 20 15:45:28 crc kubenswrapper[4779]: I0320 15:45:28.873337 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="e815c919-fbc1-4307-912e-13c74f996abb" containerName="keystone-bootstrap" Mar 20 15:45:28 crc kubenswrapper[4779]: E0320 15:45:28.873347 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a850ec0-7410-472b-b4da-19f990f2187f" containerName="collect-profiles" Mar 20 15:45:28 crc kubenswrapper[4779]: I0320 15:45:28.873353 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a850ec0-7410-472b-b4da-19f990f2187f" containerName="collect-profiles" Mar 20 15:45:28 crc kubenswrapper[4779]: E0320 15:45:28.873366 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85c5c01b-85a0-4714-81fe-31192c87b2fa" containerName="placement-db-sync" Mar 20 15:45:28 crc kubenswrapper[4779]: I0320 15:45:28.873372 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="85c5c01b-85a0-4714-81fe-31192c87b2fa" containerName="placement-db-sync" Mar 20 15:45:28 crc kubenswrapper[4779]: E0320 15:45:28.873390 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d89bfb9-3cdc-4e93-899a-361f5d5cf408" containerName="watcher-db-sync" Mar 20 15:45:28 crc kubenswrapper[4779]: I0320 15:45:28.873397 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d89bfb9-3cdc-4e93-899a-361f5d5cf408" containerName="watcher-db-sync" Mar 20 15:45:28 crc kubenswrapper[4779]: I0320 15:45:28.873597 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="e815c919-fbc1-4307-912e-13c74f996abb" 
containerName="keystone-bootstrap" Mar 20 15:45:28 crc kubenswrapper[4779]: I0320 15:45:28.873615 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="85c5c01b-85a0-4714-81fe-31192c87b2fa" containerName="placement-db-sync" Mar 20 15:45:28 crc kubenswrapper[4779]: I0320 15:45:28.873631 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d89bfb9-3cdc-4e93-899a-361f5d5cf408" containerName="watcher-db-sync" Mar 20 15:45:28 crc kubenswrapper[4779]: I0320 15:45:28.873646 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a850ec0-7410-472b-b4da-19f990f2187f" containerName="collect-profiles" Mar 20 15:45:28 crc kubenswrapper[4779]: I0320 15:45:28.874229 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6878c9db6b-r5g2t" Mar 20 15:45:28 crc kubenswrapper[4779]: I0320 15:45:28.883051 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 20 15:45:28 crc kubenswrapper[4779]: I0320 15:45:28.883283 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 15:45:28 crc kubenswrapper[4779]: I0320 15:45:28.883477 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lrbdn" Mar 20 15:45:28 crc kubenswrapper[4779]: I0320 15:45:28.883505 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 15:45:28 crc kubenswrapper[4779]: I0320 15:45:28.883469 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 15:45:28 crc kubenswrapper[4779]: I0320 15:45:28.883660 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 20 15:45:28 crc kubenswrapper[4779]: I0320 15:45:28.899690 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6878c9db6b-r5g2t"] Mar 20 
15:45:28 crc kubenswrapper[4779]: I0320 15:45:28.980259 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6d97b75b9b-swgzv"] Mar 20 15:45:28 crc kubenswrapper[4779]: I0320 15:45:28.981635 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6d97b75b9b-swgzv" Mar 20 15:45:28 crc kubenswrapper[4779]: I0320 15:45:28.989030 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-92cdc" Mar 20 15:45:28 crc kubenswrapper[4779]: I0320 15:45:28.989472 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 20 15:45:28 crc kubenswrapper[4779]: I0320 15:45:28.989673 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 20 15:45:28 crc kubenswrapper[4779]: I0320 15:45:28.989827 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 20 15:45:28 crc kubenswrapper[4779]: I0320 15:45:28.990574 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.000002 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6d97b75b9b-swgzv"] Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.021356 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20f41ef6-17fa-4c90-a902-d2f5efdf45f5-scripts\") pod \"keystone-6878c9db6b-r5g2t\" (UID: \"20f41ef6-17fa-4c90-a902-d2f5efdf45f5\") " pod="openstack/keystone-6878c9db6b-r5g2t" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.021393 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/20f41ef6-17fa-4c90-a902-d2f5efdf45f5-public-tls-certs\") pod \"keystone-6878c9db6b-r5g2t\" (UID: \"20f41ef6-17fa-4c90-a902-d2f5efdf45f5\") " pod="openstack/keystone-6878c9db6b-r5g2t" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.021438 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/20f41ef6-17fa-4c90-a902-d2f5efdf45f5-internal-tls-certs\") pod \"keystone-6878c9db6b-r5g2t\" (UID: \"20f41ef6-17fa-4c90-a902-d2f5efdf45f5\") " pod="openstack/keystone-6878c9db6b-r5g2t" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.021484 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20f41ef6-17fa-4c90-a902-d2f5efdf45f5-config-data\") pod \"keystone-6878c9db6b-r5g2t\" (UID: \"20f41ef6-17fa-4c90-a902-d2f5efdf45f5\") " pod="openstack/keystone-6878c9db6b-r5g2t" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.021574 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/20f41ef6-17fa-4c90-a902-d2f5efdf45f5-credential-keys\") pod \"keystone-6878c9db6b-r5g2t\" (UID: \"20f41ef6-17fa-4c90-a902-d2f5efdf45f5\") " pod="openstack/keystone-6878c9db6b-r5g2t" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.021624 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/20f41ef6-17fa-4c90-a902-d2f5efdf45f5-fernet-keys\") pod \"keystone-6878c9db6b-r5g2t\" (UID: \"20f41ef6-17fa-4c90-a902-d2f5efdf45f5\") " pod="openstack/keystone-6878c9db6b-r5g2t" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.021644 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddxms\" 
(UniqueName: \"kubernetes.io/projected/20f41ef6-17fa-4c90-a902-d2f5efdf45f5-kube-api-access-ddxms\") pod \"keystone-6878c9db6b-r5g2t\" (UID: \"20f41ef6-17fa-4c90-a902-d2f5efdf45f5\") " pod="openstack/keystone-6878c9db6b-r5g2t" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.021673 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20f41ef6-17fa-4c90-a902-d2f5efdf45f5-combined-ca-bundle\") pod \"keystone-6878c9db6b-r5g2t\" (UID: \"20f41ef6-17fa-4c90-a902-d2f5efdf45f5\") " pod="openstack/keystone-6878c9db6b-r5g2t" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.093349 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.094546 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.099314 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-nzgjd" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.099497 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.126914 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83d44457-fd0e-462a-b293-65eec2238d7b-internal-tls-certs\") pod \"placement-6d97b75b9b-swgzv\" (UID: \"83d44457-fd0e-462a-b293-65eec2238d7b\") " pod="openstack/placement-6d97b75b9b-swgzv" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.126956 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83d44457-fd0e-462a-b293-65eec2238d7b-public-tls-certs\") pod 
\"placement-6d97b75b9b-swgzv\" (UID: \"83d44457-fd0e-462a-b293-65eec2238d7b\") " pod="openstack/placement-6d97b75b9b-swgzv" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.126999 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83d44457-fd0e-462a-b293-65eec2238d7b-config-data\") pod \"placement-6d97b75b9b-swgzv\" (UID: \"83d44457-fd0e-462a-b293-65eec2238d7b\") " pod="openstack/placement-6d97b75b9b-swgzv" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.127045 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/20f41ef6-17fa-4c90-a902-d2f5efdf45f5-credential-keys\") pod \"keystone-6878c9db6b-r5g2t\" (UID: \"20f41ef6-17fa-4c90-a902-d2f5efdf45f5\") " pod="openstack/keystone-6878c9db6b-r5g2t" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.128245 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83d44457-fd0e-462a-b293-65eec2238d7b-logs\") pod \"placement-6d97b75b9b-swgzv\" (UID: \"83d44457-fd0e-462a-b293-65eec2238d7b\") " pod="openstack/placement-6d97b75b9b-swgzv" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.128326 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/20f41ef6-17fa-4c90-a902-d2f5efdf45f5-fernet-keys\") pod \"keystone-6878c9db6b-r5g2t\" (UID: \"20f41ef6-17fa-4c90-a902-d2f5efdf45f5\") " pod="openstack/keystone-6878c9db6b-r5g2t" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.128356 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83d44457-fd0e-462a-b293-65eec2238d7b-combined-ca-bundle\") pod \"placement-6d97b75b9b-swgzv\" (UID: 
\"83d44457-fd0e-462a-b293-65eec2238d7b\") " pod="openstack/placement-6d97b75b9b-swgzv" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.128394 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddxms\" (UniqueName: \"kubernetes.io/projected/20f41ef6-17fa-4c90-a902-d2f5efdf45f5-kube-api-access-ddxms\") pod \"keystone-6878c9db6b-r5g2t\" (UID: \"20f41ef6-17fa-4c90-a902-d2f5efdf45f5\") " pod="openstack/keystone-6878c9db6b-r5g2t" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.128449 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20f41ef6-17fa-4c90-a902-d2f5efdf45f5-combined-ca-bundle\") pod \"keystone-6878c9db6b-r5g2t\" (UID: \"20f41ef6-17fa-4c90-a902-d2f5efdf45f5\") " pod="openstack/keystone-6878c9db6b-r5g2t" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.128527 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83d44457-fd0e-462a-b293-65eec2238d7b-scripts\") pod \"placement-6d97b75b9b-swgzv\" (UID: \"83d44457-fd0e-462a-b293-65eec2238d7b\") " pod="openstack/placement-6d97b75b9b-swgzv" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.128591 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20f41ef6-17fa-4c90-a902-d2f5efdf45f5-scripts\") pod \"keystone-6878c9db6b-r5g2t\" (UID: \"20f41ef6-17fa-4c90-a902-d2f5efdf45f5\") " pod="openstack/keystone-6878c9db6b-r5g2t" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.128614 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/20f41ef6-17fa-4c90-a902-d2f5efdf45f5-public-tls-certs\") pod \"keystone-6878c9db6b-r5g2t\" (UID: \"20f41ef6-17fa-4c90-a902-d2f5efdf45f5\") " 
pod="openstack/keystone-6878c9db6b-r5g2t" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.128681 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6fb8\" (UniqueName: \"kubernetes.io/projected/83d44457-fd0e-462a-b293-65eec2238d7b-kube-api-access-d6fb8\") pod \"placement-6d97b75b9b-swgzv\" (UID: \"83d44457-fd0e-462a-b293-65eec2238d7b\") " pod="openstack/placement-6d97b75b9b-swgzv" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.128711 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/20f41ef6-17fa-4c90-a902-d2f5efdf45f5-internal-tls-certs\") pod \"keystone-6878c9db6b-r5g2t\" (UID: \"20f41ef6-17fa-4c90-a902-d2f5efdf45f5\") " pod="openstack/keystone-6878c9db6b-r5g2t" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.128773 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20f41ef6-17fa-4c90-a902-d2f5efdf45f5-config-data\") pod \"keystone-6878c9db6b-r5g2t\" (UID: \"20f41ef6-17fa-4c90-a902-d2f5efdf45f5\") " pod="openstack/keystone-6878c9db6b-r5g2t" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.140965 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/20f41ef6-17fa-4c90-a902-d2f5efdf45f5-credential-keys\") pod \"keystone-6878c9db6b-r5g2t\" (UID: \"20f41ef6-17fa-4c90-a902-d2f5efdf45f5\") " pod="openstack/keystone-6878c9db6b-r5g2t" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.142638 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/20f41ef6-17fa-4c90-a902-d2f5efdf45f5-public-tls-certs\") pod \"keystone-6878c9db6b-r5g2t\" (UID: \"20f41ef6-17fa-4c90-a902-d2f5efdf45f5\") " pod="openstack/keystone-6878c9db6b-r5g2t" Mar 20 15:45:29 crc 
kubenswrapper[4779]: I0320 15:45:29.143012 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20f41ef6-17fa-4c90-a902-d2f5efdf45f5-config-data\") pod \"keystone-6878c9db6b-r5g2t\" (UID: \"20f41ef6-17fa-4c90-a902-d2f5efdf45f5\") " pod="openstack/keystone-6878c9db6b-r5g2t" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.150992 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/20f41ef6-17fa-4c90-a902-d2f5efdf45f5-fernet-keys\") pod \"keystone-6878c9db6b-r5g2t\" (UID: \"20f41ef6-17fa-4c90-a902-d2f5efdf45f5\") " pod="openstack/keystone-6878c9db6b-r5g2t" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.156534 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20f41ef6-17fa-4c90-a902-d2f5efdf45f5-combined-ca-bundle\") pod \"keystone-6878c9db6b-r5g2t\" (UID: \"20f41ef6-17fa-4c90-a902-d2f5efdf45f5\") " pod="openstack/keystone-6878c9db6b-r5g2t" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.163477 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20f41ef6-17fa-4c90-a902-d2f5efdf45f5-scripts\") pod \"keystone-6878c9db6b-r5g2t\" (UID: \"20f41ef6-17fa-4c90-a902-d2f5efdf45f5\") " pod="openstack/keystone-6878c9db6b-r5g2t" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.170402 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/20f41ef6-17fa-4c90-a902-d2f5efdf45f5-internal-tls-certs\") pod \"keystone-6878c9db6b-r5g2t\" (UID: \"20f41ef6-17fa-4c90-a902-d2f5efdf45f5\") " pod="openstack/keystone-6878c9db6b-r5g2t" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.192335 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Mar 20 15:45:29 crc 
kubenswrapper[4779]: I0320 15:45:29.203367 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddxms\" (UniqueName: \"kubernetes.io/projected/20f41ef6-17fa-4c90-a902-d2f5efdf45f5-kube-api-access-ddxms\") pod \"keystone-6878c9db6b-r5g2t\" (UID: \"20f41ef6-17fa-4c90-a902-d2f5efdf45f5\") " pod="openstack/keystone-6878c9db6b-r5g2t" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.209851 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.216954 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.230172 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83d44457-fd0e-462a-b293-65eec2238d7b-combined-ca-bundle\") pod \"placement-6d97b75b9b-swgzv\" (UID: \"83d44457-fd0e-462a-b293-65eec2238d7b\") " pod="openstack/placement-6d97b75b9b-swgzv" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.230264 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83d44457-fd0e-462a-b293-65eec2238d7b-scripts\") pod \"placement-6d97b75b9b-swgzv\" (UID: \"83d44457-fd0e-462a-b293-65eec2238d7b\") " pod="openstack/placement-6d97b75b9b-swgzv" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.230315 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3d25ade-4950-40ca-8481-f52c063ff998-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"b3d25ade-4950-40ca-8481-f52c063ff998\") " pod="openstack/watcher-applier-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.230361 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-d6fb8\" (UniqueName: \"kubernetes.io/projected/83d44457-fd0e-462a-b293-65eec2238d7b-kube-api-access-d6fb8\") pod \"placement-6d97b75b9b-swgzv\" (UID: \"83d44457-fd0e-462a-b293-65eec2238d7b\") " pod="openstack/placement-6d97b75b9b-swgzv" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.230403 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3d25ade-4950-40ca-8481-f52c063ff998-logs\") pod \"watcher-applier-0\" (UID: \"b3d25ade-4950-40ca-8481-f52c063ff998\") " pod="openstack/watcher-applier-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.230443 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqrg7\" (UniqueName: \"kubernetes.io/projected/b3d25ade-4950-40ca-8481-f52c063ff998-kube-api-access-kqrg7\") pod \"watcher-applier-0\" (UID: \"b3d25ade-4950-40ca-8481-f52c063ff998\") " pod="openstack/watcher-applier-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.230483 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83d44457-fd0e-462a-b293-65eec2238d7b-internal-tls-certs\") pod \"placement-6d97b75b9b-swgzv\" (UID: \"83d44457-fd0e-462a-b293-65eec2238d7b\") " pod="openstack/placement-6d97b75b9b-swgzv" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.230515 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83d44457-fd0e-462a-b293-65eec2238d7b-public-tls-certs\") pod \"placement-6d97b75b9b-swgzv\" (UID: \"83d44457-fd0e-462a-b293-65eec2238d7b\") " pod="openstack/placement-6d97b75b9b-swgzv" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.230571 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/83d44457-fd0e-462a-b293-65eec2238d7b-config-data\") pod \"placement-6d97b75b9b-swgzv\" (UID: \"83d44457-fd0e-462a-b293-65eec2238d7b\") " pod="openstack/placement-6d97b75b9b-swgzv" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.230612 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3d25ade-4950-40ca-8481-f52c063ff998-config-data\") pod \"watcher-applier-0\" (UID: \"b3d25ade-4950-40ca-8481-f52c063ff998\") " pod="openstack/watcher-applier-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.230649 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83d44457-fd0e-462a-b293-65eec2238d7b-logs\") pod \"placement-6d97b75b9b-swgzv\" (UID: \"83d44457-fd0e-462a-b293-65eec2238d7b\") " pod="openstack/placement-6d97b75b9b-swgzv" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.231091 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83d44457-fd0e-462a-b293-65eec2238d7b-logs\") pod \"placement-6d97b75b9b-swgzv\" (UID: \"83d44457-fd0e-462a-b293-65eec2238d7b\") " pod="openstack/placement-6d97b75b9b-swgzv" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.236905 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83d44457-fd0e-462a-b293-65eec2238d7b-scripts\") pod \"placement-6d97b75b9b-swgzv\" (UID: \"83d44457-fd0e-462a-b293-65eec2238d7b\") " pod="openstack/placement-6d97b75b9b-swgzv" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.238388 4779 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.238419 4779 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 
15:45:29.242514 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83d44457-fd0e-462a-b293-65eec2238d7b-combined-ca-bundle\") pod \"placement-6d97b75b9b-swgzv\" (UID: \"83d44457-fd0e-462a-b293-65eec2238d7b\") " pod="openstack/placement-6d97b75b9b-swgzv" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.242998 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83d44457-fd0e-462a-b293-65eec2238d7b-public-tls-certs\") pod \"placement-6d97b75b9b-swgzv\" (UID: \"83d44457-fd0e-462a-b293-65eec2238d7b\") " pod="openstack/placement-6d97b75b9b-swgzv" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.252949 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83d44457-fd0e-462a-b293-65eec2238d7b-config-data\") pod \"placement-6d97b75b9b-swgzv\" (UID: \"83d44457-fd0e-462a-b293-65eec2238d7b\") " pod="openstack/placement-6d97b75b9b-swgzv" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.258166 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.278597 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83d44457-fd0e-462a-b293-65eec2238d7b-internal-tls-certs\") pod \"placement-6d97b75b9b-swgzv\" (UID: \"83d44457-fd0e-462a-b293-65eec2238d7b\") " pod="openstack/placement-6d97b75b9b-swgzv" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.289938 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6fb8\" (UniqueName: \"kubernetes.io/projected/83d44457-fd0e-462a-b293-65eec2238d7b-kube-api-access-d6fb8\") pod \"placement-6d97b75b9b-swgzv\" (UID: \"83d44457-fd0e-462a-b293-65eec2238d7b\") " 
pod="openstack/placement-6d97b75b9b-swgzv" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.311186 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.311629 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6d97b75b9b-swgzv" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.332243 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3d25ade-4950-40ca-8481-f52c063ff998-logs\") pod \"watcher-applier-0\" (UID: \"b3d25ade-4950-40ca-8481-f52c063ff998\") " pod="openstack/watcher-applier-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.332335 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqrg7\" (UniqueName: \"kubernetes.io/projected/b3d25ade-4950-40ca-8481-f52c063ff998-kube-api-access-kqrg7\") pod \"watcher-applier-0\" (UID: \"b3d25ade-4950-40ca-8481-f52c063ff998\") " pod="openstack/watcher-applier-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.332381 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84bd8c95-b533-46ea-b1e2-3a53ddaec70c-config-data\") pod \"watcher-api-0\" (UID: \"84bd8c95-b533-46ea-b1e2-3a53ddaec70c\") " pod="openstack/watcher-api-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.332468 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4sv8\" (UniqueName: \"kubernetes.io/projected/84bd8c95-b533-46ea-b1e2-3a53ddaec70c-kube-api-access-v4sv8\") pod \"watcher-api-0\" (UID: \"84bd8c95-b533-46ea-b1e2-3a53ddaec70c\") " pod="openstack/watcher-api-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.332488 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3d25ade-4950-40ca-8481-f52c063ff998-config-data\") pod \"watcher-applier-0\" (UID: \"b3d25ade-4950-40ca-8481-f52c063ff998\") " pod="openstack/watcher-applier-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.332529 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/84bd8c95-b533-46ea-b1e2-3a53ddaec70c-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"84bd8c95-b533-46ea-b1e2-3a53ddaec70c\") " pod="openstack/watcher-api-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.332573 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84bd8c95-b533-46ea-b1e2-3a53ddaec70c-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"84bd8c95-b533-46ea-b1e2-3a53ddaec70c\") " pod="openstack/watcher-api-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.332609 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84bd8c95-b533-46ea-b1e2-3a53ddaec70c-logs\") pod \"watcher-api-0\" (UID: \"84bd8c95-b533-46ea-b1e2-3a53ddaec70c\") " pod="openstack/watcher-api-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.332640 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3d25ade-4950-40ca-8481-f52c063ff998-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"b3d25ade-4950-40ca-8481-f52c063ff998\") " pod="openstack/watcher-applier-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.333870 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3d25ade-4950-40ca-8481-f52c063ff998-logs\") pod \"watcher-applier-0\" (UID: 
\"b3d25ade-4950-40ca-8481-f52c063ff998\") " pod="openstack/watcher-applier-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.338879 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3d25ade-4950-40ca-8481-f52c063ff998-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"b3d25ade-4950-40ca-8481-f52c063ff998\") " pod="openstack/watcher-applier-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.340702 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3d25ade-4950-40ca-8481-f52c063ff998-config-data\") pod \"watcher-applier-0\" (UID: \"b3d25ade-4950-40ca-8481-f52c063ff998\") " pod="openstack/watcher-applier-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.368901 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqrg7\" (UniqueName: \"kubernetes.io/projected/b3d25ade-4950-40ca-8481-f52c063ff998-kube-api-access-kqrg7\") pod \"watcher-applier-0\" (UID: \"b3d25ade-4950-40ca-8481-f52c063ff998\") " pod="openstack/watcher-applier-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.382120 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.384711 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.388632 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.426563 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.434394 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84bd8c95-b533-46ea-b1e2-3a53ddaec70c-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"84bd8c95-b533-46ea-b1e2-3a53ddaec70c\") " pod="openstack/watcher-api-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.434475 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84bd8c95-b533-46ea-b1e2-3a53ddaec70c-logs\") pod \"watcher-api-0\" (UID: \"84bd8c95-b533-46ea-b1e2-3a53ddaec70c\") " pod="openstack/watcher-api-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.434583 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84bd8c95-b533-46ea-b1e2-3a53ddaec70c-config-data\") pod \"watcher-api-0\" (UID: \"84bd8c95-b533-46ea-b1e2-3a53ddaec70c\") " pod="openstack/watcher-api-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.434669 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4sv8\" (UniqueName: \"kubernetes.io/projected/84bd8c95-b533-46ea-b1e2-3a53ddaec70c-kube-api-access-v4sv8\") pod \"watcher-api-0\" (UID: \"84bd8c95-b533-46ea-b1e2-3a53ddaec70c\") " pod="openstack/watcher-api-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.434717 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/84bd8c95-b533-46ea-b1e2-3a53ddaec70c-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"84bd8c95-b533-46ea-b1e2-3a53ddaec70c\") " pod="openstack/watcher-api-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.437300 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84bd8c95-b533-46ea-b1e2-3a53ddaec70c-logs\") pod \"watcher-api-0\" (UID: \"84bd8c95-b533-46ea-b1e2-3a53ddaec70c\") " pod="openstack/watcher-api-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.439926 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84bd8c95-b533-46ea-b1e2-3a53ddaec70c-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"84bd8c95-b533-46ea-b1e2-3a53ddaec70c\") " pod="openstack/watcher-api-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.440593 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.440990 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84bd8c95-b533-46ea-b1e2-3a53ddaec70c-config-data\") pod \"watcher-api-0\" (UID: \"84bd8c95-b533-46ea-b1e2-3a53ddaec70c\") " pod="openstack/watcher-api-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.444741 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/84bd8c95-b533-46ea-b1e2-3a53ddaec70c-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"84bd8c95-b533-46ea-b1e2-3a53ddaec70c\") " pod="openstack/watcher-api-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.462347 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4sv8\" (UniqueName: \"kubernetes.io/projected/84bd8c95-b533-46ea-b1e2-3a53ddaec70c-kube-api-access-v4sv8\") pod \"watcher-api-0\" (UID: \"84bd8c95-b533-46ea-b1e2-3a53ddaec70c\") " pod="openstack/watcher-api-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.500657 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6878c9db6b-r5g2t" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.501704 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7c5d7688b4-46tq6"] Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.501927 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7c5d7688b4-46tq6" podUID="1f3be8d3-51e7-4a9b-9409-9aa999876aa3" containerName="neutron-api" containerID="cri-o://70c09986161a5c03082e9fbc1128733b54c8ac13ec5de237bf517a1b5110f3a4" gracePeriod=30 Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.502379 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7c5d7688b4-46tq6" podUID="1f3be8d3-51e7-4a9b-9409-9aa999876aa3" containerName="neutron-httpd" containerID="cri-o://42c82203c403e6671f0875f5254ea8223ba8f882f95bbb3e6395e2190134b8dc" gracePeriod=30 Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.520569 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-7c5d7688b4-46tq6" podUID="1f3be8d3-51e7-4a9b-9409-9aa999876aa3" containerName="neutron-httpd" probeResult="failure" output="Get \"http://10.217.0.168:9696/\": EOF" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.525972 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-854c6767b7-nlddt"] Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.527537 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-854c6767b7-nlddt" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.538744 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dll6k\" (UniqueName: \"kubernetes.io/projected/8b98145a-1310-4f32-9b9e-5824c7b793ef-kube-api-access-dll6k\") pod \"watcher-decision-engine-0\" (UID: \"8b98145a-1310-4f32-9b9e-5824c7b793ef\") " pod="openstack/watcher-decision-engine-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.538792 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b98145a-1310-4f32-9b9e-5824c7b793ef-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"8b98145a-1310-4f32-9b9e-5824c7b793ef\") " pod="openstack/watcher-decision-engine-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.538901 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b98145a-1310-4f32-9b9e-5824c7b793ef-config-data\") pod \"watcher-decision-engine-0\" (UID: \"8b98145a-1310-4f32-9b9e-5824c7b793ef\") " pod="openstack/watcher-decision-engine-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.538918 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b98145a-1310-4f32-9b9e-5824c7b793ef-logs\") pod \"watcher-decision-engine-0\" (UID: \"8b98145a-1310-4f32-9b9e-5824c7b793ef\") " pod="openstack/watcher-decision-engine-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.538974 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8b98145a-1310-4f32-9b9e-5824c7b793ef-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: 
\"8b98145a-1310-4f32-9b9e-5824c7b793ef\") " pod="openstack/watcher-decision-engine-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.548882 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-854c6767b7-nlddt"] Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.640842 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b98145a-1310-4f32-9b9e-5824c7b793ef-config-data\") pod \"watcher-decision-engine-0\" (UID: \"8b98145a-1310-4f32-9b9e-5824c7b793ef\") " pod="openstack/watcher-decision-engine-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.640891 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b98145a-1310-4f32-9b9e-5824c7b793ef-logs\") pod \"watcher-decision-engine-0\" (UID: \"8b98145a-1310-4f32-9b9e-5824c7b793ef\") " pod="openstack/watcher-decision-engine-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.641309 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8707035-1583-426d-aa57-b5a23e7403b3-combined-ca-bundle\") pod \"neutron-854c6767b7-nlddt\" (UID: \"b8707035-1583-426d-aa57-b5a23e7403b3\") " pod="openstack/neutron-854c6767b7-nlddt" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.641368 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b8707035-1583-426d-aa57-b5a23e7403b3-config\") pod \"neutron-854c6767b7-nlddt\" (UID: \"b8707035-1583-426d-aa57-b5a23e7403b3\") " pod="openstack/neutron-854c6767b7-nlddt" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.641417 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: 
\"kubernetes.io/secret/8b98145a-1310-4f32-9b9e-5824c7b793ef-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"8b98145a-1310-4f32-9b9e-5824c7b793ef\") " pod="openstack/watcher-decision-engine-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.641487 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8707035-1583-426d-aa57-b5a23e7403b3-ovndb-tls-certs\") pod \"neutron-854c6767b7-nlddt\" (UID: \"b8707035-1583-426d-aa57-b5a23e7403b3\") " pod="openstack/neutron-854c6767b7-nlddt" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.641539 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5ptc\" (UniqueName: \"kubernetes.io/projected/b8707035-1583-426d-aa57-b5a23e7403b3-kube-api-access-t5ptc\") pod \"neutron-854c6767b7-nlddt\" (UID: \"b8707035-1583-426d-aa57-b5a23e7403b3\") " pod="openstack/neutron-854c6767b7-nlddt" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.641569 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8707035-1583-426d-aa57-b5a23e7403b3-internal-tls-certs\") pod \"neutron-854c6767b7-nlddt\" (UID: \"b8707035-1583-426d-aa57-b5a23e7403b3\") " pod="openstack/neutron-854c6767b7-nlddt" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.641625 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b8707035-1583-426d-aa57-b5a23e7403b3-httpd-config\") pod \"neutron-854c6767b7-nlddt\" (UID: \"b8707035-1583-426d-aa57-b5a23e7403b3\") " pod="openstack/neutron-854c6767b7-nlddt" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.641662 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/8b98145a-1310-4f32-9b9e-5824c7b793ef-logs\") pod \"watcher-decision-engine-0\" (UID: \"8b98145a-1310-4f32-9b9e-5824c7b793ef\") " pod="openstack/watcher-decision-engine-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.641683 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dll6k\" (UniqueName: \"kubernetes.io/projected/8b98145a-1310-4f32-9b9e-5824c7b793ef-kube-api-access-dll6k\") pod \"watcher-decision-engine-0\" (UID: \"8b98145a-1310-4f32-9b9e-5824c7b793ef\") " pod="openstack/watcher-decision-engine-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.641782 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b98145a-1310-4f32-9b9e-5824c7b793ef-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"8b98145a-1310-4f32-9b9e-5824c7b793ef\") " pod="openstack/watcher-decision-engine-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.641870 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8707035-1583-426d-aa57-b5a23e7403b3-public-tls-certs\") pod \"neutron-854c6767b7-nlddt\" (UID: \"b8707035-1583-426d-aa57-b5a23e7403b3\") " pod="openstack/neutron-854c6767b7-nlddt" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.645611 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8b98145a-1310-4f32-9b9e-5824c7b793ef-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"8b98145a-1310-4f32-9b9e-5824c7b793ef\") " pod="openstack/watcher-decision-engine-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.645679 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8b98145a-1310-4f32-9b9e-5824c7b793ef-config-data\") pod \"watcher-decision-engine-0\" (UID: \"8b98145a-1310-4f32-9b9e-5824c7b793ef\") " pod="openstack/watcher-decision-engine-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.646777 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b98145a-1310-4f32-9b9e-5824c7b793ef-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"8b98145a-1310-4f32-9b9e-5824c7b793ef\") " pod="openstack/watcher-decision-engine-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.662770 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dll6k\" (UniqueName: \"kubernetes.io/projected/8b98145a-1310-4f32-9b9e-5824c7b793ef-kube-api-access-dll6k\") pod \"watcher-decision-engine-0\" (UID: \"8b98145a-1310-4f32-9b9e-5824c7b793ef\") " pod="openstack/watcher-decision-engine-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.700626 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.717484 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.743494 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8707035-1583-426d-aa57-b5a23e7403b3-public-tls-certs\") pod \"neutron-854c6767b7-nlddt\" (UID: \"b8707035-1583-426d-aa57-b5a23e7403b3\") " pod="openstack/neutron-854c6767b7-nlddt" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.743605 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8707035-1583-426d-aa57-b5a23e7403b3-combined-ca-bundle\") pod \"neutron-854c6767b7-nlddt\" (UID: \"b8707035-1583-426d-aa57-b5a23e7403b3\") " pod="openstack/neutron-854c6767b7-nlddt" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.743642 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b8707035-1583-426d-aa57-b5a23e7403b3-config\") pod \"neutron-854c6767b7-nlddt\" (UID: \"b8707035-1583-426d-aa57-b5a23e7403b3\") " pod="openstack/neutron-854c6767b7-nlddt" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.743718 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8707035-1583-426d-aa57-b5a23e7403b3-ovndb-tls-certs\") pod \"neutron-854c6767b7-nlddt\" (UID: \"b8707035-1583-426d-aa57-b5a23e7403b3\") " pod="openstack/neutron-854c6767b7-nlddt" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.743754 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5ptc\" (UniqueName: \"kubernetes.io/projected/b8707035-1583-426d-aa57-b5a23e7403b3-kube-api-access-t5ptc\") pod \"neutron-854c6767b7-nlddt\" (UID: \"b8707035-1583-426d-aa57-b5a23e7403b3\") " pod="openstack/neutron-854c6767b7-nlddt" Mar 20 15:45:29 crc 
kubenswrapper[4779]: I0320 15:45:29.743788 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8707035-1583-426d-aa57-b5a23e7403b3-internal-tls-certs\") pod \"neutron-854c6767b7-nlddt\" (UID: \"b8707035-1583-426d-aa57-b5a23e7403b3\") " pod="openstack/neutron-854c6767b7-nlddt" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.743829 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b8707035-1583-426d-aa57-b5a23e7403b3-httpd-config\") pod \"neutron-854c6767b7-nlddt\" (UID: \"b8707035-1583-426d-aa57-b5a23e7403b3\") " pod="openstack/neutron-854c6767b7-nlddt" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.747867 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b8707035-1583-426d-aa57-b5a23e7403b3-config\") pod \"neutron-854c6767b7-nlddt\" (UID: \"b8707035-1583-426d-aa57-b5a23e7403b3\") " pod="openstack/neutron-854c6767b7-nlddt" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.747905 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b8707035-1583-426d-aa57-b5a23e7403b3-httpd-config\") pod \"neutron-854c6767b7-nlddt\" (UID: \"b8707035-1583-426d-aa57-b5a23e7403b3\") " pod="openstack/neutron-854c6767b7-nlddt" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.749932 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8707035-1583-426d-aa57-b5a23e7403b3-ovndb-tls-certs\") pod \"neutron-854c6767b7-nlddt\" (UID: \"b8707035-1583-426d-aa57-b5a23e7403b3\") " pod="openstack/neutron-854c6767b7-nlddt" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.752950 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/b8707035-1583-426d-aa57-b5a23e7403b3-public-tls-certs\") pod \"neutron-854c6767b7-nlddt\" (UID: \"b8707035-1583-426d-aa57-b5a23e7403b3\") " pod="openstack/neutron-854c6767b7-nlddt" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.759362 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8707035-1583-426d-aa57-b5a23e7403b3-combined-ca-bundle\") pod \"neutron-854c6767b7-nlddt\" (UID: \"b8707035-1583-426d-aa57-b5a23e7403b3\") " pod="openstack/neutron-854c6767b7-nlddt" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.763703 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8707035-1583-426d-aa57-b5a23e7403b3-internal-tls-certs\") pod \"neutron-854c6767b7-nlddt\" (UID: \"b8707035-1583-426d-aa57-b5a23e7403b3\") " pod="openstack/neutron-854c6767b7-nlddt" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.782452 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5ptc\" (UniqueName: \"kubernetes.io/projected/b8707035-1583-426d-aa57-b5a23e7403b3-kube-api-access-t5ptc\") pod \"neutron-854c6767b7-nlddt\" (UID: \"b8707035-1583-426d-aa57-b5a23e7403b3\") " pod="openstack/neutron-854c6767b7-nlddt" Mar 20 15:45:29 crc kubenswrapper[4779]: I0320 15:45:29.881494 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-854c6767b7-nlddt" Mar 20 15:45:30 crc kubenswrapper[4779]: I0320 15:45:30.883738 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 15:45:30 crc kubenswrapper[4779]: I0320 15:45:30.884246 4779 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 15:45:30 crc kubenswrapper[4779]: I0320 15:45:30.950079 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 15:45:30 crc kubenswrapper[4779]: I0320 15:45:30.950216 4779 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 15:45:31 crc kubenswrapper[4779]: I0320 15:45:31.060606 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 15:45:31 crc kubenswrapper[4779]: I0320 15:45:31.095185 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 15:45:31 crc kubenswrapper[4779]: I0320 15:45:31.167115 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-c69fcdb9d-ntwph"] Mar 20 15:45:31 crc kubenswrapper[4779]: I0320 15:45:31.168651 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-c69fcdb9d-ntwph" Mar 20 15:45:31 crc kubenswrapper[4779]: I0320 15:45:31.201707 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c69fcdb9d-ntwph"] Mar 20 15:45:31 crc kubenswrapper[4779]: I0320 15:45:31.309386 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64425ab-a231-4dd1-818e-156d70a0864d-combined-ca-bundle\") pod \"placement-c69fcdb9d-ntwph\" (UID: \"a64425ab-a231-4dd1-818e-156d70a0864d\") " pod="openstack/placement-c69fcdb9d-ntwph" Mar 20 15:45:31 crc kubenswrapper[4779]: I0320 15:45:31.309746 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4w6q\" (UniqueName: \"kubernetes.io/projected/a64425ab-a231-4dd1-818e-156d70a0864d-kube-api-access-k4w6q\") pod \"placement-c69fcdb9d-ntwph\" (UID: \"a64425ab-a231-4dd1-818e-156d70a0864d\") " pod="openstack/placement-c69fcdb9d-ntwph" Mar 20 15:45:31 crc kubenswrapper[4779]: I0320 15:45:31.309795 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a64425ab-a231-4dd1-818e-156d70a0864d-config-data\") pod \"placement-c69fcdb9d-ntwph\" (UID: \"a64425ab-a231-4dd1-818e-156d70a0864d\") " pod="openstack/placement-c69fcdb9d-ntwph" Mar 20 15:45:31 crc kubenswrapper[4779]: I0320 15:45:31.309816 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a64425ab-a231-4dd1-818e-156d70a0864d-public-tls-certs\") pod \"placement-c69fcdb9d-ntwph\" (UID: \"a64425ab-a231-4dd1-818e-156d70a0864d\") " pod="openstack/placement-c69fcdb9d-ntwph" Mar 20 15:45:31 crc kubenswrapper[4779]: I0320 15:45:31.309867 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a64425ab-a231-4dd1-818e-156d70a0864d-logs\") pod \"placement-c69fcdb9d-ntwph\" (UID: \"a64425ab-a231-4dd1-818e-156d70a0864d\") " pod="openstack/placement-c69fcdb9d-ntwph" Mar 20 15:45:31 crc kubenswrapper[4779]: I0320 15:45:31.309927 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a64425ab-a231-4dd1-818e-156d70a0864d-internal-tls-certs\") pod \"placement-c69fcdb9d-ntwph\" (UID: \"a64425ab-a231-4dd1-818e-156d70a0864d\") " pod="openstack/placement-c69fcdb9d-ntwph" Mar 20 15:45:31 crc kubenswrapper[4779]: I0320 15:45:31.310863 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a64425ab-a231-4dd1-818e-156d70a0864d-scripts\") pod \"placement-c69fcdb9d-ntwph\" (UID: \"a64425ab-a231-4dd1-818e-156d70a0864d\") " pod="openstack/placement-c69fcdb9d-ntwph" Mar 20 15:45:31 crc kubenswrapper[4779]: I0320 15:45:31.412547 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a64425ab-a231-4dd1-818e-156d70a0864d-logs\") pod \"placement-c69fcdb9d-ntwph\" (UID: \"a64425ab-a231-4dd1-818e-156d70a0864d\") " pod="openstack/placement-c69fcdb9d-ntwph" Mar 20 15:45:31 crc kubenswrapper[4779]: I0320 15:45:31.412627 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a64425ab-a231-4dd1-818e-156d70a0864d-internal-tls-certs\") pod \"placement-c69fcdb9d-ntwph\" (UID: \"a64425ab-a231-4dd1-818e-156d70a0864d\") " pod="openstack/placement-c69fcdb9d-ntwph" Mar 20 15:45:31 crc kubenswrapper[4779]: I0320 15:45:31.412672 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a64425ab-a231-4dd1-818e-156d70a0864d-scripts\") pod \"placement-c69fcdb9d-ntwph\" (UID: \"a64425ab-a231-4dd1-818e-156d70a0864d\") " pod="openstack/placement-c69fcdb9d-ntwph" Mar 20 15:45:31 crc kubenswrapper[4779]: I0320 15:45:31.412845 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64425ab-a231-4dd1-818e-156d70a0864d-combined-ca-bundle\") pod \"placement-c69fcdb9d-ntwph\" (UID: \"a64425ab-a231-4dd1-818e-156d70a0864d\") " pod="openstack/placement-c69fcdb9d-ntwph" Mar 20 15:45:31 crc kubenswrapper[4779]: I0320 15:45:31.412871 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4w6q\" (UniqueName: \"kubernetes.io/projected/a64425ab-a231-4dd1-818e-156d70a0864d-kube-api-access-k4w6q\") pod \"placement-c69fcdb9d-ntwph\" (UID: \"a64425ab-a231-4dd1-818e-156d70a0864d\") " pod="openstack/placement-c69fcdb9d-ntwph" Mar 20 15:45:31 crc kubenswrapper[4779]: I0320 15:45:31.412897 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a64425ab-a231-4dd1-818e-156d70a0864d-config-data\") pod \"placement-c69fcdb9d-ntwph\" (UID: \"a64425ab-a231-4dd1-818e-156d70a0864d\") " pod="openstack/placement-c69fcdb9d-ntwph" Mar 20 15:45:31 crc kubenswrapper[4779]: I0320 15:45:31.412916 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a64425ab-a231-4dd1-818e-156d70a0864d-public-tls-certs\") pod \"placement-c69fcdb9d-ntwph\" (UID: \"a64425ab-a231-4dd1-818e-156d70a0864d\") " pod="openstack/placement-c69fcdb9d-ntwph" Mar 20 15:45:31 crc kubenswrapper[4779]: I0320 15:45:31.416764 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a64425ab-a231-4dd1-818e-156d70a0864d-logs\") pod 
\"placement-c69fcdb9d-ntwph\" (UID: \"a64425ab-a231-4dd1-818e-156d70a0864d\") " pod="openstack/placement-c69fcdb9d-ntwph" Mar 20 15:45:31 crc kubenswrapper[4779]: I0320 15:45:31.419564 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a64425ab-a231-4dd1-818e-156d70a0864d-scripts\") pod \"placement-c69fcdb9d-ntwph\" (UID: \"a64425ab-a231-4dd1-818e-156d70a0864d\") " pod="openstack/placement-c69fcdb9d-ntwph" Mar 20 15:45:31 crc kubenswrapper[4779]: I0320 15:45:31.419946 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64425ab-a231-4dd1-818e-156d70a0864d-combined-ca-bundle\") pod \"placement-c69fcdb9d-ntwph\" (UID: \"a64425ab-a231-4dd1-818e-156d70a0864d\") " pod="openstack/placement-c69fcdb9d-ntwph" Mar 20 15:45:31 crc kubenswrapper[4779]: I0320 15:45:31.420128 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a64425ab-a231-4dd1-818e-156d70a0864d-config-data\") pod \"placement-c69fcdb9d-ntwph\" (UID: \"a64425ab-a231-4dd1-818e-156d70a0864d\") " pod="openstack/placement-c69fcdb9d-ntwph" Mar 20 15:45:31 crc kubenswrapper[4779]: I0320 15:45:31.428711 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a64425ab-a231-4dd1-818e-156d70a0864d-internal-tls-certs\") pod \"placement-c69fcdb9d-ntwph\" (UID: \"a64425ab-a231-4dd1-818e-156d70a0864d\") " pod="openstack/placement-c69fcdb9d-ntwph" Mar 20 15:45:31 crc kubenswrapper[4779]: I0320 15:45:31.443881 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a64425ab-a231-4dd1-818e-156d70a0864d-public-tls-certs\") pod \"placement-c69fcdb9d-ntwph\" (UID: \"a64425ab-a231-4dd1-818e-156d70a0864d\") " pod="openstack/placement-c69fcdb9d-ntwph" Mar 20 15:45:31 
crc kubenswrapper[4779]: I0320 15:45:31.446843 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4w6q\" (UniqueName: \"kubernetes.io/projected/a64425ab-a231-4dd1-818e-156d70a0864d-kube-api-access-k4w6q\") pod \"placement-c69fcdb9d-ntwph\" (UID: \"a64425ab-a231-4dd1-818e-156d70a0864d\") " pod="openstack/placement-c69fcdb9d-ntwph" Mar 20 15:45:31 crc kubenswrapper[4779]: I0320 15:45:31.499969 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c69fcdb9d-ntwph" Mar 20 15:45:32 crc kubenswrapper[4779]: I0320 15:45:32.275713 4779 generic.go:334] "Generic (PLEG): container finished" podID="1f3be8d3-51e7-4a9b-9409-9aa999876aa3" containerID="70c09986161a5c03082e9fbc1128733b54c8ac13ec5de237bf517a1b5110f3a4" exitCode=0 Mar 20 15:45:32 crc kubenswrapper[4779]: I0320 15:45:32.275987 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c5d7688b4-46tq6" event={"ID":"1f3be8d3-51e7-4a9b-9409-9aa999876aa3","Type":"ContainerDied","Data":"70c09986161a5c03082e9fbc1128733b54c8ac13ec5de237bf517a1b5110f3a4"} Mar 20 15:45:33 crc kubenswrapper[4779]: I0320 15:45:33.286606 4779 generic.go:334] "Generic (PLEG): container finished" podID="1f3be8d3-51e7-4a9b-9409-9aa999876aa3" containerID="42c82203c403e6671f0875f5254ea8223ba8f882f95bbb3e6395e2190134b8dc" exitCode=0 Mar 20 15:45:33 crc kubenswrapper[4779]: I0320 15:45:33.286650 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c5d7688b4-46tq6" event={"ID":"1f3be8d3-51e7-4a9b-9409-9aa999876aa3","Type":"ContainerDied","Data":"42c82203c403e6671f0875f5254ea8223ba8f882f95bbb3e6395e2190134b8dc"} Mar 20 15:45:34 crc kubenswrapper[4779]: I0320 15:45:34.555467 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-k84c5" Mar 20 15:45:34 crc kubenswrapper[4779]: I0320 15:45:34.676567 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d73c6bfb-5dbf-4f62-8214-0b39d3558cbc-dns-swift-storage-0\") pod \"d73c6bfb-5dbf-4f62-8214-0b39d3558cbc\" (UID: \"d73c6bfb-5dbf-4f62-8214-0b39d3558cbc\") " Mar 20 15:45:34 crc kubenswrapper[4779]: I0320 15:45:34.677406 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d73c6bfb-5dbf-4f62-8214-0b39d3558cbc-config\") pod \"d73c6bfb-5dbf-4f62-8214-0b39d3558cbc\" (UID: \"d73c6bfb-5dbf-4f62-8214-0b39d3558cbc\") " Mar 20 15:45:34 crc kubenswrapper[4779]: I0320 15:45:34.677783 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8l52j\" (UniqueName: \"kubernetes.io/projected/d73c6bfb-5dbf-4f62-8214-0b39d3558cbc-kube-api-access-8l52j\") pod \"d73c6bfb-5dbf-4f62-8214-0b39d3558cbc\" (UID: \"d73c6bfb-5dbf-4f62-8214-0b39d3558cbc\") " Mar 20 15:45:34 crc kubenswrapper[4779]: I0320 15:45:34.677829 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d73c6bfb-5dbf-4f62-8214-0b39d3558cbc-dns-svc\") pod \"d73c6bfb-5dbf-4f62-8214-0b39d3558cbc\" (UID: \"d73c6bfb-5dbf-4f62-8214-0b39d3558cbc\") " Mar 20 15:45:34 crc kubenswrapper[4779]: I0320 15:45:34.677931 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d73c6bfb-5dbf-4f62-8214-0b39d3558cbc-ovsdbserver-sb\") pod \"d73c6bfb-5dbf-4f62-8214-0b39d3558cbc\" (UID: \"d73c6bfb-5dbf-4f62-8214-0b39d3558cbc\") " Mar 20 15:45:34 crc kubenswrapper[4779]: I0320 15:45:34.678010 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/d73c6bfb-5dbf-4f62-8214-0b39d3558cbc-ovsdbserver-nb\") pod \"d73c6bfb-5dbf-4f62-8214-0b39d3558cbc\" (UID: \"d73c6bfb-5dbf-4f62-8214-0b39d3558cbc\") " Mar 20 15:45:34 crc kubenswrapper[4779]: I0320 15:45:34.694074 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d73c6bfb-5dbf-4f62-8214-0b39d3558cbc-kube-api-access-8l52j" (OuterVolumeSpecName: "kube-api-access-8l52j") pod "d73c6bfb-5dbf-4f62-8214-0b39d3558cbc" (UID: "d73c6bfb-5dbf-4f62-8214-0b39d3558cbc"). InnerVolumeSpecName "kube-api-access-8l52j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:45:34 crc kubenswrapper[4779]: I0320 15:45:34.740050 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d73c6bfb-5dbf-4f62-8214-0b39d3558cbc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d73c6bfb-5dbf-4f62-8214-0b39d3558cbc" (UID: "d73c6bfb-5dbf-4f62-8214-0b39d3558cbc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:45:34 crc kubenswrapper[4779]: I0320 15:45:34.743824 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d73c6bfb-5dbf-4f62-8214-0b39d3558cbc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d73c6bfb-5dbf-4f62-8214-0b39d3558cbc" (UID: "d73c6bfb-5dbf-4f62-8214-0b39d3558cbc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:45:34 crc kubenswrapper[4779]: I0320 15:45:34.750490 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d73c6bfb-5dbf-4f62-8214-0b39d3558cbc-config" (OuterVolumeSpecName: "config") pod "d73c6bfb-5dbf-4f62-8214-0b39d3558cbc" (UID: "d73c6bfb-5dbf-4f62-8214-0b39d3558cbc"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:45:34 crc kubenswrapper[4779]: I0320 15:45:34.764839 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d73c6bfb-5dbf-4f62-8214-0b39d3558cbc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d73c6bfb-5dbf-4f62-8214-0b39d3558cbc" (UID: "d73c6bfb-5dbf-4f62-8214-0b39d3558cbc"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:45:34 crc kubenswrapper[4779]: I0320 15:45:34.781071 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8l52j\" (UniqueName: \"kubernetes.io/projected/d73c6bfb-5dbf-4f62-8214-0b39d3558cbc-kube-api-access-8l52j\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:34 crc kubenswrapper[4779]: I0320 15:45:34.781102 4779 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d73c6bfb-5dbf-4f62-8214-0b39d3558cbc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:34 crc kubenswrapper[4779]: I0320 15:45:34.781129 4779 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d73c6bfb-5dbf-4f62-8214-0b39d3558cbc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:34 crc kubenswrapper[4779]: I0320 15:45:34.781138 4779 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d73c6bfb-5dbf-4f62-8214-0b39d3558cbc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:34 crc kubenswrapper[4779]: I0320 15:45:34.781147 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d73c6bfb-5dbf-4f62-8214-0b39d3558cbc-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:34 crc kubenswrapper[4779]: I0320 15:45:34.823705 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/d73c6bfb-5dbf-4f62-8214-0b39d3558cbc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d73c6bfb-5dbf-4f62-8214-0b39d3558cbc" (UID: "d73c6bfb-5dbf-4f62-8214-0b39d3558cbc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:45:34 crc kubenswrapper[4779]: I0320 15:45:34.883308 4779 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d73c6bfb-5dbf-4f62-8214-0b39d3558cbc-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:34 crc kubenswrapper[4779]: I0320 15:45:34.943718 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Mar 20 15:45:34 crc kubenswrapper[4779]: I0320 15:45:34.992230 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7f86b78896-7tfsm" podUID="f6de60e1-9be1-4bb6-9d26-0b496117ed20" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.161:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.161:8443: connect: connection refused" Mar 20 15:45:35 crc kubenswrapper[4779]: I0320 15:45:35.093925 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-dc78779bd-shkrw" podUID="dddea4aa-aba6-49f1-a8bc-6ce9850da26d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.162:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.162:8443: connect: connection refused" Mar 20 15:45:35 crc kubenswrapper[4779]: I0320 15:45:35.138298 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6d97b75b9b-swgzv"] Mar 20 15:45:35 crc kubenswrapper[4779]: I0320 15:45:35.158888 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6878c9db6b-r5g2t"] Mar 20 15:45:35 crc kubenswrapper[4779]: I0320 15:45:35.171445 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Mar 20 15:45:35 crc kubenswrapper[4779]: 
I0320 15:45:35.268280 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-854c6767b7-nlddt"] Mar 20 15:45:35 crc kubenswrapper[4779]: I0320 15:45:35.285646 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Mar 20 15:45:35 crc kubenswrapper[4779]: I0320 15:45:35.305804 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-k84c5" event={"ID":"d73c6bfb-5dbf-4f62-8214-0b39d3558cbc","Type":"ContainerDied","Data":"8be4690c0c877c1f0990e88ddae184c36e65d429e152b5c1d354c0dd4d2a1812"} Mar 20 15:45:35 crc kubenswrapper[4779]: I0320 15:45:35.305884 4779 scope.go:117] "RemoveContainer" containerID="f3c0c28a55ee01257be1b799d96d4911fcf3ff1e695f3804bfbaa4980b47cb2d" Mar 20 15:45:35 crc kubenswrapper[4779]: I0320 15:45:35.306061 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-k84c5" Mar 20 15:45:35 crc kubenswrapper[4779]: I0320 15:45:35.362789 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-k84c5"] Mar 20 15:45:35 crc kubenswrapper[4779]: I0320 15:45:35.372204 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-k84c5"] Mar 20 15:45:35 crc kubenswrapper[4779]: I0320 15:45:35.403799 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c69fcdb9d-ntwph"] Mar 20 15:45:35 crc kubenswrapper[4779]: W0320 15:45:35.446728 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b98145a_1310_4f32_9b9e_5824c7b793ef.slice/crio-cd1a221d90db07970fa8743c8d44fe3a502da5828454bbc33dc75b91d1ae5370 WatchSource:0}: Error finding container cd1a221d90db07970fa8743c8d44fe3a502da5828454bbc33dc75b91d1ae5370: Status 404 returned error can't find the container with id cd1a221d90db07970fa8743c8d44fe3a502da5828454bbc33dc75b91d1ae5370 Mar 20 15:45:35 crc 
kubenswrapper[4779]: W0320 15:45:35.450471 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20f41ef6_17fa_4c90_a902_d2f5efdf45f5.slice/crio-b6d749e5b52d6b8001e17f927b0cab7084bc75d03ab00deed34316a52f433b82 WatchSource:0}: Error finding container b6d749e5b52d6b8001e17f927b0cab7084bc75d03ab00deed34316a52f433b82: Status 404 returned error can't find the container with id b6d749e5b52d6b8001e17f927b0cab7084bc75d03ab00deed34316a52f433b82 Mar 20 15:45:35 crc kubenswrapper[4779]: W0320 15:45:35.455829 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3d25ade_4950_40ca_8481_f52c063ff998.slice/crio-49ba6195671495b85606f93be7922a565766db295223502e6dac8167f3ce8b41 WatchSource:0}: Error finding container 49ba6195671495b85606f93be7922a565766db295223502e6dac8167f3ce8b41: Status 404 returned error can't find the container with id 49ba6195671495b85606f93be7922a565766db295223502e6dac8167f3ce8b41 Mar 20 15:45:35 crc kubenswrapper[4779]: W0320 15:45:35.467324 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84bd8c95_b533_46ea_b1e2_3a53ddaec70c.slice/crio-7179437bb61661cb2e564cf0023b826b5822280cca16a6261fe6b67d6a87a83e WatchSource:0}: Error finding container 7179437bb61661cb2e564cf0023b826b5822280cca16a6261fe6b67d6a87a83e: Status 404 returned error can't find the container with id 7179437bb61661cb2e564cf0023b826b5822280cca16a6261fe6b67d6a87a83e Mar 20 15:45:35 crc kubenswrapper[4779]: W0320 15:45:35.471312 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda64425ab_a231_4dd1_818e_156d70a0864d.slice/crio-557b689076796785fc3101892099cbe45c4fa68ca7d231d1523459552d64b151 WatchSource:0}: Error finding container 
557b689076796785fc3101892099cbe45c4fa68ca7d231d1523459552d64b151: Status 404 returned error can't find the container with id 557b689076796785fc3101892099cbe45c4fa68ca7d231d1523459552d64b151
Mar 20 15:45:35 crc kubenswrapper[4779]: I0320 15:45:35.534284 4779 scope.go:117] "RemoveContainer" containerID="30ae351dc4aeb444ed855167bedf714620624ff370ae1485d118ea8aca01b668"
Mar 20 15:45:35 crc kubenswrapper[4779]: I0320 15:45:35.777626 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7c5d7688b4-46tq6"
Mar 20 15:45:35 crc kubenswrapper[4779]: I0320 15:45:35.825083 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d73c6bfb-5dbf-4f62-8214-0b39d3558cbc" path="/var/lib/kubelet/pods/d73c6bfb-5dbf-4f62-8214-0b39d3558cbc/volumes"
Mar 20 15:45:35 crc kubenswrapper[4779]: I0320 15:45:35.904656 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f3be8d3-51e7-4a9b-9409-9aa999876aa3-combined-ca-bundle\") pod \"1f3be8d3-51e7-4a9b-9409-9aa999876aa3\" (UID: \"1f3be8d3-51e7-4a9b-9409-9aa999876aa3\") "
Mar 20 15:45:35 crc kubenswrapper[4779]: I0320 15:45:35.904747 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1f3be8d3-51e7-4a9b-9409-9aa999876aa3-httpd-config\") pod \"1f3be8d3-51e7-4a9b-9409-9aa999876aa3\" (UID: \"1f3be8d3-51e7-4a9b-9409-9aa999876aa3\") "
Mar 20 15:45:35 crc kubenswrapper[4779]: I0320 15:45:35.904857 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrd2q\" (UniqueName: \"kubernetes.io/projected/1f3be8d3-51e7-4a9b-9409-9aa999876aa3-kube-api-access-hrd2q\") pod \"1f3be8d3-51e7-4a9b-9409-9aa999876aa3\" (UID: \"1f3be8d3-51e7-4a9b-9409-9aa999876aa3\") "
Mar 20 15:45:35 crc kubenswrapper[4779]: I0320 15:45:35.904891 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f3be8d3-51e7-4a9b-9409-9aa999876aa3-config\") pod \"1f3be8d3-51e7-4a9b-9409-9aa999876aa3\" (UID: \"1f3be8d3-51e7-4a9b-9409-9aa999876aa3\") "
Mar 20 15:45:35 crc kubenswrapper[4779]: I0320 15:45:35.904927 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f3be8d3-51e7-4a9b-9409-9aa999876aa3-ovndb-tls-certs\") pod \"1f3be8d3-51e7-4a9b-9409-9aa999876aa3\" (UID: \"1f3be8d3-51e7-4a9b-9409-9aa999876aa3\") "
Mar 20 15:45:35 crc kubenswrapper[4779]: I0320 15:45:35.909518 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f3be8d3-51e7-4a9b-9409-9aa999876aa3-kube-api-access-hrd2q" (OuterVolumeSpecName: "kube-api-access-hrd2q") pod "1f3be8d3-51e7-4a9b-9409-9aa999876aa3" (UID: "1f3be8d3-51e7-4a9b-9409-9aa999876aa3"). InnerVolumeSpecName "kube-api-access-hrd2q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:45:35 crc kubenswrapper[4779]: I0320 15:45:35.917266 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f3be8d3-51e7-4a9b-9409-9aa999876aa3-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "1f3be8d3-51e7-4a9b-9409-9aa999876aa3" (UID: "1f3be8d3-51e7-4a9b-9409-9aa999876aa3"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:45:36 crc kubenswrapper[4779]: I0320 15:45:36.010071 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrd2q\" (UniqueName: \"kubernetes.io/projected/1f3be8d3-51e7-4a9b-9409-9aa999876aa3-kube-api-access-hrd2q\") on node \"crc\" DevicePath \"\""
Mar 20 15:45:36 crc kubenswrapper[4779]: I0320 15:45:36.010141 4779 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1f3be8d3-51e7-4a9b-9409-9aa999876aa3-httpd-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:45:36 crc kubenswrapper[4779]: I0320 15:45:36.265485 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8b5c85b87-k84c5" podUID="d73c6bfb-5dbf-4f62-8214-0b39d3558cbc" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.158:5353: i/o timeout"
Mar 20 15:45:36 crc kubenswrapper[4779]: I0320 15:45:36.268924 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f3be8d3-51e7-4a9b-9409-9aa999876aa3-config" (OuterVolumeSpecName: "config") pod "1f3be8d3-51e7-4a9b-9409-9aa999876aa3" (UID: "1f3be8d3-51e7-4a9b-9409-9aa999876aa3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:45:36 crc kubenswrapper[4779]: I0320 15:45:36.275240 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f3be8d3-51e7-4a9b-9409-9aa999876aa3-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "1f3be8d3-51e7-4a9b-9409-9aa999876aa3" (UID: "1f3be8d3-51e7-4a9b-9409-9aa999876aa3"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:45:36 crc kubenswrapper[4779]: I0320 15:45:36.284270 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f3be8d3-51e7-4a9b-9409-9aa999876aa3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f3be8d3-51e7-4a9b-9409-9aa999876aa3" (UID: "1f3be8d3-51e7-4a9b-9409-9aa999876aa3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:45:36 crc kubenswrapper[4779]: I0320 15:45:36.356691 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f3be8d3-51e7-4a9b-9409-9aa999876aa3-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:45:36 crc kubenswrapper[4779]: I0320 15:45:36.356773 4779 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f3be8d3-51e7-4a9b-9409-9aa999876aa3-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 15:45:36 crc kubenswrapper[4779]: I0320 15:45:36.356801 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f3be8d3-51e7-4a9b-9409-9aa999876aa3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 15:45:36 crc kubenswrapper[4779]: I0320 15:45:36.376625 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c5d7688b4-46tq6" event={"ID":"1f3be8d3-51e7-4a9b-9409-9aa999876aa3","Type":"ContainerDied","Data":"18dbd068df7524fb7305ca8210419fa7572d23eff517a83d98fb524d99a692e2"}
Mar 20 15:45:36 crc kubenswrapper[4779]: I0320 15:45:36.376677 4779 scope.go:117] "RemoveContainer" containerID="42c82203c403e6671f0875f5254ea8223ba8f882f95bbb3e6395e2190134b8dc"
Mar 20 15:45:36 crc kubenswrapper[4779]: I0320 15:45:36.376798 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7c5d7688b4-46tq6"
Mar 20 15:45:36 crc kubenswrapper[4779]: I0320 15:45:36.406278 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8b98145a-1310-4f32-9b9e-5824c7b793ef","Type":"ContainerStarted","Data":"cd1a221d90db07970fa8743c8d44fe3a502da5828454bbc33dc75b91d1ae5370"}
Mar 20 15:45:36 crc kubenswrapper[4779]: I0320 15:45:36.420675 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d97b75b9b-swgzv" event={"ID":"83d44457-fd0e-462a-b293-65eec2238d7b","Type":"ContainerStarted","Data":"9c787247cd4be3389dc5f475c53ce3577e3f1cca73903a0cf4c7409e281b1b03"}
Mar 20 15:45:36 crc kubenswrapper[4779]: I0320 15:45:36.420722 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d97b75b9b-swgzv" event={"ID":"83d44457-fd0e-462a-b293-65eec2238d7b","Type":"ContainerStarted","Data":"d35173b68721f0b66623fe0581f162d2c79a6bd02df601e086ffb3d100373535"}
Mar 20 15:45:36 crc kubenswrapper[4779]: I0320 15:45:36.442159 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2caab026-6b85-4015-9190-29d8ef94a58f","Type":"ContainerStarted","Data":"3a416b6e3b56fa65fe127241ccfe0961d7b327d2cea7fac1af288fdec90ddba0"}
Mar 20 15:45:36 crc kubenswrapper[4779]: I0320 15:45:36.444176 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"84bd8c95-b533-46ea-b1e2-3a53ddaec70c","Type":"ContainerStarted","Data":"9b3010172e23c0e2fb22547ed52cdeebe05c5bb0ddf175c2f0118ece00124f45"}
Mar 20 15:45:36 crc kubenswrapper[4779]: I0320 15:45:36.444230 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"84bd8c95-b533-46ea-b1e2-3a53ddaec70c","Type":"ContainerStarted","Data":"7179437bb61661cb2e564cf0023b826b5822280cca16a6261fe6b67d6a87a83e"}
Mar 20 15:45:36 crc kubenswrapper[4779]: I0320 15:45:36.445935 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7c5d7688b4-46tq6"]
Mar 20 15:45:36 crc kubenswrapper[4779]: I0320 15:45:36.447303 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6878c9db6b-r5g2t" event={"ID":"20f41ef6-17fa-4c90-a902-d2f5efdf45f5","Type":"ContainerStarted","Data":"2a94456390e979953c8af11f69e0f392bd33b587b74e1829f49d9551db19c564"}
Mar 20 15:45:36 crc kubenswrapper[4779]: I0320 15:45:36.447330 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6878c9db6b-r5g2t" event={"ID":"20f41ef6-17fa-4c90-a902-d2f5efdf45f5","Type":"ContainerStarted","Data":"b6d749e5b52d6b8001e17f927b0cab7084bc75d03ab00deed34316a52f433b82"}
Mar 20 15:45:36 crc kubenswrapper[4779]: I0320 15:45:36.448595 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6878c9db6b-r5g2t"
Mar 20 15:45:36 crc kubenswrapper[4779]: I0320 15:45:36.454378 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7c5d7688b4-46tq6"]
Mar 20 15:45:36 crc kubenswrapper[4779]: I0320 15:45:36.456448 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c69fcdb9d-ntwph" event={"ID":"a64425ab-a231-4dd1-818e-156d70a0864d","Type":"ContainerStarted","Data":"0435a6be38feb6c492eda9427d0ad5c129919b748154cc3863615c80b6033940"}
Mar 20 15:45:36 crc kubenswrapper[4779]: I0320 15:45:36.456507 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c69fcdb9d-ntwph" event={"ID":"a64425ab-a231-4dd1-818e-156d70a0864d","Type":"ContainerStarted","Data":"557b689076796785fc3101892099cbe45c4fa68ca7d231d1523459552d64b151"}
Mar 20 15:45:36 crc kubenswrapper[4779]: I0320 15:45:36.460773 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-854c6767b7-nlddt" event={"ID":"b8707035-1583-426d-aa57-b5a23e7403b3","Type":"ContainerStarted","Data":"247268ab3c2a76b315db2fe46ddbb0413b29f837fd64bd50e85d8c3d2245380e"}
Mar 20 15:45:36 crc kubenswrapper[4779]: I0320 15:45:36.460826 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-854c6767b7-nlddt" event={"ID":"b8707035-1583-426d-aa57-b5a23e7403b3","Type":"ContainerStarted","Data":"984042b0b33fd90abba4adee463187bdcd8147c0cf92082aa4d08c3b4bcd1f4a"}
Mar 20 15:45:36 crc kubenswrapper[4779]: I0320 15:45:36.468204 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"b3d25ade-4950-40ca-8481-f52c063ff998","Type":"ContainerStarted","Data":"49ba6195671495b85606f93be7922a565766db295223502e6dac8167f3ce8b41"}
Mar 20 15:45:37 crc kubenswrapper[4779]: I0320 15:45:37.177250 4779 scope.go:117] "RemoveContainer" containerID="70c09986161a5c03082e9fbc1128733b54c8ac13ec5de237bf517a1b5110f3a4"
Mar 20 15:45:37 crc kubenswrapper[4779]: I0320 15:45:37.481468 4779 generic.go:334] "Generic (PLEG): container finished" podID="b67fdca5-13d1-4e83-8834-03856ca956b5" containerID="3816792d7f4a520eb194df5b57f19724f9da971e22f64b3a69c80ac94940dc7d" exitCode=0
Mar 20 15:45:37 crc kubenswrapper[4779]: I0320 15:45:37.481538 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xjstn" event={"ID":"b67fdca5-13d1-4e83-8834-03856ca956b5","Type":"ContainerDied","Data":"3816792d7f4a520eb194df5b57f19724f9da971e22f64b3a69c80ac94940dc7d"}
Mar 20 15:45:37 crc kubenswrapper[4779]: I0320 15:45:37.486862 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wjcm6" event={"ID":"130a37da-c17e-48dd-8712-f87c67f01852","Type":"ContainerStarted","Data":"5e89d3a4cbd0626117310c3f7cacc52a9b5a84cc4f4a764f17de36df44024385"}
Mar 20 15:45:37 crc kubenswrapper[4779]: I0320 15:45:37.502812 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6878c9db6b-r5g2t" podStartSLOduration=9.502794032 podStartE2EDuration="9.502794032s" podCreationTimestamp="2026-03-20 15:45:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:45:36.473442263 +0000 UTC m=+1353.435958083" watchObservedRunningTime="2026-03-20 15:45:37.502794032 +0000 UTC m=+1354.465309832"
Mar 20 15:45:37 crc kubenswrapper[4779]: I0320 15:45:37.539212 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-wjcm6" podStartSLOduration=3.661297722 podStartE2EDuration="52.539191479s" podCreationTimestamp="2026-03-20 15:44:45 +0000 UTC" firstStartedPulling="2026-03-20 15:44:46.657576852 +0000 UTC m=+1303.620092652" lastFinishedPulling="2026-03-20 15:45:35.535470609 +0000 UTC m=+1352.497986409" observedRunningTime="2026-03-20 15:45:37.527399722 +0000 UTC m=+1354.489915532" watchObservedRunningTime="2026-03-20 15:45:37.539191479 +0000 UTC m=+1354.501707289"
Mar 20 15:45:37 crc kubenswrapper[4779]: I0320 15:45:37.821552 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f3be8d3-51e7-4a9b-9409-9aa999876aa3" path="/var/lib/kubelet/pods/1f3be8d3-51e7-4a9b-9409-9aa999876aa3/volumes"
Mar 20 15:45:38 crc kubenswrapper[4779]: I0320 15:45:38.498544 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"b3d25ade-4950-40ca-8481-f52c063ff998","Type":"ContainerStarted","Data":"7608d40d559c3bf9352e528d3b64f135523b4fa615b6eb6a1c80f9e145eeea66"}
Mar 20 15:45:38 crc kubenswrapper[4779]: I0320 15:45:38.508032 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d97b75b9b-swgzv" event={"ID":"83d44457-fd0e-462a-b293-65eec2238d7b","Type":"ContainerStarted","Data":"0da158073c717f6594644c6e922de7c5ab2e60fc1422e809a7fb8fd76bea1b1c"}
Mar 20 15:45:38 crc kubenswrapper[4779]: I0320 15:45:38.509086 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6d97b75b9b-swgzv"
Mar 20 15:45:38 crc kubenswrapper[4779]: I0320 15:45:38.509146 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6d97b75b9b-swgzv"
Mar 20 15:45:38 crc kubenswrapper[4779]: I0320 15:45:38.513973 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"84bd8c95-b533-46ea-b1e2-3a53ddaec70c","Type":"ContainerStarted","Data":"256c73fa67ac5fd0452c6071ac8471ab9f080a3ab4ea4409a79d0c323ee82509"}
Mar 20 15:45:38 crc kubenswrapper[4779]: I0320 15:45:38.515462 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Mar 20 15:45:38 crc kubenswrapper[4779]: I0320 15:45:38.517951 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c69fcdb9d-ntwph" event={"ID":"a64425ab-a231-4dd1-818e-156d70a0864d","Type":"ContainerStarted","Data":"da5771c4a50d6dd32e4d1a364630dc63cac8e99464e8217ff82eb0ee097b5f09"}
Mar 20 15:45:38 crc kubenswrapper[4779]: I0320 15:45:38.518741 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-c69fcdb9d-ntwph"
Mar 20 15:45:38 crc kubenswrapper[4779]: I0320 15:45:38.518784 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-c69fcdb9d-ntwph"
Mar 20 15:45:38 crc kubenswrapper[4779]: I0320 15:45:38.520791 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-854c6767b7-nlddt" event={"ID":"b8707035-1583-426d-aa57-b5a23e7403b3","Type":"ContainerStarted","Data":"5dca97a23415d2816e0aa49647e4cdb128dffe1bd801af148a4c8a4a0bd9eba6"}
Mar 20 15:45:38 crc kubenswrapper[4779]: I0320 15:45:38.521400 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-854c6767b7-nlddt"
Mar 20 15:45:38 crc kubenswrapper[4779]: I0320 15:45:38.525326 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8b98145a-1310-4f32-9b9e-5824c7b793ef","Type":"ContainerStarted","Data":"8420c552117767a297c8d519ea58d5d33c8eb963473ce4d1642e1f34a8abd493"}
Mar 20 15:45:38 crc kubenswrapper[4779]: I0320 15:45:38.540592 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=7.693528373 podStartE2EDuration="9.540569773s" podCreationTimestamp="2026-03-20 15:45:29 +0000 UTC" firstStartedPulling="2026-03-20 15:45:35.45803299 +0000 UTC m=+1352.420548790" lastFinishedPulling="2026-03-20 15:45:37.30507439 +0000 UTC m=+1354.267590190" observedRunningTime="2026-03-20 15:45:38.531809304 +0000 UTC m=+1355.494325104" watchObservedRunningTime="2026-03-20 15:45:38.540569773 +0000 UTC m=+1355.503085603"
Mar 20 15:45:38 crc kubenswrapper[4779]: I0320 15:45:38.575306 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-854c6767b7-nlddt" podStartSLOduration=9.575283142 podStartE2EDuration="9.575283142s" podCreationTimestamp="2026-03-20 15:45:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:45:38.563666019 +0000 UTC m=+1355.526181819" watchObservedRunningTime="2026-03-20 15:45:38.575283142 +0000 UTC m=+1355.537798942"
Mar 20 15:45:38 crc kubenswrapper[4779]: I0320 15:45:38.599665 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6d97b75b9b-swgzv" podStartSLOduration=10.599644496 podStartE2EDuration="10.599644496s" podCreationTimestamp="2026-03-20 15:45:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:45:38.587271085 +0000 UTC m=+1355.549786895" watchObservedRunningTime="2026-03-20 15:45:38.599644496 +0000 UTC m=+1355.562160296"
Mar 20 15:45:38 crc kubenswrapper[4779]: I0320 15:45:38.620514 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=7.774604326 podStartE2EDuration="9.620496069s" podCreationTimestamp="2026-03-20 15:45:29 +0000 UTC" firstStartedPulling="2026-03-20 15:45:35.45538291 +0000 UTC m=+1352.417898710" lastFinishedPulling="2026-03-20 15:45:37.301274653 +0000 UTC m=+1354.263790453" observedRunningTime="2026-03-20 15:45:38.619938887 +0000 UTC m=+1355.582454687" watchObservedRunningTime="2026-03-20 15:45:38.620496069 +0000 UTC m=+1355.583011869"
Mar 20 15:45:38 crc kubenswrapper[4779]: I0320 15:45:38.647246 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=9.647221667 podStartE2EDuration="9.647221667s" podCreationTimestamp="2026-03-20 15:45:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:45:38.637305661 +0000 UTC m=+1355.599821461" watchObservedRunningTime="2026-03-20 15:45:38.647221667 +0000 UTC m=+1355.609737477"
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.030829 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xjstn"
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.049809 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-c69fcdb9d-ntwph" podStartSLOduration=8.049789894 podStartE2EDuration="8.049789894s" podCreationTimestamp="2026-03-20 15:45:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:45:38.669641277 +0000 UTC m=+1355.632157087" watchObservedRunningTime="2026-03-20 15:45:39.049789894 +0000 UTC m=+1356.012305694"
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.220386 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b67fdca5-13d1-4e83-8834-03856ca956b5-combined-ca-bundle\") pod \"b67fdca5-13d1-4e83-8834-03856ca956b5\" (UID: \"b67fdca5-13d1-4e83-8834-03856ca956b5\") "
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.220618 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b67fdca5-13d1-4e83-8834-03856ca956b5-db-sync-config-data\") pod \"b67fdca5-13d1-4e83-8834-03856ca956b5\" (UID: \"b67fdca5-13d1-4e83-8834-03856ca956b5\") "
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.220842 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbnsv\" (UniqueName: \"kubernetes.io/projected/b67fdca5-13d1-4e83-8834-03856ca956b5-kube-api-access-gbnsv\") pod \"b67fdca5-13d1-4e83-8834-03856ca956b5\" (UID: \"b67fdca5-13d1-4e83-8834-03856ca956b5\") "
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.231008 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b67fdca5-13d1-4e83-8834-03856ca956b5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b67fdca5-13d1-4e83-8834-03856ca956b5" (UID: "b67fdca5-13d1-4e83-8834-03856ca956b5"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.251550 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b67fdca5-13d1-4e83-8834-03856ca956b5-kube-api-access-gbnsv" (OuterVolumeSpecName: "kube-api-access-gbnsv") pod "b67fdca5-13d1-4e83-8834-03856ca956b5" (UID: "b67fdca5-13d1-4e83-8834-03856ca956b5"). InnerVolumeSpecName "kube-api-access-gbnsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.320250 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b67fdca5-13d1-4e83-8834-03856ca956b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b67fdca5-13d1-4e83-8834-03856ca956b5" (UID: "b67fdca5-13d1-4e83-8834-03856ca956b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.323053 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b67fdca5-13d1-4e83-8834-03856ca956b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.323076 4779 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b67fdca5-13d1-4e83-8834-03856ca956b5-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.323088 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbnsv\" (UniqueName: \"kubernetes.io/projected/b67fdca5-13d1-4e83-8834-03856ca956b5-kube-api-access-gbnsv\") on node \"crc\" DevicePath \"\""
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.443373 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0"
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.443423 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0"
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.484513 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0"
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.552140 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xjstn"
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.552183 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xjstn" event={"ID":"b67fdca5-13d1-4e83-8834-03856ca956b5","Type":"ContainerDied","Data":"d3c9384254e248e60dfe47156d02895d622f7314c32a7c5a76225cca1d8419c6"}
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.552224 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3c9384254e248e60dfe47156d02895d622f7314c32a7c5a76225cca1d8419c6"
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.593640 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0"
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.701585 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.702048 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0"
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.718379 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.837711 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-57f6557d6c-cps7z"]
Mar 20 15:45:39 crc kubenswrapper[4779]: E0320 15:45:39.838401 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d73c6bfb-5dbf-4f62-8214-0b39d3558cbc" containerName="dnsmasq-dns"
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.838625 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="d73c6bfb-5dbf-4f62-8214-0b39d3558cbc" containerName="dnsmasq-dns"
Mar 20 15:45:39 crc kubenswrapper[4779]: E0320 15:45:39.838729 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b67fdca5-13d1-4e83-8834-03856ca956b5" containerName="barbican-db-sync"
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.838817 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="b67fdca5-13d1-4e83-8834-03856ca956b5" containerName="barbican-db-sync"
Mar 20 15:45:39 crc kubenswrapper[4779]: E0320 15:45:39.838910 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d73c6bfb-5dbf-4f62-8214-0b39d3558cbc" containerName="init"
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.838974 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="d73c6bfb-5dbf-4f62-8214-0b39d3558cbc" containerName="init"
Mar 20 15:45:39 crc kubenswrapper[4779]: E0320 15:45:39.839059 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f3be8d3-51e7-4a9b-9409-9aa999876aa3" containerName="neutron-httpd"
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.839150 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f3be8d3-51e7-4a9b-9409-9aa999876aa3" containerName="neutron-httpd"
Mar 20 15:45:39 crc kubenswrapper[4779]: E0320 15:45:39.839262 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f3be8d3-51e7-4a9b-9409-9aa999876aa3" containerName="neutron-api"
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.839338 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f3be8d3-51e7-4a9b-9409-9aa999876aa3" containerName="neutron-api"
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.839603 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="d73c6bfb-5dbf-4f62-8214-0b39d3558cbc" containerName="dnsmasq-dns"
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.839687 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="b67fdca5-13d1-4e83-8834-03856ca956b5" containerName="barbican-db-sync"
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.839776 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f3be8d3-51e7-4a9b-9409-9aa999876aa3" containerName="neutron-api"
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.839863 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f3be8d3-51e7-4a9b-9409-9aa999876aa3" containerName="neutron-httpd"
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.841357 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-57f6557d6c-cps7z"
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.848638 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.848836 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-s67ld"
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.855543 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-557b5d954b-f5w4x"]
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.867617 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-557b5d954b-f5w4x"
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.872210 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.883413 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.910202 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-57f6557d6c-cps7z"]
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.912978 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0"
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.933525 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d481333-1c1f-49eb-a304-cfb5861d2cf4-combined-ca-bundle\") pod \"barbican-worker-57f6557d6c-cps7z\" (UID: \"0d481333-1c1f-49eb-a304-cfb5861d2cf4\") " pod="openstack/barbican-worker-57f6557d6c-cps7z"
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.933953 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d481333-1c1f-49eb-a304-cfb5861d2cf4-logs\") pod \"barbican-worker-57f6557d6c-cps7z\" (UID: \"0d481333-1c1f-49eb-a304-cfb5861d2cf4\") " pod="openstack/barbican-worker-57f6557d6c-cps7z"
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.934228 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmt2m\" (UniqueName: \"kubernetes.io/projected/0d481333-1c1f-49eb-a304-cfb5861d2cf4-kube-api-access-hmt2m\") pod \"barbican-worker-57f6557d6c-cps7z\" (UID: \"0d481333-1c1f-49eb-a304-cfb5861d2cf4\") " pod="openstack/barbican-worker-57f6557d6c-cps7z"
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.934424 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d481333-1c1f-49eb-a304-cfb5861d2cf4-config-data\") pod \"barbican-worker-57f6557d6c-cps7z\" (UID: \"0d481333-1c1f-49eb-a304-cfb5861d2cf4\") " pod="openstack/barbican-worker-57f6557d6c-cps7z"
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.934584 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0d481333-1c1f-49eb-a304-cfb5861d2cf4-config-data-custom\") pod \"barbican-worker-57f6557d6c-cps7z\" (UID: \"0d481333-1c1f-49eb-a304-cfb5861d2cf4\") " pod="openstack/barbican-worker-57f6557d6c-cps7z"
Mar 20 15:45:39 crc kubenswrapper[4779]: I0320 15:45:39.957920 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-557b5d954b-f5w4x"]
Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.021153 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-lg5dh"]
Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.022796 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-lg5dh"
Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.042187 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed8a54dd-3146-4a2c-a0b9-ee9cbd3b1294-config-data\") pod \"barbican-keystone-listener-557b5d954b-f5w4x\" (UID: \"ed8a54dd-3146-4a2c-a0b9-ee9cbd3b1294\") " pod="openstack/barbican-keystone-listener-557b5d954b-f5w4x"
Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.042243 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d481333-1c1f-49eb-a304-cfb5861d2cf4-config-data\") pod \"barbican-worker-57f6557d6c-cps7z\" (UID: \"0d481333-1c1f-49eb-a304-cfb5861d2cf4\") " pod="openstack/barbican-worker-57f6557d6c-cps7z"
Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.042278 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mst6\" (UniqueName: \"kubernetes.io/projected/ed8a54dd-3146-4a2c-a0b9-ee9cbd3b1294-kube-api-access-6mst6\") pod \"barbican-keystone-listener-557b5d954b-f5w4x\" (UID: \"ed8a54dd-3146-4a2c-a0b9-ee9cbd3b1294\") " pod="openstack/barbican-keystone-listener-557b5d954b-f5w4x"
Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.042302 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0d481333-1c1f-49eb-a304-cfb5861d2cf4-config-data-custom\") pod \"barbican-worker-57f6557d6c-cps7z\" (UID: \"0d481333-1c1f-49eb-a304-cfb5861d2cf4\") " pod="openstack/barbican-worker-57f6557d6c-cps7z"
Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.042342 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed8a54dd-3146-4a2c-a0b9-ee9cbd3b1294-combined-ca-bundle\") pod \"barbican-keystone-listener-557b5d954b-f5w4x\" (UID: \"ed8a54dd-3146-4a2c-a0b9-ee9cbd3b1294\") " pod="openstack/barbican-keystone-listener-557b5d954b-f5w4x"
Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.042369 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d481333-1c1f-49eb-a304-cfb5861d2cf4-combined-ca-bundle\") pod \"barbican-worker-57f6557d6c-cps7z\" (UID: \"0d481333-1c1f-49eb-a304-cfb5861d2cf4\") " pod="openstack/barbican-worker-57f6557d6c-cps7z"
Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.042400 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d481333-1c1f-49eb-a304-cfb5861d2cf4-logs\") pod \"barbican-worker-57f6557d6c-cps7z\" (UID: \"0d481333-1c1f-49eb-a304-cfb5861d2cf4\") " pod="openstack/barbican-worker-57f6557d6c-cps7z"
Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.042417 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed8a54dd-3146-4a2c-a0b9-ee9cbd3b1294-logs\") pod \"barbican-keystone-listener-557b5d954b-f5w4x\" (UID: \"ed8a54dd-3146-4a2c-a0b9-ee9cbd3b1294\") " pod="openstack/barbican-keystone-listener-557b5d954b-f5w4x"
Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.042441 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed8a54dd-3146-4a2c-a0b9-ee9cbd3b1294-config-data-custom\") pod \"barbican-keystone-listener-557b5d954b-f5w4x\" (UID: \"ed8a54dd-3146-4a2c-a0b9-ee9cbd3b1294\") " pod="openstack/barbican-keystone-listener-557b5d954b-f5w4x"
Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.042506 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmt2m\" (UniqueName: \"kubernetes.io/projected/0d481333-1c1f-49eb-a304-cfb5861d2cf4-kube-api-access-hmt2m\") pod \"barbican-worker-57f6557d6c-cps7z\" (UID: \"0d481333-1c1f-49eb-a304-cfb5861d2cf4\") " pod="openstack/barbican-worker-57f6557d6c-cps7z"
Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.043393 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d481333-1c1f-49eb-a304-cfb5861d2cf4-logs\") pod \"barbican-worker-57f6557d6c-cps7z\" (UID: \"0d481333-1c1f-49eb-a304-cfb5861d2cf4\") " pod="openstack/barbican-worker-57f6557d6c-cps7z"
Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.059289 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d481333-1c1f-49eb-a304-cfb5861d2cf4-config-data\") pod \"barbican-worker-57f6557d6c-cps7z\" (UID: \"0d481333-1c1f-49eb-a304-cfb5861d2cf4\") " pod="openstack/barbican-worker-57f6557d6c-cps7z"
Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.078191 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-lg5dh"]
Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.079174 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d481333-1c1f-49eb-a304-cfb5861d2cf4-combined-ca-bundle\") pod \"barbican-worker-57f6557d6c-cps7z\" (UID: \"0d481333-1c1f-49eb-a304-cfb5861d2cf4\") " pod="openstack/barbican-worker-57f6557d6c-cps7z"
Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.079388 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0d481333-1c1f-49eb-a304-cfb5861d2cf4-config-data-custom\") pod \"barbican-worker-57f6557d6c-cps7z\" (UID: \"0d481333-1c1f-49eb-a304-cfb5861d2cf4\") " pod="openstack/barbican-worker-57f6557d6c-cps7z"
Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.086381 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmt2m\" (UniqueName: \"kubernetes.io/projected/0d481333-1c1f-49eb-a304-cfb5861d2cf4-kube-api-access-hmt2m\") pod \"barbican-worker-57f6557d6c-cps7z\" (UID: \"0d481333-1c1f-49eb-a304-cfb5861d2cf4\") " pod="openstack/barbican-worker-57f6557d6c-cps7z"
Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.122159 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5b659b97bb-hbsns"]
Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.132736 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5b659b97bb-hbsns"
Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.136128 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.144587 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed8a54dd-3146-4a2c-a0b9-ee9cbd3b1294-config-data\") pod \"barbican-keystone-listener-557b5d954b-f5w4x\" (UID: \"ed8a54dd-3146-4a2c-a0b9-ee9cbd3b1294\") " pod="openstack/barbican-keystone-listener-557b5d954b-f5w4x"
Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.144631 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aafd9b1e-a336-4bc8-a389-512294752f12-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-lg5dh\" (UID: \"aafd9b1e-a336-4bc8-a389-512294752f12\") " pod="openstack/dnsmasq-dns-75c8ddd69c-lg5dh"
Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.144673 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mst6\" (UniqueName:
\"kubernetes.io/projected/ed8a54dd-3146-4a2c-a0b9-ee9cbd3b1294-kube-api-access-6mst6\") pod \"barbican-keystone-listener-557b5d954b-f5w4x\" (UID: \"ed8a54dd-3146-4a2c-a0b9-ee9cbd3b1294\") " pod="openstack/barbican-keystone-listener-557b5d954b-f5w4x" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.144695 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aafd9b1e-a336-4bc8-a389-512294752f12-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-lg5dh\" (UID: \"aafd9b1e-a336-4bc8-a389-512294752f12\") " pod="openstack/dnsmasq-dns-75c8ddd69c-lg5dh" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.144738 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed8a54dd-3146-4a2c-a0b9-ee9cbd3b1294-combined-ca-bundle\") pod \"barbican-keystone-listener-557b5d954b-f5w4x\" (UID: \"ed8a54dd-3146-4a2c-a0b9-ee9cbd3b1294\") " pod="openstack/barbican-keystone-listener-557b5d954b-f5w4x" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.144759 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aafd9b1e-a336-4bc8-a389-512294752f12-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-lg5dh\" (UID: \"aafd9b1e-a336-4bc8-a389-512294752f12\") " pod="openstack/dnsmasq-dns-75c8ddd69c-lg5dh" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.144783 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwrzf\" (UniqueName: \"kubernetes.io/projected/aafd9b1e-a336-4bc8-a389-512294752f12-kube-api-access-qwrzf\") pod \"dnsmasq-dns-75c8ddd69c-lg5dh\" (UID: \"aafd9b1e-a336-4bc8-a389-512294752f12\") " pod="openstack/dnsmasq-dns-75c8ddd69c-lg5dh" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.144808 4779 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed8a54dd-3146-4a2c-a0b9-ee9cbd3b1294-logs\") pod \"barbican-keystone-listener-557b5d954b-f5w4x\" (UID: \"ed8a54dd-3146-4a2c-a0b9-ee9cbd3b1294\") " pod="openstack/barbican-keystone-listener-557b5d954b-f5w4x" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.144832 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed8a54dd-3146-4a2c-a0b9-ee9cbd3b1294-config-data-custom\") pod \"barbican-keystone-listener-557b5d954b-f5w4x\" (UID: \"ed8a54dd-3146-4a2c-a0b9-ee9cbd3b1294\") " pod="openstack/barbican-keystone-listener-557b5d954b-f5w4x" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.144850 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aafd9b1e-a336-4bc8-a389-512294752f12-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-lg5dh\" (UID: \"aafd9b1e-a336-4bc8-a389-512294752f12\") " pod="openstack/dnsmasq-dns-75c8ddd69c-lg5dh" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.144880 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aafd9b1e-a336-4bc8-a389-512294752f12-config\") pod \"dnsmasq-dns-75c8ddd69c-lg5dh\" (UID: \"aafd9b1e-a336-4bc8-a389-512294752f12\") " pod="openstack/dnsmasq-dns-75c8ddd69c-lg5dh" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.149513 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed8a54dd-3146-4a2c-a0b9-ee9cbd3b1294-logs\") pod \"barbican-keystone-listener-557b5d954b-f5w4x\" (UID: \"ed8a54dd-3146-4a2c-a0b9-ee9cbd3b1294\") " pod="openstack/barbican-keystone-listener-557b5d954b-f5w4x" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.160433 4779 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed8a54dd-3146-4a2c-a0b9-ee9cbd3b1294-config-data\") pod \"barbican-keystone-listener-557b5d954b-f5w4x\" (UID: \"ed8a54dd-3146-4a2c-a0b9-ee9cbd3b1294\") " pod="openstack/barbican-keystone-listener-557b5d954b-f5w4x" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.169245 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5b659b97bb-hbsns"] Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.172736 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed8a54dd-3146-4a2c-a0b9-ee9cbd3b1294-config-data-custom\") pod \"barbican-keystone-listener-557b5d954b-f5w4x\" (UID: \"ed8a54dd-3146-4a2c-a0b9-ee9cbd3b1294\") " pod="openstack/barbican-keystone-listener-557b5d954b-f5w4x" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.173391 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed8a54dd-3146-4a2c-a0b9-ee9cbd3b1294-combined-ca-bundle\") pod \"barbican-keystone-listener-557b5d954b-f5w4x\" (UID: \"ed8a54dd-3146-4a2c-a0b9-ee9cbd3b1294\") " pod="openstack/barbican-keystone-listener-557b5d954b-f5w4x" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.192491 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mst6\" (UniqueName: \"kubernetes.io/projected/ed8a54dd-3146-4a2c-a0b9-ee9cbd3b1294-kube-api-access-6mst6\") pod \"barbican-keystone-listener-557b5d954b-f5w4x\" (UID: \"ed8a54dd-3146-4a2c-a0b9-ee9cbd3b1294\") " pod="openstack/barbican-keystone-listener-557b5d954b-f5w4x" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.216540 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-57f6557d6c-cps7z" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.247357 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aafd9b1e-a336-4bc8-a389-512294752f12-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-lg5dh\" (UID: \"aafd9b1e-a336-4bc8-a389-512294752f12\") " pod="openstack/dnsmasq-dns-75c8ddd69c-lg5dh" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.248207 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0aae8b0d-a776-48ba-beae-55f8d81a4074-logs\") pod \"barbican-api-5b659b97bb-hbsns\" (UID: \"0aae8b0d-a776-48ba-beae-55f8d81a4074\") " pod="openstack/barbican-api-5b659b97bb-hbsns" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.248435 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0aae8b0d-a776-48ba-beae-55f8d81a4074-config-data\") pod \"barbican-api-5b659b97bb-hbsns\" (UID: \"0aae8b0d-a776-48ba-beae-55f8d81a4074\") " pod="openstack/barbican-api-5b659b97bb-hbsns" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.250261 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aafd9b1e-a336-4bc8-a389-512294752f12-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-lg5dh\" (UID: \"aafd9b1e-a336-4bc8-a389-512294752f12\") " pod="openstack/dnsmasq-dns-75c8ddd69c-lg5dh" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.250538 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aafd9b1e-a336-4bc8-a389-512294752f12-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-lg5dh\" (UID: \"aafd9b1e-a336-4bc8-a389-512294752f12\") " pod="openstack/dnsmasq-dns-75c8ddd69c-lg5dh" Mar 
20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.251993 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwrzf\" (UniqueName: \"kubernetes.io/projected/aafd9b1e-a336-4bc8-a389-512294752f12-kube-api-access-qwrzf\") pod \"dnsmasq-dns-75c8ddd69c-lg5dh\" (UID: \"aafd9b1e-a336-4bc8-a389-512294752f12\") " pod="openstack/dnsmasq-dns-75c8ddd69c-lg5dh" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.252071 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aae8b0d-a776-48ba-beae-55f8d81a4074-combined-ca-bundle\") pod \"barbican-api-5b659b97bb-hbsns\" (UID: \"0aae8b0d-a776-48ba-beae-55f8d81a4074\") " pod="openstack/barbican-api-5b659b97bb-hbsns" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.252159 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0aae8b0d-a776-48ba-beae-55f8d81a4074-config-data-custom\") pod \"barbican-api-5b659b97bb-hbsns\" (UID: \"0aae8b0d-a776-48ba-beae-55f8d81a4074\") " pod="openstack/barbican-api-5b659b97bb-hbsns" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.252235 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aafd9b1e-a336-4bc8-a389-512294752f12-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-lg5dh\" (UID: \"aafd9b1e-a336-4bc8-a389-512294752f12\") " pod="openstack/dnsmasq-dns-75c8ddd69c-lg5dh" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.252306 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b68b9\" (UniqueName: \"kubernetes.io/projected/0aae8b0d-a776-48ba-beae-55f8d81a4074-kube-api-access-b68b9\") pod \"barbican-api-5b659b97bb-hbsns\" (UID: 
\"0aae8b0d-a776-48ba-beae-55f8d81a4074\") " pod="openstack/barbican-api-5b659b97bb-hbsns" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.252336 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aafd9b1e-a336-4bc8-a389-512294752f12-config\") pod \"dnsmasq-dns-75c8ddd69c-lg5dh\" (UID: \"aafd9b1e-a336-4bc8-a389-512294752f12\") " pod="openstack/dnsmasq-dns-75c8ddd69c-lg5dh" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.252540 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aafd9b1e-a336-4bc8-a389-512294752f12-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-lg5dh\" (UID: \"aafd9b1e-a336-4bc8-a389-512294752f12\") " pod="openstack/dnsmasq-dns-75c8ddd69c-lg5dh" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.254470 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aafd9b1e-a336-4bc8-a389-512294752f12-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-lg5dh\" (UID: \"aafd9b1e-a336-4bc8-a389-512294752f12\") " pod="openstack/dnsmasq-dns-75c8ddd69c-lg5dh" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.255771 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aafd9b1e-a336-4bc8-a389-512294752f12-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-lg5dh\" (UID: \"aafd9b1e-a336-4bc8-a389-512294752f12\") " pod="openstack/dnsmasq-dns-75c8ddd69c-lg5dh" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.257754 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aafd9b1e-a336-4bc8-a389-512294752f12-config\") pod \"dnsmasq-dns-75c8ddd69c-lg5dh\" (UID: \"aafd9b1e-a336-4bc8-a389-512294752f12\") " pod="openstack/dnsmasq-dns-75c8ddd69c-lg5dh" Mar 20 15:45:40 crc 
kubenswrapper[4779]: I0320 15:45:40.258781 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aafd9b1e-a336-4bc8-a389-512294752f12-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-lg5dh\" (UID: \"aafd9b1e-a336-4bc8-a389-512294752f12\") " pod="openstack/dnsmasq-dns-75c8ddd69c-lg5dh" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.285441 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-557b5d954b-f5w4x" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.291096 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwrzf\" (UniqueName: \"kubernetes.io/projected/aafd9b1e-a336-4bc8-a389-512294752f12-kube-api-access-qwrzf\") pod \"dnsmasq-dns-75c8ddd69c-lg5dh\" (UID: \"aafd9b1e-a336-4bc8-a389-512294752f12\") " pod="openstack/dnsmasq-dns-75c8ddd69c-lg5dh" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.354784 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0aae8b0d-a776-48ba-beae-55f8d81a4074-logs\") pod \"barbican-api-5b659b97bb-hbsns\" (UID: \"0aae8b0d-a776-48ba-beae-55f8d81a4074\") " pod="openstack/barbican-api-5b659b97bb-hbsns" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.354872 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0aae8b0d-a776-48ba-beae-55f8d81a4074-config-data\") pod \"barbican-api-5b659b97bb-hbsns\" (UID: \"0aae8b0d-a776-48ba-beae-55f8d81a4074\") " pod="openstack/barbican-api-5b659b97bb-hbsns" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.354996 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aae8b0d-a776-48ba-beae-55f8d81a4074-combined-ca-bundle\") pod 
\"barbican-api-5b659b97bb-hbsns\" (UID: \"0aae8b0d-a776-48ba-beae-55f8d81a4074\") " pod="openstack/barbican-api-5b659b97bb-hbsns" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.355033 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0aae8b0d-a776-48ba-beae-55f8d81a4074-config-data-custom\") pod \"barbican-api-5b659b97bb-hbsns\" (UID: \"0aae8b0d-a776-48ba-beae-55f8d81a4074\") " pod="openstack/barbican-api-5b659b97bb-hbsns" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.355098 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b68b9\" (UniqueName: \"kubernetes.io/projected/0aae8b0d-a776-48ba-beae-55f8d81a4074-kube-api-access-b68b9\") pod \"barbican-api-5b659b97bb-hbsns\" (UID: \"0aae8b0d-a776-48ba-beae-55f8d81a4074\") " pod="openstack/barbican-api-5b659b97bb-hbsns" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.356271 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0aae8b0d-a776-48ba-beae-55f8d81a4074-logs\") pod \"barbican-api-5b659b97bb-hbsns\" (UID: \"0aae8b0d-a776-48ba-beae-55f8d81a4074\") " pod="openstack/barbican-api-5b659b97bb-hbsns" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.363983 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0aae8b0d-a776-48ba-beae-55f8d81a4074-config-data\") pod \"barbican-api-5b659b97bb-hbsns\" (UID: \"0aae8b0d-a776-48ba-beae-55f8d81a4074\") " pod="openstack/barbican-api-5b659b97bb-hbsns" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.368574 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0aae8b0d-a776-48ba-beae-55f8d81a4074-config-data-custom\") pod \"barbican-api-5b659b97bb-hbsns\" (UID: 
\"0aae8b0d-a776-48ba-beae-55f8d81a4074\") " pod="openstack/barbican-api-5b659b97bb-hbsns" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.375793 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aae8b0d-a776-48ba-beae-55f8d81a4074-combined-ca-bundle\") pod \"barbican-api-5b659b97bb-hbsns\" (UID: \"0aae8b0d-a776-48ba-beae-55f8d81a4074\") " pod="openstack/barbican-api-5b659b97bb-hbsns" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.379754 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b68b9\" (UniqueName: \"kubernetes.io/projected/0aae8b0d-a776-48ba-beae-55f8d81a4074-kube-api-access-b68b9\") pod \"barbican-api-5b659b97bb-hbsns\" (UID: \"0aae8b0d-a776-48ba-beae-55f8d81a4074\") " pod="openstack/barbican-api-5b659b97bb-hbsns" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.569715 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-lg5dh" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.571866 4779 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.572868 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.598395 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5b659b97bb-hbsns" Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.636929 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-57f6557d6c-cps7z"] Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.667661 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Mar 20 15:45:40 crc kubenswrapper[4779]: W0320 15:45:40.675773 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d481333_1c1f_49eb_a304_cfb5861d2cf4.slice/crio-a8246c3fd9ebd0b94fdedb8c03c2f2956461e91807d324e62d6f6f6ea3dbc9b0 WatchSource:0}: Error finding container a8246c3fd9ebd0b94fdedb8c03c2f2956461e91807d324e62d6f6f6ea3dbc9b0: Status 404 returned error can't find the container with id a8246c3fd9ebd0b94fdedb8c03c2f2956461e91807d324e62d6f6f6ea3dbc9b0 Mar 20 15:45:40 crc kubenswrapper[4779]: I0320 15:45:40.744557 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/watcher-api-0" podUID="84bd8c95-b533-46ea-b1e2-3a53ddaec70c" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.173:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 15:45:41 crc kubenswrapper[4779]: I0320 15:45:41.132076 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-557b5d954b-f5w4x"] Mar 20 15:45:41 crc kubenswrapper[4779]: I0320 15:45:41.342772 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5b659b97bb-hbsns"] Mar 20 15:45:41 crc kubenswrapper[4779]: I0320 15:45:41.415561 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-lg5dh"] Mar 20 15:45:41 crc kubenswrapper[4779]: W0320 15:45:41.429911 4779 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaafd9b1e_a336_4bc8_a389_512294752f12.slice/crio-ac6ab576ed9e54f9adb12631f5159d2f134cff2a2f2cad204e42453ef851b0b6 WatchSource:0}: Error finding container ac6ab576ed9e54f9adb12631f5159d2f134cff2a2f2cad204e42453ef851b0b6: Status 404 returned error can't find the container with id ac6ab576ed9e54f9adb12631f5159d2f134cff2a2f2cad204e42453ef851b0b6 Mar 20 15:45:41 crc kubenswrapper[4779]: I0320 15:45:41.589510 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-557b5d954b-f5w4x" event={"ID":"ed8a54dd-3146-4a2c-a0b9-ee9cbd3b1294","Type":"ContainerStarted","Data":"1b7a4390029d1bc9d3d19bbdf0ba22d10a899096b68470c147ca52a894e54c2c"} Mar 20 15:45:41 crc kubenswrapper[4779]: I0320 15:45:41.591660 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-lg5dh" event={"ID":"aafd9b1e-a336-4bc8-a389-512294752f12","Type":"ContainerStarted","Data":"ac6ab576ed9e54f9adb12631f5159d2f134cff2a2f2cad204e42453ef851b0b6"} Mar 20 15:45:41 crc kubenswrapper[4779]: I0320 15:45:41.594361 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-57f6557d6c-cps7z" event={"ID":"0d481333-1c1f-49eb-a304-cfb5861d2cf4","Type":"ContainerStarted","Data":"a8246c3fd9ebd0b94fdedb8c03c2f2956461e91807d324e62d6f6f6ea3dbc9b0"} Mar 20 15:45:41 crc kubenswrapper[4779]: I0320 15:45:41.597521 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b659b97bb-hbsns" event={"ID":"0aae8b0d-a776-48ba-beae-55f8d81a4074","Type":"ContainerStarted","Data":"b58b351893728b118dc226119b4efb47f8db7cabc15ae765707840ed282bda2a"} Mar 20 15:45:41 crc kubenswrapper[4779]: I0320 15:45:41.597757 4779 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 15:45:42 crc kubenswrapper[4779]: I0320 15:45:42.194737 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/watcher-api-0" Mar 20 15:45:42 crc kubenswrapper[4779]: I0320 15:45:42.617317 4779 generic.go:334] "Generic (PLEG): container finished" podID="aafd9b1e-a336-4bc8-a389-512294752f12" containerID="34db14e2a71e2f1f08c9e109e8a5bc3d5f6d27acc151e51f03a04005f555cb83" exitCode=0 Mar 20 15:45:42 crc kubenswrapper[4779]: I0320 15:45:42.617491 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-lg5dh" event={"ID":"aafd9b1e-a336-4bc8-a389-512294752f12","Type":"ContainerDied","Data":"34db14e2a71e2f1f08c9e109e8a5bc3d5f6d27acc151e51f03a04005f555cb83"} Mar 20 15:45:42 crc kubenswrapper[4779]: I0320 15:45:42.628589 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b659b97bb-hbsns" event={"ID":"0aae8b0d-a776-48ba-beae-55f8d81a4074","Type":"ContainerStarted","Data":"c1d701c197831000a7f429507bdefa2e82c1e3aaa842b48b1d7d44070fabd766"} Mar 20 15:45:42 crc kubenswrapper[4779]: I0320 15:45:42.628644 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b659b97bb-hbsns" event={"ID":"0aae8b0d-a776-48ba-beae-55f8d81a4074","Type":"ContainerStarted","Data":"08bfedecd9168cabe8870c47d041b6276c23b1aef2227e86484d55db2845502e"} Mar 20 15:45:42 crc kubenswrapper[4779]: I0320 15:45:42.629585 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5b659b97bb-hbsns" Mar 20 15:45:42 crc kubenswrapper[4779]: I0320 15:45:42.629662 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5b659b97bb-hbsns" Mar 20 15:45:42 crc kubenswrapper[4779]: I0320 15:45:42.692778 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5b659b97bb-hbsns" podStartSLOduration=2.692749663 podStartE2EDuration="2.692749663s" podCreationTimestamp="2026-03-20 15:45:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 15:45:42.684604197 +0000 UTC m=+1359.647119997" watchObservedRunningTime="2026-03-20 15:45:42.692749663 +0000 UTC m=+1359.655265473" Mar 20 15:45:43 crc kubenswrapper[4779]: I0320 15:45:43.287024 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5bfc6968f8-nhbpx"] Mar 20 15:45:43 crc kubenswrapper[4779]: I0320 15:45:43.289670 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5bfc6968f8-nhbpx" Mar 20 15:45:43 crc kubenswrapper[4779]: I0320 15:45:43.298261 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 20 15:45:43 crc kubenswrapper[4779]: I0320 15:45:43.298486 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 20 15:45:43 crc kubenswrapper[4779]: I0320 15:45:43.311022 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5bfc6968f8-nhbpx"] Mar 20 15:45:43 crc kubenswrapper[4779]: I0320 15:45:43.424718 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0cbf17b-b63c-429d-8f23-694e5c0d566a-internal-tls-certs\") pod \"barbican-api-5bfc6968f8-nhbpx\" (UID: \"c0cbf17b-b63c-429d-8f23-694e5c0d566a\") " pod="openstack/barbican-api-5bfc6968f8-nhbpx" Mar 20 15:45:43 crc kubenswrapper[4779]: I0320 15:45:43.424848 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkm2n\" (UniqueName: \"kubernetes.io/projected/c0cbf17b-b63c-429d-8f23-694e5c0d566a-kube-api-access-gkm2n\") pod \"barbican-api-5bfc6968f8-nhbpx\" (UID: \"c0cbf17b-b63c-429d-8f23-694e5c0d566a\") " pod="openstack/barbican-api-5bfc6968f8-nhbpx" Mar 20 15:45:43 crc kubenswrapper[4779]: I0320 15:45:43.424911 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0cbf17b-b63c-429d-8f23-694e5c0d566a-config-data\") pod \"barbican-api-5bfc6968f8-nhbpx\" (UID: \"c0cbf17b-b63c-429d-8f23-694e5c0d566a\") " pod="openstack/barbican-api-5bfc6968f8-nhbpx" Mar 20 15:45:43 crc kubenswrapper[4779]: I0320 15:45:43.424942 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0cbf17b-b63c-429d-8f23-694e5c0d566a-combined-ca-bundle\") pod \"barbican-api-5bfc6968f8-nhbpx\" (UID: \"c0cbf17b-b63c-429d-8f23-694e5c0d566a\") " pod="openstack/barbican-api-5bfc6968f8-nhbpx" Mar 20 15:45:43 crc kubenswrapper[4779]: I0320 15:45:43.424986 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0cbf17b-b63c-429d-8f23-694e5c0d566a-public-tls-certs\") pod \"barbican-api-5bfc6968f8-nhbpx\" (UID: \"c0cbf17b-b63c-429d-8f23-694e5c0d566a\") " pod="openstack/barbican-api-5bfc6968f8-nhbpx" Mar 20 15:45:43 crc kubenswrapper[4779]: I0320 15:45:43.425050 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0cbf17b-b63c-429d-8f23-694e5c0d566a-logs\") pod \"barbican-api-5bfc6968f8-nhbpx\" (UID: \"c0cbf17b-b63c-429d-8f23-694e5c0d566a\") " pod="openstack/barbican-api-5bfc6968f8-nhbpx" Mar 20 15:45:43 crc kubenswrapper[4779]: I0320 15:45:43.425069 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0cbf17b-b63c-429d-8f23-694e5c0d566a-config-data-custom\") pod \"barbican-api-5bfc6968f8-nhbpx\" (UID: \"c0cbf17b-b63c-429d-8f23-694e5c0d566a\") " pod="openstack/barbican-api-5bfc6968f8-nhbpx" Mar 20 15:45:43 crc kubenswrapper[4779]: I0320 15:45:43.527433 4779 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0cbf17b-b63c-429d-8f23-694e5c0d566a-internal-tls-certs\") pod \"barbican-api-5bfc6968f8-nhbpx\" (UID: \"c0cbf17b-b63c-429d-8f23-694e5c0d566a\") " pod="openstack/barbican-api-5bfc6968f8-nhbpx" Mar 20 15:45:43 crc kubenswrapper[4779]: I0320 15:45:43.527540 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkm2n\" (UniqueName: \"kubernetes.io/projected/c0cbf17b-b63c-429d-8f23-694e5c0d566a-kube-api-access-gkm2n\") pod \"barbican-api-5bfc6968f8-nhbpx\" (UID: \"c0cbf17b-b63c-429d-8f23-694e5c0d566a\") " pod="openstack/barbican-api-5bfc6968f8-nhbpx" Mar 20 15:45:43 crc kubenswrapper[4779]: I0320 15:45:43.527576 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0cbf17b-b63c-429d-8f23-694e5c0d566a-config-data\") pod \"barbican-api-5bfc6968f8-nhbpx\" (UID: \"c0cbf17b-b63c-429d-8f23-694e5c0d566a\") " pod="openstack/barbican-api-5bfc6968f8-nhbpx" Mar 20 15:45:43 crc kubenswrapper[4779]: I0320 15:45:43.527625 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0cbf17b-b63c-429d-8f23-694e5c0d566a-combined-ca-bundle\") pod \"barbican-api-5bfc6968f8-nhbpx\" (UID: \"c0cbf17b-b63c-429d-8f23-694e5c0d566a\") " pod="openstack/barbican-api-5bfc6968f8-nhbpx" Mar 20 15:45:43 crc kubenswrapper[4779]: I0320 15:45:43.527669 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0cbf17b-b63c-429d-8f23-694e5c0d566a-public-tls-certs\") pod \"barbican-api-5bfc6968f8-nhbpx\" (UID: \"c0cbf17b-b63c-429d-8f23-694e5c0d566a\") " pod="openstack/barbican-api-5bfc6968f8-nhbpx" Mar 20 15:45:43 crc kubenswrapper[4779]: I0320 15:45:43.527702 4779 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0cbf17b-b63c-429d-8f23-694e5c0d566a-logs\") pod \"barbican-api-5bfc6968f8-nhbpx\" (UID: \"c0cbf17b-b63c-429d-8f23-694e5c0d566a\") " pod="openstack/barbican-api-5bfc6968f8-nhbpx" Mar 20 15:45:43 crc kubenswrapper[4779]: I0320 15:45:43.527719 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0cbf17b-b63c-429d-8f23-694e5c0d566a-config-data-custom\") pod \"barbican-api-5bfc6968f8-nhbpx\" (UID: \"c0cbf17b-b63c-429d-8f23-694e5c0d566a\") " pod="openstack/barbican-api-5bfc6968f8-nhbpx" Mar 20 15:45:43 crc kubenswrapper[4779]: I0320 15:45:43.534583 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0cbf17b-b63c-429d-8f23-694e5c0d566a-logs\") pod \"barbican-api-5bfc6968f8-nhbpx\" (UID: \"c0cbf17b-b63c-429d-8f23-694e5c0d566a\") " pod="openstack/barbican-api-5bfc6968f8-nhbpx" Mar 20 15:45:43 crc kubenswrapper[4779]: I0320 15:45:43.539118 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0cbf17b-b63c-429d-8f23-694e5c0d566a-public-tls-certs\") pod \"barbican-api-5bfc6968f8-nhbpx\" (UID: \"c0cbf17b-b63c-429d-8f23-694e5c0d566a\") " pod="openstack/barbican-api-5bfc6968f8-nhbpx" Mar 20 15:45:43 crc kubenswrapper[4779]: I0320 15:45:43.539354 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0cbf17b-b63c-429d-8f23-694e5c0d566a-combined-ca-bundle\") pod \"barbican-api-5bfc6968f8-nhbpx\" (UID: \"c0cbf17b-b63c-429d-8f23-694e5c0d566a\") " pod="openstack/barbican-api-5bfc6968f8-nhbpx" Mar 20 15:45:43 crc kubenswrapper[4779]: I0320 15:45:43.539700 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/c0cbf17b-b63c-429d-8f23-694e5c0d566a-config-data-custom\") pod \"barbican-api-5bfc6968f8-nhbpx\" (UID: \"c0cbf17b-b63c-429d-8f23-694e5c0d566a\") " pod="openstack/barbican-api-5bfc6968f8-nhbpx" Mar 20 15:45:43 crc kubenswrapper[4779]: I0320 15:45:43.541576 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0cbf17b-b63c-429d-8f23-694e5c0d566a-config-data\") pod \"barbican-api-5bfc6968f8-nhbpx\" (UID: \"c0cbf17b-b63c-429d-8f23-694e5c0d566a\") " pod="openstack/barbican-api-5bfc6968f8-nhbpx" Mar 20 15:45:43 crc kubenswrapper[4779]: I0320 15:45:43.558847 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0cbf17b-b63c-429d-8f23-694e5c0d566a-internal-tls-certs\") pod \"barbican-api-5bfc6968f8-nhbpx\" (UID: \"c0cbf17b-b63c-429d-8f23-694e5c0d566a\") " pod="openstack/barbican-api-5bfc6968f8-nhbpx" Mar 20 15:45:43 crc kubenswrapper[4779]: I0320 15:45:43.559386 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkm2n\" (UniqueName: \"kubernetes.io/projected/c0cbf17b-b63c-429d-8f23-694e5c0d566a-kube-api-access-gkm2n\") pod \"barbican-api-5bfc6968f8-nhbpx\" (UID: \"c0cbf17b-b63c-429d-8f23-694e5c0d566a\") " pod="openstack/barbican-api-5bfc6968f8-nhbpx" Mar 20 15:45:43 crc kubenswrapper[4779]: I0320 15:45:43.618058 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5bfc6968f8-nhbpx" Mar 20 15:45:44 crc kubenswrapper[4779]: I0320 15:45:44.660444 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-557b5d954b-f5w4x" event={"ID":"ed8a54dd-3146-4a2c-a0b9-ee9cbd3b1294","Type":"ContainerStarted","Data":"3ad38cf757942e6a1d21f2c7cc786e89d9c152474bcfd89fc68658042c451712"} Mar 20 15:45:44 crc kubenswrapper[4779]: I0320 15:45:44.669428 4779 generic.go:334] "Generic (PLEG): container finished" podID="130a37da-c17e-48dd-8712-f87c67f01852" containerID="5e89d3a4cbd0626117310c3f7cacc52a9b5a84cc4f4a764f17de36df44024385" exitCode=0 Mar 20 15:45:44 crc kubenswrapper[4779]: I0320 15:45:44.669521 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wjcm6" event={"ID":"130a37da-c17e-48dd-8712-f87c67f01852","Type":"ContainerDied","Data":"5e89d3a4cbd0626117310c3f7cacc52a9b5a84cc4f4a764f17de36df44024385"} Mar 20 15:45:44 crc kubenswrapper[4779]: I0320 15:45:44.675753 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-lg5dh" event={"ID":"aafd9b1e-a336-4bc8-a389-512294752f12","Type":"ContainerStarted","Data":"775ff2c5bd3d2668c7b33f8e6157fa7b1c4d10a5fa684118844646e0091a7c0e"} Mar 20 15:45:44 crc kubenswrapper[4779]: I0320 15:45:44.675981 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75c8ddd69c-lg5dh" Mar 20 15:45:44 crc kubenswrapper[4779]: I0320 15:45:44.715340 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75c8ddd69c-lg5dh" podStartSLOduration=5.7153224179999995 podStartE2EDuration="5.715322418s" podCreationTimestamp="2026-03-20 15:45:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:45:44.711505104 +0000 UTC m=+1361.674020904" watchObservedRunningTime="2026-03-20 
15:45:44.715322418 +0000 UTC m=+1361.677838218" Mar 20 15:45:44 crc kubenswrapper[4779]: I0320 15:45:44.782446 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5bfc6968f8-nhbpx"] Mar 20 15:45:44 crc kubenswrapper[4779]: I0320 15:45:44.990901 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7f86b78896-7tfsm" podUID="f6de60e1-9be1-4bb6-9d26-0b496117ed20" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.161:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.161:8443: connect: connection refused" Mar 20 15:45:45 crc kubenswrapper[4779]: I0320 15:45:45.087676 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-dc78779bd-shkrw" podUID="dddea4aa-aba6-49f1-a8bc-6ce9850da26d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.162:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.162:8443: connect: connection refused" Mar 20 15:45:45 crc kubenswrapper[4779]: I0320 15:45:45.689757 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bfc6968f8-nhbpx" event={"ID":"c0cbf17b-b63c-429d-8f23-694e5c0d566a","Type":"ContainerStarted","Data":"bb52ffee3961be960ac2582c86b356c3bca77bb60ec5348989b057ba6f5f16a5"} Mar 20 15:45:45 crc kubenswrapper[4779]: I0320 15:45:45.689811 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bfc6968f8-nhbpx" event={"ID":"c0cbf17b-b63c-429d-8f23-694e5c0d566a","Type":"ContainerStarted","Data":"dcc70c052545c60f1351570e3642f38ef9c04ecb8ef04f23c3be8ef8ce128710"} Mar 20 15:45:45 crc kubenswrapper[4779]: I0320 15:45:45.689822 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bfc6968f8-nhbpx" event={"ID":"c0cbf17b-b63c-429d-8f23-694e5c0d566a","Type":"ContainerStarted","Data":"4f2a281122af44747e2be87cb6f67021546892119062e8302b2dad418505c16f"} Mar 20 15:45:45 crc kubenswrapper[4779]: I0320 
15:45:45.689975 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5bfc6968f8-nhbpx" Mar 20 15:45:45 crc kubenswrapper[4779]: I0320 15:45:45.690012 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5bfc6968f8-nhbpx" Mar 20 15:45:45 crc kubenswrapper[4779]: I0320 15:45:45.696220 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-57f6557d6c-cps7z" event={"ID":"0d481333-1c1f-49eb-a304-cfb5861d2cf4","Type":"ContainerStarted","Data":"54f1a0b04bd288b3dd5911b0498aa11ab8b3b82e1f5e03a2be4b4cf961b26596"} Mar 20 15:45:45 crc kubenswrapper[4779]: I0320 15:45:45.696263 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-57f6557d6c-cps7z" event={"ID":"0d481333-1c1f-49eb-a304-cfb5861d2cf4","Type":"ContainerStarted","Data":"476c4e5ccc589fa52974c1621569952ac91df272905e2e54805f235272a92b75"} Mar 20 15:45:45 crc kubenswrapper[4779]: I0320 15:45:45.700197 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-557b5d954b-f5w4x" event={"ID":"ed8a54dd-3146-4a2c-a0b9-ee9cbd3b1294","Type":"ContainerStarted","Data":"85e9f7c1f1f9b2bdcd786f7ba6a1e5c131bf9fcc93fd2bd3d544a0155cccb464"} Mar 20 15:45:45 crc kubenswrapper[4779]: I0320 15:45:45.717676 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5bfc6968f8-nhbpx" podStartSLOduration=2.717660155 podStartE2EDuration="2.717660155s" podCreationTimestamp="2026-03-20 15:45:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:45:45.712807585 +0000 UTC m=+1362.675323395" watchObservedRunningTime="2026-03-20 15:45:45.717660155 +0000 UTC m=+1362.680175955" Mar 20 15:45:45 crc kubenswrapper[4779]: I0320 15:45:45.758328 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/barbican-worker-57f6557d6c-cps7z" podStartSLOduration=3.22449622 podStartE2EDuration="6.757732155s" podCreationTimestamp="2026-03-20 15:45:39 +0000 UTC" firstStartedPulling="2026-03-20 15:45:40.753703752 +0000 UTC m=+1357.716219552" lastFinishedPulling="2026-03-20 15:45:44.286939687 +0000 UTC m=+1361.249455487" observedRunningTime="2026-03-20 15:45:45.746032817 +0000 UTC m=+1362.708548617" watchObservedRunningTime="2026-03-20 15:45:45.757732155 +0000 UTC m=+1362.720247975" Mar 20 15:45:45 crc kubenswrapper[4779]: I0320 15:45:45.783611 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-557b5d954b-f5w4x" podStartSLOduration=3.649930868 podStartE2EDuration="6.783584044s" podCreationTimestamp="2026-03-20 15:45:39 +0000 UTC" firstStartedPulling="2026-03-20 15:45:41.152291729 +0000 UTC m=+1358.114807529" lastFinishedPulling="2026-03-20 15:45:44.285944915 +0000 UTC m=+1361.248460705" observedRunningTime="2026-03-20 15:45:45.77090154 +0000 UTC m=+1362.733417340" watchObservedRunningTime="2026-03-20 15:45:45.783584044 +0000 UTC m=+1362.746099844" Mar 20 15:45:48 crc kubenswrapper[4779]: I0320 15:45:48.564472 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-86f74cdcdf-tmrx6" Mar 20 15:45:49 crc kubenswrapper[4779]: I0320 15:45:49.705983 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Mar 20 15:45:49 crc kubenswrapper[4779]: I0320 15:45:49.711647 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Mar 20 15:45:50 crc kubenswrapper[4779]: I0320 15:45:50.015077 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-wjcm6" Mar 20 15:45:50 crc kubenswrapper[4779]: I0320 15:45:50.073906 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/130a37da-c17e-48dd-8712-f87c67f01852-etc-machine-id\") pod \"130a37da-c17e-48dd-8712-f87c67f01852\" (UID: \"130a37da-c17e-48dd-8712-f87c67f01852\") " Mar 20 15:45:50 crc kubenswrapper[4779]: I0320 15:45:50.073962 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/130a37da-c17e-48dd-8712-f87c67f01852-db-sync-config-data\") pod \"130a37da-c17e-48dd-8712-f87c67f01852\" (UID: \"130a37da-c17e-48dd-8712-f87c67f01852\") " Mar 20 15:45:50 crc kubenswrapper[4779]: I0320 15:45:50.073986 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/130a37da-c17e-48dd-8712-f87c67f01852-scripts\") pod \"130a37da-c17e-48dd-8712-f87c67f01852\" (UID: \"130a37da-c17e-48dd-8712-f87c67f01852\") " Mar 20 15:45:50 crc kubenswrapper[4779]: I0320 15:45:50.074074 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/130a37da-c17e-48dd-8712-f87c67f01852-combined-ca-bundle\") pod \"130a37da-c17e-48dd-8712-f87c67f01852\" (UID: \"130a37da-c17e-48dd-8712-f87c67f01852\") " Mar 20 15:45:50 crc kubenswrapper[4779]: I0320 15:45:50.074066 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/130a37da-c17e-48dd-8712-f87c67f01852-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "130a37da-c17e-48dd-8712-f87c67f01852" (UID: "130a37da-c17e-48dd-8712-f87c67f01852"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:45:50 crc kubenswrapper[4779]: I0320 15:45:50.074150 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p64mw\" (UniqueName: \"kubernetes.io/projected/130a37da-c17e-48dd-8712-f87c67f01852-kube-api-access-p64mw\") pod \"130a37da-c17e-48dd-8712-f87c67f01852\" (UID: \"130a37da-c17e-48dd-8712-f87c67f01852\") " Mar 20 15:45:50 crc kubenswrapper[4779]: I0320 15:45:50.074184 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/130a37da-c17e-48dd-8712-f87c67f01852-config-data\") pod \"130a37da-c17e-48dd-8712-f87c67f01852\" (UID: \"130a37da-c17e-48dd-8712-f87c67f01852\") " Mar 20 15:45:50 crc kubenswrapper[4779]: I0320 15:45:50.074557 4779 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/130a37da-c17e-48dd-8712-f87c67f01852-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:50 crc kubenswrapper[4779]: I0320 15:45:50.085310 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/130a37da-c17e-48dd-8712-f87c67f01852-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "130a37da-c17e-48dd-8712-f87c67f01852" (UID: "130a37da-c17e-48dd-8712-f87c67f01852"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:45:50 crc kubenswrapper[4779]: I0320 15:45:50.094092 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/130a37da-c17e-48dd-8712-f87c67f01852-kube-api-access-p64mw" (OuterVolumeSpecName: "kube-api-access-p64mw") pod "130a37da-c17e-48dd-8712-f87c67f01852" (UID: "130a37da-c17e-48dd-8712-f87c67f01852"). InnerVolumeSpecName "kube-api-access-p64mw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:45:50 crc kubenswrapper[4779]: I0320 15:45:50.096342 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/130a37da-c17e-48dd-8712-f87c67f01852-scripts" (OuterVolumeSpecName: "scripts") pod "130a37da-c17e-48dd-8712-f87c67f01852" (UID: "130a37da-c17e-48dd-8712-f87c67f01852"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:45:50 crc kubenswrapper[4779]: I0320 15:45:50.124310 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/130a37da-c17e-48dd-8712-f87c67f01852-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "130a37da-c17e-48dd-8712-f87c67f01852" (UID: "130a37da-c17e-48dd-8712-f87c67f01852"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:45:50 crc kubenswrapper[4779]: I0320 15:45:50.176337 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p64mw\" (UniqueName: \"kubernetes.io/projected/130a37da-c17e-48dd-8712-f87c67f01852-kube-api-access-p64mw\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:50 crc kubenswrapper[4779]: I0320 15:45:50.176937 4779 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/130a37da-c17e-48dd-8712-f87c67f01852-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:50 crc kubenswrapper[4779]: I0320 15:45:50.177046 4779 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/130a37da-c17e-48dd-8712-f87c67f01852-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:50 crc kubenswrapper[4779]: I0320 15:45:50.177108 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/130a37da-c17e-48dd-8712-f87c67f01852-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" 
Mar 20 15:45:50 crc kubenswrapper[4779]: I0320 15:45:50.188224 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/130a37da-c17e-48dd-8712-f87c67f01852-config-data" (OuterVolumeSpecName: "config-data") pod "130a37da-c17e-48dd-8712-f87c67f01852" (UID: "130a37da-c17e-48dd-8712-f87c67f01852"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:45:50 crc kubenswrapper[4779]: I0320 15:45:50.279703 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/130a37da-c17e-48dd-8712-f87c67f01852-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:50 crc kubenswrapper[4779]: I0320 15:45:50.573365 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75c8ddd69c-lg5dh" Mar 20 15:45:50 crc kubenswrapper[4779]: I0320 15:45:50.657485 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-gmd9z"] Mar 20 15:45:50 crc kubenswrapper[4779]: I0320 15:45:50.657765 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84b966f6c9-gmd9z" podUID="629705fe-cbdc-4366-96ef-045d9d3c44dd" containerName="dnsmasq-dns" containerID="cri-o://4edde08223c92751762c930b9ef9f300f906c3d84cb049370eaa0b6231e0bbe6" gracePeriod=10 Mar 20 15:45:50 crc kubenswrapper[4779]: I0320 15:45:50.836367 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wjcm6" event={"ID":"130a37da-c17e-48dd-8712-f87c67f01852","Type":"ContainerDied","Data":"cc54b62ea5e44d7f19be551b756bd3f21fa59e99f887e3b037197c9ad779fff9"} Mar 20 15:45:50 crc kubenswrapper[4779]: I0320 15:45:50.836409 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc54b62ea5e44d7f19be551b756bd3f21fa59e99f887e3b037197c9ad779fff9" Mar 20 15:45:50 crc kubenswrapper[4779]: I0320 15:45:50.836489 4779 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-wjcm6" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.305883 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 15:45:51 crc kubenswrapper[4779]: E0320 15:45:51.308464 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="130a37da-c17e-48dd-8712-f87c67f01852" containerName="cinder-db-sync" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.308488 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="130a37da-c17e-48dd-8712-f87c67f01852" containerName="cinder-db-sync" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.308693 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="130a37da-c17e-48dd-8712-f87c67f01852" containerName="cinder-db-sync" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.318318 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.329018 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-c9kbg" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.329247 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.329277 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.329903 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.349581 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.389053 4779 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-5784cf869f-w6689"] Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.390780 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-w6689"] Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.390872 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-w6689" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.419226 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07fedf37-d83d-4c66-96f7-8b9dab543e45-dns-svc\") pod \"dnsmasq-dns-5784cf869f-w6689\" (UID: \"07fedf37-d83d-4c66-96f7-8b9dab543e45\") " pod="openstack/dnsmasq-dns-5784cf869f-w6689" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.419299 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07fedf37-d83d-4c66-96f7-8b9dab543e45-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-w6689\" (UID: \"07fedf37-d83d-4c66-96f7-8b9dab543e45\") " pod="openstack/dnsmasq-dns-5784cf869f-w6689" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.419324 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxmjj\" (UniqueName: \"kubernetes.io/projected/fc79e3f4-3c5e-49d3-a560-3a67366b7fcf-kube-api-access-pxmjj\") pod \"cinder-scheduler-0\" (UID: \"fc79e3f4-3c5e-49d3-a560-3a67366b7fcf\") " pod="openstack/cinder-scheduler-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.419342 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc79e3f4-3c5e-49d3-a560-3a67366b7fcf-scripts\") pod \"cinder-scheduler-0\" (UID: \"fc79e3f4-3c5e-49d3-a560-3a67366b7fcf\") " pod="openstack/cinder-scheduler-0" Mar 20 15:45:51 crc 
kubenswrapper[4779]: I0320 15:45:51.419367 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc79e3f4-3c5e-49d3-a560-3a67366b7fcf-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fc79e3f4-3c5e-49d3-a560-3a67366b7fcf\") " pod="openstack/cinder-scheduler-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.419393 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07fedf37-d83d-4c66-96f7-8b9dab543e45-config\") pod \"dnsmasq-dns-5784cf869f-w6689\" (UID: \"07fedf37-d83d-4c66-96f7-8b9dab543e45\") " pod="openstack/dnsmasq-dns-5784cf869f-w6689" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.419448 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07fedf37-d83d-4c66-96f7-8b9dab543e45-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-w6689\" (UID: \"07fedf37-d83d-4c66-96f7-8b9dab543e45\") " pod="openstack/dnsmasq-dns-5784cf869f-w6689" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.419464 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc79e3f4-3c5e-49d3-a560-3a67366b7fcf-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fc79e3f4-3c5e-49d3-a560-3a67366b7fcf\") " pod="openstack/cinder-scheduler-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.419488 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07fedf37-d83d-4c66-96f7-8b9dab543e45-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-w6689\" (UID: \"07fedf37-d83d-4c66-96f7-8b9dab543e45\") " pod="openstack/dnsmasq-dns-5784cf869f-w6689" Mar 20 15:45:51 
crc kubenswrapper[4779]: I0320 15:45:51.419505 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc79e3f4-3c5e-49d3-a560-3a67366b7fcf-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fc79e3f4-3c5e-49d3-a560-3a67366b7fcf\") " pod="openstack/cinder-scheduler-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.419538 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc79e3f4-3c5e-49d3-a560-3a67366b7fcf-config-data\") pod \"cinder-scheduler-0\" (UID: \"fc79e3f4-3c5e-49d3-a560-3a67366b7fcf\") " pod="openstack/cinder-scheduler-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.419561 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlx4m\" (UniqueName: \"kubernetes.io/projected/07fedf37-d83d-4c66-96f7-8b9dab543e45-kube-api-access-tlx4m\") pod \"dnsmasq-dns-5784cf869f-w6689\" (UID: \"07fedf37-d83d-4c66-96f7-8b9dab543e45\") " pod="openstack/dnsmasq-dns-5784cf869f-w6689" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.536600 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07fedf37-d83d-4c66-96f7-8b9dab543e45-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-w6689\" (UID: \"07fedf37-d83d-4c66-96f7-8b9dab543e45\") " pod="openstack/dnsmasq-dns-5784cf869f-w6689" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.536647 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxmjj\" (UniqueName: \"kubernetes.io/projected/fc79e3f4-3c5e-49d3-a560-3a67366b7fcf-kube-api-access-pxmjj\") pod \"cinder-scheduler-0\" (UID: \"fc79e3f4-3c5e-49d3-a560-3a67366b7fcf\") " pod="openstack/cinder-scheduler-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 
15:45:51.536661 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc79e3f4-3c5e-49d3-a560-3a67366b7fcf-scripts\") pod \"cinder-scheduler-0\" (UID: \"fc79e3f4-3c5e-49d3-a560-3a67366b7fcf\") " pod="openstack/cinder-scheduler-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.536685 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc79e3f4-3c5e-49d3-a560-3a67366b7fcf-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fc79e3f4-3c5e-49d3-a560-3a67366b7fcf\") " pod="openstack/cinder-scheduler-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.536707 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07fedf37-d83d-4c66-96f7-8b9dab543e45-config\") pod \"dnsmasq-dns-5784cf869f-w6689\" (UID: \"07fedf37-d83d-4c66-96f7-8b9dab543e45\") " pod="openstack/dnsmasq-dns-5784cf869f-w6689" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.536759 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07fedf37-d83d-4c66-96f7-8b9dab543e45-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-w6689\" (UID: \"07fedf37-d83d-4c66-96f7-8b9dab543e45\") " pod="openstack/dnsmasq-dns-5784cf869f-w6689" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.536777 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc79e3f4-3c5e-49d3-a560-3a67366b7fcf-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fc79e3f4-3c5e-49d3-a560-3a67366b7fcf\") " pod="openstack/cinder-scheduler-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.536799 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/07fedf37-d83d-4c66-96f7-8b9dab543e45-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-w6689\" (UID: \"07fedf37-d83d-4c66-96f7-8b9dab543e45\") " pod="openstack/dnsmasq-dns-5784cf869f-w6689" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.536821 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc79e3f4-3c5e-49d3-a560-3a67366b7fcf-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fc79e3f4-3c5e-49d3-a560-3a67366b7fcf\") " pod="openstack/cinder-scheduler-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.536856 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc79e3f4-3c5e-49d3-a560-3a67366b7fcf-config-data\") pod \"cinder-scheduler-0\" (UID: \"fc79e3f4-3c5e-49d3-a560-3a67366b7fcf\") " pod="openstack/cinder-scheduler-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.536877 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlx4m\" (UniqueName: \"kubernetes.io/projected/07fedf37-d83d-4c66-96f7-8b9dab543e45-kube-api-access-tlx4m\") pod \"dnsmasq-dns-5784cf869f-w6689\" (UID: \"07fedf37-d83d-4c66-96f7-8b9dab543e45\") " pod="openstack/dnsmasq-dns-5784cf869f-w6689" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.536948 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07fedf37-d83d-4c66-96f7-8b9dab543e45-dns-svc\") pod \"dnsmasq-dns-5784cf869f-w6689\" (UID: \"07fedf37-d83d-4c66-96f7-8b9dab543e45\") " pod="openstack/dnsmasq-dns-5784cf869f-w6689" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.538249 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07fedf37-d83d-4c66-96f7-8b9dab543e45-ovsdbserver-sb\") pod 
\"dnsmasq-dns-5784cf869f-w6689\" (UID: \"07fedf37-d83d-4c66-96f7-8b9dab543e45\") " pod="openstack/dnsmasq-dns-5784cf869f-w6689" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.538624 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07fedf37-d83d-4c66-96f7-8b9dab543e45-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-w6689\" (UID: \"07fedf37-d83d-4c66-96f7-8b9dab543e45\") " pod="openstack/dnsmasq-dns-5784cf869f-w6689" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.538674 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc79e3f4-3c5e-49d3-a560-3a67366b7fcf-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fc79e3f4-3c5e-49d3-a560-3a67366b7fcf\") " pod="openstack/cinder-scheduler-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.538802 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07fedf37-d83d-4c66-96f7-8b9dab543e45-config\") pod \"dnsmasq-dns-5784cf869f-w6689\" (UID: \"07fedf37-d83d-4c66-96f7-8b9dab543e45\") " pod="openstack/dnsmasq-dns-5784cf869f-w6689" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.538853 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07fedf37-d83d-4c66-96f7-8b9dab543e45-dns-svc\") pod \"dnsmasq-dns-5784cf869f-w6689\" (UID: \"07fedf37-d83d-4c66-96f7-8b9dab543e45\") " pod="openstack/dnsmasq-dns-5784cf869f-w6689" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.539351 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07fedf37-d83d-4c66-96f7-8b9dab543e45-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-w6689\" (UID: \"07fedf37-d83d-4c66-96f7-8b9dab543e45\") " pod="openstack/dnsmasq-dns-5784cf869f-w6689" Mar 20 
15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.547846 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc79e3f4-3c5e-49d3-a560-3a67366b7fcf-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fc79e3f4-3c5e-49d3-a560-3a67366b7fcf\") " pod="openstack/cinder-scheduler-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.547920 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc79e3f4-3c5e-49d3-a560-3a67366b7fcf-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fc79e3f4-3c5e-49d3-a560-3a67366b7fcf\") " pod="openstack/cinder-scheduler-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.573430 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlx4m\" (UniqueName: \"kubernetes.io/projected/07fedf37-d83d-4c66-96f7-8b9dab543e45-kube-api-access-tlx4m\") pod \"dnsmasq-dns-5784cf869f-w6689\" (UID: \"07fedf37-d83d-4c66-96f7-8b9dab543e45\") " pod="openstack/dnsmasq-dns-5784cf869f-w6689" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.574543 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc79e3f4-3c5e-49d3-a560-3a67366b7fcf-scripts\") pod \"cinder-scheduler-0\" (UID: \"fc79e3f4-3c5e-49d3-a560-3a67366b7fcf\") " pod="openstack/cinder-scheduler-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.574864 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxmjj\" (UniqueName: \"kubernetes.io/projected/fc79e3f4-3c5e-49d3-a560-3a67366b7fcf-kube-api-access-pxmjj\") pod \"cinder-scheduler-0\" (UID: \"fc79e3f4-3c5e-49d3-a560-3a67366b7fcf\") " pod="openstack/cinder-scheduler-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.583975 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/fc79e3f4-3c5e-49d3-a560-3a67366b7fcf-config-data\") pod \"cinder-scheduler-0\" (UID: \"fc79e3f4-3c5e-49d3-a560-3a67366b7fcf\") " pod="openstack/cinder-scheduler-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.674729 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.675489 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.678763 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.685730 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.703442 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.731255 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-w6689" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.744312 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tc9s\" (UniqueName: \"kubernetes.io/projected/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-kube-api-access-4tc9s\") pod \"cinder-api-0\" (UID: \"f71f70d5-e85b-44d0-8a7b-7788ab4b0895\") " pod="openstack/cinder-api-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.744412 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f71f70d5-e85b-44d0-8a7b-7788ab4b0895\") " pod="openstack/cinder-api-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.744462 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f71f70d5-e85b-44d0-8a7b-7788ab4b0895\") " pod="openstack/cinder-api-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.744476 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-config-data-custom\") pod \"cinder-api-0\" (UID: \"f71f70d5-e85b-44d0-8a7b-7788ab4b0895\") " pod="openstack/cinder-api-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.744506 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-scripts\") pod \"cinder-api-0\" (UID: \"f71f70d5-e85b-44d0-8a7b-7788ab4b0895\") " pod="openstack/cinder-api-0" Mar 20 15:45:51 crc kubenswrapper[4779]: 
I0320 15:45:51.744531 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-logs\") pod \"cinder-api-0\" (UID: \"f71f70d5-e85b-44d0-8a7b-7788ab4b0895\") " pod="openstack/cinder-api-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.744547 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-config-data\") pod \"cinder-api-0\" (UID: \"f71f70d5-e85b-44d0-8a7b-7788ab4b0895\") " pod="openstack/cinder-api-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.843077 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-gmd9z" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.847608 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tc9s\" (UniqueName: \"kubernetes.io/projected/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-kube-api-access-4tc9s\") pod \"cinder-api-0\" (UID: \"f71f70d5-e85b-44d0-8a7b-7788ab4b0895\") " pod="openstack/cinder-api-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.847790 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f71f70d5-e85b-44d0-8a7b-7788ab4b0895\") " pod="openstack/cinder-api-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.847889 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f71f70d5-e85b-44d0-8a7b-7788ab4b0895\") " pod="openstack/cinder-api-0" Mar 20 15:45:51 crc 
kubenswrapper[4779]: I0320 15:45:51.847912 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-config-data-custom\") pod \"cinder-api-0\" (UID: \"f71f70d5-e85b-44d0-8a7b-7788ab4b0895\") " pod="openstack/cinder-api-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.847969 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-scripts\") pod \"cinder-api-0\" (UID: \"f71f70d5-e85b-44d0-8a7b-7788ab4b0895\") " pod="openstack/cinder-api-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.847999 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-logs\") pod \"cinder-api-0\" (UID: \"f71f70d5-e85b-44d0-8a7b-7788ab4b0895\") " pod="openstack/cinder-api-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.848021 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-config-data\") pod \"cinder-api-0\" (UID: \"f71f70d5-e85b-44d0-8a7b-7788ab4b0895\") " pod="openstack/cinder-api-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.850719 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-logs\") pod \"cinder-api-0\" (UID: \"f71f70d5-e85b-44d0-8a7b-7788ab4b0895\") " pod="openstack/cinder-api-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.850744 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"f71f70d5-e85b-44d0-8a7b-7788ab4b0895\") " pod="openstack/cinder-api-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.863090 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f71f70d5-e85b-44d0-8a7b-7788ab4b0895\") " pod="openstack/cinder-api-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.866776 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-config-data-custom\") pod \"cinder-api-0\" (UID: \"f71f70d5-e85b-44d0-8a7b-7788ab4b0895\") " pod="openstack/cinder-api-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.868642 4779 generic.go:334] "Generic (PLEG): container finished" podID="629705fe-cbdc-4366-96ef-045d9d3c44dd" containerID="4edde08223c92751762c930b9ef9f300f906c3d84cb049370eaa0b6231e0bbe6" exitCode=0 Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.868694 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-gmd9z" event={"ID":"629705fe-cbdc-4366-96ef-045d9d3c44dd","Type":"ContainerDied","Data":"4edde08223c92751762c930b9ef9f300f906c3d84cb049370eaa0b6231e0bbe6"} Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.868721 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-gmd9z" event={"ID":"629705fe-cbdc-4366-96ef-045d9d3c44dd","Type":"ContainerDied","Data":"3460d7a9322b651e4389627e8272335d39b481e9c2c5b03e0c7f999f7df2333d"} Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.868738 4779 scope.go:117] "RemoveContainer" containerID="4edde08223c92751762c930b9ef9f300f906c3d84cb049370eaa0b6231e0bbe6" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.868887 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-gmd9z" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.870965 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-scripts\") pod \"cinder-api-0\" (UID: \"f71f70d5-e85b-44d0-8a7b-7788ab4b0895\") " pod="openstack/cinder-api-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.872008 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-config-data\") pod \"cinder-api-0\" (UID: \"f71f70d5-e85b-44d0-8a7b-7788ab4b0895\") " pod="openstack/cinder-api-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.896737 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tc9s\" (UniqueName: \"kubernetes.io/projected/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-kube-api-access-4tc9s\") pod \"cinder-api-0\" (UID: \"f71f70d5-e85b-44d0-8a7b-7788ab4b0895\") " pod="openstack/cinder-api-0" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.904137 4779 scope.go:117] "RemoveContainer" containerID="8e76cf7536100978085cfe580511576f462163726e8b322aafc23fc27b43cbb5" Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.949593 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ln4r\" (UniqueName: \"kubernetes.io/projected/629705fe-cbdc-4366-96ef-045d9d3c44dd-kube-api-access-7ln4r\") pod \"629705fe-cbdc-4366-96ef-045d9d3c44dd\" (UID: \"629705fe-cbdc-4366-96ef-045d9d3c44dd\") " Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.949762 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/629705fe-cbdc-4366-96ef-045d9d3c44dd-config\") pod \"629705fe-cbdc-4366-96ef-045d9d3c44dd\" (UID: \"629705fe-cbdc-4366-96ef-045d9d3c44dd\") " Mar 20 
15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.949796 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/629705fe-cbdc-4366-96ef-045d9d3c44dd-ovsdbserver-nb\") pod \"629705fe-cbdc-4366-96ef-045d9d3c44dd\" (UID: \"629705fe-cbdc-4366-96ef-045d9d3c44dd\") " Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.949832 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/629705fe-cbdc-4366-96ef-045d9d3c44dd-ovsdbserver-sb\") pod \"629705fe-cbdc-4366-96ef-045d9d3c44dd\" (UID: \"629705fe-cbdc-4366-96ef-045d9d3c44dd\") " Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.949859 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/629705fe-cbdc-4366-96ef-045d9d3c44dd-dns-swift-storage-0\") pod \"629705fe-cbdc-4366-96ef-045d9d3c44dd\" (UID: \"629705fe-cbdc-4366-96ef-045d9d3c44dd\") " Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.949907 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/629705fe-cbdc-4366-96ef-045d9d3c44dd-dns-svc\") pod \"629705fe-cbdc-4366-96ef-045d9d3c44dd\" (UID: \"629705fe-cbdc-4366-96ef-045d9d3c44dd\") " Mar 20 15:45:51 crc kubenswrapper[4779]: I0320 15:45:51.971486 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/629705fe-cbdc-4366-96ef-045d9d3c44dd-kube-api-access-7ln4r" (OuterVolumeSpecName: "kube-api-access-7ln4r") pod "629705fe-cbdc-4366-96ef-045d9d3c44dd" (UID: "629705fe-cbdc-4366-96ef-045d9d3c44dd"). InnerVolumeSpecName "kube-api-access-7ln4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:45:52 crc kubenswrapper[4779]: I0320 15:45:52.037753 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 15:45:52 crc kubenswrapper[4779]: I0320 15:45:52.043859 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/629705fe-cbdc-4366-96ef-045d9d3c44dd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "629705fe-cbdc-4366-96ef-045d9d3c44dd" (UID: "629705fe-cbdc-4366-96ef-045d9d3c44dd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:45:52 crc kubenswrapper[4779]: I0320 15:45:52.054039 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ln4r\" (UniqueName: \"kubernetes.io/projected/629705fe-cbdc-4366-96ef-045d9d3c44dd-kube-api-access-7ln4r\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:52 crc kubenswrapper[4779]: I0320 15:45:52.054068 4779 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/629705fe-cbdc-4366-96ef-045d9d3c44dd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:52 crc kubenswrapper[4779]: I0320 15:45:52.102543 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/629705fe-cbdc-4366-96ef-045d9d3c44dd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "629705fe-cbdc-4366-96ef-045d9d3c44dd" (UID: "629705fe-cbdc-4366-96ef-045d9d3c44dd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:45:52 crc kubenswrapper[4779]: I0320 15:45:52.103891 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/629705fe-cbdc-4366-96ef-045d9d3c44dd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "629705fe-cbdc-4366-96ef-045d9d3c44dd" (UID: "629705fe-cbdc-4366-96ef-045d9d3c44dd"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:45:52 crc kubenswrapper[4779]: I0320 15:45:52.104530 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/629705fe-cbdc-4366-96ef-045d9d3c44dd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "629705fe-cbdc-4366-96ef-045d9d3c44dd" (UID: "629705fe-cbdc-4366-96ef-045d9d3c44dd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:45:52 crc kubenswrapper[4779]: I0320 15:45:52.155664 4779 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/629705fe-cbdc-4366-96ef-045d9d3c44dd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:52 crc kubenswrapper[4779]: I0320 15:45:52.155702 4779 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/629705fe-cbdc-4366-96ef-045d9d3c44dd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:52 crc kubenswrapper[4779]: I0320 15:45:52.155715 4779 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/629705fe-cbdc-4366-96ef-045d9d3c44dd-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:52 crc kubenswrapper[4779]: I0320 15:45:52.179970 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/629705fe-cbdc-4366-96ef-045d9d3c44dd-config" (OuterVolumeSpecName: "config") pod "629705fe-cbdc-4366-96ef-045d9d3c44dd" (UID: "629705fe-cbdc-4366-96ef-045d9d3c44dd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:45:52 crc kubenswrapper[4779]: I0320 15:45:52.257599 4779 scope.go:117] "RemoveContainer" containerID="4edde08223c92751762c930b9ef9f300f906c3d84cb049370eaa0b6231e0bbe6" Mar 20 15:45:52 crc kubenswrapper[4779]: I0320 15:45:52.258620 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/629705fe-cbdc-4366-96ef-045d9d3c44dd-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:52 crc kubenswrapper[4779]: E0320 15:45:52.267588 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4edde08223c92751762c930b9ef9f300f906c3d84cb049370eaa0b6231e0bbe6\": container with ID starting with 4edde08223c92751762c930b9ef9f300f906c3d84cb049370eaa0b6231e0bbe6 not found: ID does not exist" containerID="4edde08223c92751762c930b9ef9f300f906c3d84cb049370eaa0b6231e0bbe6" Mar 20 15:45:52 crc kubenswrapper[4779]: I0320 15:45:52.267636 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4edde08223c92751762c930b9ef9f300f906c3d84cb049370eaa0b6231e0bbe6"} err="failed to get container status \"4edde08223c92751762c930b9ef9f300f906c3d84cb049370eaa0b6231e0bbe6\": rpc error: code = NotFound desc = could not find container \"4edde08223c92751762c930b9ef9f300f906c3d84cb049370eaa0b6231e0bbe6\": container with ID starting with 4edde08223c92751762c930b9ef9f300f906c3d84cb049370eaa0b6231e0bbe6 not found: ID does not exist" Mar 20 15:45:52 crc kubenswrapper[4779]: I0320 15:45:52.267669 4779 scope.go:117] "RemoveContainer" containerID="8e76cf7536100978085cfe580511576f462163726e8b322aafc23fc27b43cbb5" Mar 20 15:45:52 crc kubenswrapper[4779]: E0320 15:45:52.271303 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e76cf7536100978085cfe580511576f462163726e8b322aafc23fc27b43cbb5\": container with 
ID starting with 8e76cf7536100978085cfe580511576f462163726e8b322aafc23fc27b43cbb5 not found: ID does not exist" containerID="8e76cf7536100978085cfe580511576f462163726e8b322aafc23fc27b43cbb5" Mar 20 15:45:52 crc kubenswrapper[4779]: I0320 15:45:52.271341 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e76cf7536100978085cfe580511576f462163726e8b322aafc23fc27b43cbb5"} err="failed to get container status \"8e76cf7536100978085cfe580511576f462163726e8b322aafc23fc27b43cbb5\": rpc error: code = NotFound desc = could not find container \"8e76cf7536100978085cfe580511576f462163726e8b322aafc23fc27b43cbb5\": container with ID starting with 8e76cf7536100978085cfe580511576f462163726e8b322aafc23fc27b43cbb5 not found: ID does not exist" Mar 20 15:45:52 crc kubenswrapper[4779]: I0320 15:45:52.395662 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 15:45:52 crc kubenswrapper[4779]: I0320 15:45:52.404755 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-w6689"] Mar 20 15:45:52 crc kubenswrapper[4779]: I0320 15:45:52.531044 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-gmd9z"] Mar 20 15:45:52 crc kubenswrapper[4779]: I0320 15:45:52.552310 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-gmd9z"] Mar 20 15:45:52 crc kubenswrapper[4779]: I0320 15:45:52.672984 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 15:45:52 crc kubenswrapper[4779]: I0320 15:45:52.890776 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fc79e3f4-3c5e-49d3-a560-3a67366b7fcf","Type":"ContainerStarted","Data":"d0797c3aeab313228c8ba5857cda6322bc07bda33a32e03840595dfbbfd765c9"} Mar 20 15:45:52 crc kubenswrapper[4779]: I0320 15:45:52.893860 4779 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ceilometer-0" event={"ID":"2caab026-6b85-4015-9190-29d8ef94a58f","Type":"ContainerStarted","Data":"add76e1f005987e3456f39bbc59e62069e531c7bbae1999bb2f0d57a0ba6e43b"} Mar 20 15:45:52 crc kubenswrapper[4779]: I0320 15:45:52.894009 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 15:45:52 crc kubenswrapper[4779]: I0320 15:45:52.893982 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2caab026-6b85-4015-9190-29d8ef94a58f" containerName="ceilometer-central-agent" containerID="cri-o://b9138cce36f647241a32181c1827909361881d304cb66a73b0568ffc81e78207" gracePeriod=30 Mar 20 15:45:52 crc kubenswrapper[4779]: I0320 15:45:52.894092 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2caab026-6b85-4015-9190-29d8ef94a58f" containerName="proxy-httpd" containerID="cri-o://add76e1f005987e3456f39bbc59e62069e531c7bbae1999bb2f0d57a0ba6e43b" gracePeriod=30 Mar 20 15:45:52 crc kubenswrapper[4779]: I0320 15:45:52.894151 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2caab026-6b85-4015-9190-29d8ef94a58f" containerName="sg-core" containerID="cri-o://3a416b6e3b56fa65fe127241ccfe0961d7b327d2cea7fac1af288fdec90ddba0" gracePeriod=30 Mar 20 15:45:52 crc kubenswrapper[4779]: I0320 15:45:52.894183 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2caab026-6b85-4015-9190-29d8ef94a58f" containerName="ceilometer-notification-agent" containerID="cri-o://b7438c8703eed8876715f4b09a1a91fc80747e4a1d1da11173a1de34cd6894e0" gracePeriod=30 Mar 20 15:45:52 crc kubenswrapper[4779]: I0320 15:45:52.898216 4779 generic.go:334] "Generic (PLEG): container finished" podID="07fedf37-d83d-4c66-96f7-8b9dab543e45" containerID="0373be9754c9e92f88604eccd132d95ef493ab8316875ca38271501b25050b55" 
exitCode=0 Mar 20 15:45:52 crc kubenswrapper[4779]: I0320 15:45:52.898499 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-w6689" event={"ID":"07fedf37-d83d-4c66-96f7-8b9dab543e45","Type":"ContainerDied","Data":"0373be9754c9e92f88604eccd132d95ef493ab8316875ca38271501b25050b55"} Mar 20 15:45:52 crc kubenswrapper[4779]: I0320 15:45:52.898562 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-w6689" event={"ID":"07fedf37-d83d-4c66-96f7-8b9dab543e45","Type":"ContainerStarted","Data":"9914c8bee9158fcb8bf2860b610e5c2ccb6e7c2b5d662d90d6250d92563c712f"} Mar 20 15:45:52 crc kubenswrapper[4779]: I0320 15:45:52.902566 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f71f70d5-e85b-44d0-8a7b-7788ab4b0895","Type":"ContainerStarted","Data":"50345eeaaa47248ac042eb214a69586be1f0793ae7d189fd9c376352757cf1e6"} Mar 20 15:45:52 crc kubenswrapper[4779]: I0320 15:45:52.941214 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.439320811 podStartE2EDuration="1m7.941194669s" podCreationTimestamp="2026-03-20 15:44:45 +0000 UTC" firstStartedPulling="2026-03-20 15:44:46.97833929 +0000 UTC m=+1303.940855090" lastFinishedPulling="2026-03-20 15:45:51.480213148 +0000 UTC m=+1368.442728948" observedRunningTime="2026-03-20 15:45:52.922823384 +0000 UTC m=+1369.885339184" watchObservedRunningTime="2026-03-20 15:45:52.941194669 +0000 UTC m=+1369.903710469" Mar 20 15:45:53 crc kubenswrapper[4779]: I0320 15:45:53.650520 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5b659b97bb-hbsns" Mar 20 15:45:53 crc kubenswrapper[4779]: I0320 15:45:53.826347 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="629705fe-cbdc-4366-96ef-045d9d3c44dd" path="/var/lib/kubelet/pods/629705fe-cbdc-4366-96ef-045d9d3c44dd/volumes" Mar 20 15:45:54 
crc kubenswrapper[4779]: I0320 15:45:54.006696 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-w6689" event={"ID":"07fedf37-d83d-4c66-96f7-8b9dab543e45","Type":"ContainerStarted","Data":"3c989f1298a779f1a951476723cd69c30850eb7c6bf3b8650d83df1962657960"} Mar 20 15:45:54 crc kubenswrapper[4779]: I0320 15:45:54.008205 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5784cf869f-w6689" Mar 20 15:45:54 crc kubenswrapper[4779]: I0320 15:45:54.035718 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f71f70d5-e85b-44d0-8a7b-7788ab4b0895","Type":"ContainerStarted","Data":"2b61568ea5627a4e110f985611d31ae5f5cf80a99a33166769fead31146b23eb"} Mar 20 15:45:54 crc kubenswrapper[4779]: I0320 15:45:54.038019 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5784cf869f-w6689" podStartSLOduration=3.038002011 podStartE2EDuration="3.038002011s" podCreationTimestamp="2026-03-20 15:45:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:45:54.035798106 +0000 UTC m=+1370.998313906" watchObservedRunningTime="2026-03-20 15:45:54.038002011 +0000 UTC m=+1371.000517811" Mar 20 15:45:54 crc kubenswrapper[4779]: I0320 15:45:54.079970 4779 generic.go:334] "Generic (PLEG): container finished" podID="2caab026-6b85-4015-9190-29d8ef94a58f" containerID="add76e1f005987e3456f39bbc59e62069e531c7bbae1999bb2f0d57a0ba6e43b" exitCode=0 Mar 20 15:45:54 crc kubenswrapper[4779]: I0320 15:45:54.080223 4779 generic.go:334] "Generic (PLEG): container finished" podID="2caab026-6b85-4015-9190-29d8ef94a58f" containerID="3a416b6e3b56fa65fe127241ccfe0961d7b327d2cea7fac1af288fdec90ddba0" exitCode=2 Mar 20 15:45:54 crc kubenswrapper[4779]: I0320 15:45:54.080308 4779 generic.go:334] "Generic (PLEG): container finished" 
podID="2caab026-6b85-4015-9190-29d8ef94a58f" containerID="b9138cce36f647241a32181c1827909361881d304cb66a73b0568ffc81e78207" exitCode=0 Mar 20 15:45:54 crc kubenswrapper[4779]: I0320 15:45:54.080393 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2caab026-6b85-4015-9190-29d8ef94a58f","Type":"ContainerDied","Data":"add76e1f005987e3456f39bbc59e62069e531c7bbae1999bb2f0d57a0ba6e43b"} Mar 20 15:45:54 crc kubenswrapper[4779]: I0320 15:45:54.080475 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2caab026-6b85-4015-9190-29d8ef94a58f","Type":"ContainerDied","Data":"3a416b6e3b56fa65fe127241ccfe0961d7b327d2cea7fac1af288fdec90ddba0"} Mar 20 15:45:54 crc kubenswrapper[4779]: I0320 15:45:54.080564 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2caab026-6b85-4015-9190-29d8ef94a58f","Type":"ContainerDied","Data":"b9138cce36f647241a32181c1827909361881d304cb66a73b0568ffc81e78207"} Mar 20 15:45:54 crc kubenswrapper[4779]: I0320 15:45:54.174575 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5b659b97bb-hbsns" Mar 20 15:45:55 crc kubenswrapper[4779]: I0320 15:45:55.098298 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f71f70d5-e85b-44d0-8a7b-7788ab4b0895","Type":"ContainerStarted","Data":"63c8266b0e308265b1f6b06b572a3391518afa19b93b4b152462979852b7db3f"} Mar 20 15:45:55 crc kubenswrapper[4779]: I0320 15:45:55.098780 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 20 15:45:55 crc kubenswrapper[4779]: I0320 15:45:55.105795 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fc79e3f4-3c5e-49d3-a560-3a67366b7fcf","Type":"ContainerStarted","Data":"c817eb9312c63cb5b9cb410b4377295c6b8fd369c17cde070946823579f7f2fe"} Mar 20 15:45:55 crc 
kubenswrapper[4779]: I0320 15:45:55.126424 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.126403484 podStartE2EDuration="4.126403484s" podCreationTimestamp="2026-03-20 15:45:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:45:55.126380404 +0000 UTC m=+1372.088896214" watchObservedRunningTime="2026-03-20 15:45:55.126403484 +0000 UTC m=+1372.088919274" Mar 20 15:45:55 crc kubenswrapper[4779]: I0320 15:45:55.173454 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 20 15:45:56 crc kubenswrapper[4779]: I0320 15:45:56.115880 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fc79e3f4-3c5e-49d3-a560-3a67366b7fcf","Type":"ContainerStarted","Data":"13f919ad0c0d8fcccdb7c3c08f43c24c7942e2c5dab3f7d24b2a0c9bfdfccf32"} Mar 20 15:45:56 crc kubenswrapper[4779]: I0320 15:45:56.120557 4779 generic.go:334] "Generic (PLEG): container finished" podID="2caab026-6b85-4015-9190-29d8ef94a58f" containerID="b7438c8703eed8876715f4b09a1a91fc80747e4a1d1da11173a1de34cd6894e0" exitCode=0 Mar 20 15:45:56 crc kubenswrapper[4779]: I0320 15:45:56.120588 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2caab026-6b85-4015-9190-29d8ef94a58f","Type":"ContainerDied","Data":"b7438c8703eed8876715f4b09a1a91fc80747e4a1d1da11173a1de34cd6894e0"} Mar 20 15:45:56 crc kubenswrapper[4779]: I0320 15:45:56.120621 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2caab026-6b85-4015-9190-29d8ef94a58f","Type":"ContainerDied","Data":"d652bcb93b77a03aaf8ec73ff8d1dcfaa11075be78c5411f7a73ed46a3c5ec36"} Mar 20 15:45:56 crc kubenswrapper[4779]: I0320 15:45:56.120645 4779 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="d652bcb93b77a03aaf8ec73ff8d1dcfaa11075be78c5411f7a73ed46a3c5ec36" Mar 20 15:45:56 crc kubenswrapper[4779]: I0320 15:45:56.144483 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:45:56 crc kubenswrapper[4779]: I0320 15:45:56.157747 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.96515446 podStartE2EDuration="5.157723169s" podCreationTimestamp="2026-03-20 15:45:51 +0000 UTC" firstStartedPulling="2026-03-20 15:45:52.418267597 +0000 UTC m=+1369.380783397" lastFinishedPulling="2026-03-20 15:45:53.610836306 +0000 UTC m=+1370.573352106" observedRunningTime="2026-03-20 15:45:56.145169268 +0000 UTC m=+1373.107685078" watchObservedRunningTime="2026-03-20 15:45:56.157723169 +0000 UTC m=+1373.120238969" Mar 20 15:45:56 crc kubenswrapper[4779]: I0320 15:45:56.269535 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2caab026-6b85-4015-9190-29d8ef94a58f-run-httpd\") pod \"2caab026-6b85-4015-9190-29d8ef94a58f\" (UID: \"2caab026-6b85-4015-9190-29d8ef94a58f\") " Mar 20 15:45:56 crc kubenswrapper[4779]: I0320 15:45:56.269651 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2caab026-6b85-4015-9190-29d8ef94a58f-combined-ca-bundle\") pod \"2caab026-6b85-4015-9190-29d8ef94a58f\" (UID: \"2caab026-6b85-4015-9190-29d8ef94a58f\") " Mar 20 15:45:56 crc kubenswrapper[4779]: I0320 15:45:56.269698 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2caab026-6b85-4015-9190-29d8ef94a58f-config-data\") pod \"2caab026-6b85-4015-9190-29d8ef94a58f\" (UID: \"2caab026-6b85-4015-9190-29d8ef94a58f\") " Mar 20 15:45:56 crc kubenswrapper[4779]: I0320 15:45:56.269738 4779 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2caab026-6b85-4015-9190-29d8ef94a58f-log-httpd\") pod \"2caab026-6b85-4015-9190-29d8ef94a58f\" (UID: \"2caab026-6b85-4015-9190-29d8ef94a58f\") " Mar 20 15:45:56 crc kubenswrapper[4779]: I0320 15:45:56.269881 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2caab026-6b85-4015-9190-29d8ef94a58f-sg-core-conf-yaml\") pod \"2caab026-6b85-4015-9190-29d8ef94a58f\" (UID: \"2caab026-6b85-4015-9190-29d8ef94a58f\") " Mar 20 15:45:56 crc kubenswrapper[4779]: I0320 15:45:56.269907 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqjtr\" (UniqueName: \"kubernetes.io/projected/2caab026-6b85-4015-9190-29d8ef94a58f-kube-api-access-qqjtr\") pod \"2caab026-6b85-4015-9190-29d8ef94a58f\" (UID: \"2caab026-6b85-4015-9190-29d8ef94a58f\") " Mar 20 15:45:56 crc kubenswrapper[4779]: I0320 15:45:56.269941 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2caab026-6b85-4015-9190-29d8ef94a58f-scripts\") pod \"2caab026-6b85-4015-9190-29d8ef94a58f\" (UID: \"2caab026-6b85-4015-9190-29d8ef94a58f\") " Mar 20 15:45:56 crc kubenswrapper[4779]: I0320 15:45:56.271997 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2caab026-6b85-4015-9190-29d8ef94a58f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2caab026-6b85-4015-9190-29d8ef94a58f" (UID: "2caab026-6b85-4015-9190-29d8ef94a58f"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:45:56 crc kubenswrapper[4779]: I0320 15:45:56.287599 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2caab026-6b85-4015-9190-29d8ef94a58f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2caab026-6b85-4015-9190-29d8ef94a58f" (UID: "2caab026-6b85-4015-9190-29d8ef94a58f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:45:56 crc kubenswrapper[4779]: I0320 15:45:56.363567 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2caab026-6b85-4015-9190-29d8ef94a58f-scripts" (OuterVolumeSpecName: "scripts") pod "2caab026-6b85-4015-9190-29d8ef94a58f" (UID: "2caab026-6b85-4015-9190-29d8ef94a58f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:45:56 crc kubenswrapper[4779]: I0320 15:45:56.374483 4779 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2caab026-6b85-4015-9190-29d8ef94a58f-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:56 crc kubenswrapper[4779]: I0320 15:45:56.374513 4779 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2caab026-6b85-4015-9190-29d8ef94a58f-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:56 crc kubenswrapper[4779]: I0320 15:45:56.374524 4779 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2caab026-6b85-4015-9190-29d8ef94a58f-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:56 crc kubenswrapper[4779]: I0320 15:45:56.376366 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2caab026-6b85-4015-9190-29d8ef94a58f-kube-api-access-qqjtr" (OuterVolumeSpecName: "kube-api-access-qqjtr") pod "2caab026-6b85-4015-9190-29d8ef94a58f" (UID: 
"2caab026-6b85-4015-9190-29d8ef94a58f"). InnerVolumeSpecName "kube-api-access-qqjtr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:45:56 crc kubenswrapper[4779]: I0320 15:45:56.416431 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2caab026-6b85-4015-9190-29d8ef94a58f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2caab026-6b85-4015-9190-29d8ef94a58f" (UID: "2caab026-6b85-4015-9190-29d8ef94a58f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:45:56 crc kubenswrapper[4779]: I0320 15:45:56.476043 4779 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2caab026-6b85-4015-9190-29d8ef94a58f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:56 crc kubenswrapper[4779]: I0320 15:45:56.476078 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqjtr\" (UniqueName: \"kubernetes.io/projected/2caab026-6b85-4015-9190-29d8ef94a58f-kube-api-access-qqjtr\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:56 crc kubenswrapper[4779]: I0320 15:45:56.497004 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2caab026-6b85-4015-9190-29d8ef94a58f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2caab026-6b85-4015-9190-29d8ef94a58f" (UID: "2caab026-6b85-4015-9190-29d8ef94a58f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:45:56 crc kubenswrapper[4779]: I0320 15:45:56.499327 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2caab026-6b85-4015-9190-29d8ef94a58f-config-data" (OuterVolumeSpecName: "config-data") pod "2caab026-6b85-4015-9190-29d8ef94a58f" (UID: "2caab026-6b85-4015-9190-29d8ef94a58f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:45:56 crc kubenswrapper[4779]: I0320 15:45:56.540333 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5bfc6968f8-nhbpx" Mar 20 15:45:56 crc kubenswrapper[4779]: I0320 15:45:56.577505 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2caab026-6b85-4015-9190-29d8ef94a58f-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:56 crc kubenswrapper[4779]: I0320 15:45:56.577556 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2caab026-6b85-4015-9190-29d8ef94a58f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:56 crc kubenswrapper[4779]: I0320 15:45:56.669543 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5bfc6968f8-nhbpx" Mar 20 15:45:56 crc kubenswrapper[4779]: I0320 15:45:56.675676 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 20 15:45:56 crc kubenswrapper[4779]: I0320 15:45:56.767543 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5b659b97bb-hbsns"] Mar 20 15:45:56 crc kubenswrapper[4779]: I0320 15:45:56.768088 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5b659b97bb-hbsns" podUID="0aae8b0d-a776-48ba-beae-55f8d81a4074" containerName="barbican-api-log" containerID="cri-o://08bfedecd9168cabe8870c47d041b6276c23b1aef2227e86484d55db2845502e" gracePeriod=30 Mar 20 15:45:56 crc kubenswrapper[4779]: I0320 15:45:56.768258 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5b659b97bb-hbsns" podUID="0aae8b0d-a776-48ba-beae-55f8d81a4074" containerName="barbican-api" 
containerID="cri-o://c1d701c197831000a7f429507bdefa2e82c1e3aaa842b48b1d7d44070fabd766" gracePeriod=30 Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.098038 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.098291 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="84bd8c95-b533-46ea-b1e2-3a53ddaec70c" containerName="watcher-api-log" containerID="cri-o://9b3010172e23c0e2fb22547ed52cdeebe05c5bb0ddf175c2f0118ece00124f45" gracePeriod=30 Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.098427 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="84bd8c95-b533-46ea-b1e2-3a53ddaec70c" containerName="watcher-api" containerID="cri-o://256c73fa67ac5fd0452c6071ac8471ab9f080a3ab4ea4409a79d0c323ee82509" gracePeriod=30 Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.136804 4779 generic.go:334] "Generic (PLEG): container finished" podID="0aae8b0d-a776-48ba-beae-55f8d81a4074" containerID="08bfedecd9168cabe8870c47d041b6276c23b1aef2227e86484d55db2845502e" exitCode=143 Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.137674 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b659b97bb-hbsns" event={"ID":"0aae8b0d-a776-48ba-beae-55f8d81a4074","Type":"ContainerDied","Data":"08bfedecd9168cabe8870c47d041b6276c23b1aef2227e86484d55db2845502e"} Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.138246 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="f71f70d5-e85b-44d0-8a7b-7788ab4b0895" containerName="cinder-api-log" containerID="cri-o://2b61568ea5627a4e110f985611d31ae5f5cf80a99a33166769fead31146b23eb" gracePeriod=30 Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.138534 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.140064 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="f71f70d5-e85b-44d0-8a7b-7788ab4b0895" containerName="cinder-api" containerID="cri-o://63c8266b0e308265b1f6b06b572a3391518afa19b93b4b152462979852b7db3f" gracePeriod=30 Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.201849 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.218584 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.222889 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:45:57 crc kubenswrapper[4779]: E0320 15:45:57.223424 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2caab026-6b85-4015-9190-29d8ef94a58f" containerName="ceilometer-notification-agent" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.223450 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="2caab026-6b85-4015-9190-29d8ef94a58f" containerName="ceilometer-notification-agent" Mar 20 15:45:57 crc kubenswrapper[4779]: E0320 15:45:57.223488 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2caab026-6b85-4015-9190-29d8ef94a58f" containerName="sg-core" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.223497 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="2caab026-6b85-4015-9190-29d8ef94a58f" containerName="sg-core" Mar 20 15:45:57 crc kubenswrapper[4779]: E0320 15:45:57.223516 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="629705fe-cbdc-4366-96ef-045d9d3c44dd" containerName="init" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.223526 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="629705fe-cbdc-4366-96ef-045d9d3c44dd" 
containerName="init" Mar 20 15:45:57 crc kubenswrapper[4779]: E0320 15:45:57.223541 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="629705fe-cbdc-4366-96ef-045d9d3c44dd" containerName="dnsmasq-dns" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.223549 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="629705fe-cbdc-4366-96ef-045d9d3c44dd" containerName="dnsmasq-dns" Mar 20 15:45:57 crc kubenswrapper[4779]: E0320 15:45:57.223570 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2caab026-6b85-4015-9190-29d8ef94a58f" containerName="proxy-httpd" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.223578 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="2caab026-6b85-4015-9190-29d8ef94a58f" containerName="proxy-httpd" Mar 20 15:45:57 crc kubenswrapper[4779]: E0320 15:45:57.223617 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2caab026-6b85-4015-9190-29d8ef94a58f" containerName="ceilometer-central-agent" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.223626 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="2caab026-6b85-4015-9190-29d8ef94a58f" containerName="ceilometer-central-agent" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.223836 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="2caab026-6b85-4015-9190-29d8ef94a58f" containerName="ceilometer-central-agent" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.223860 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="2caab026-6b85-4015-9190-29d8ef94a58f" containerName="ceilometer-notification-agent" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.223873 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="2caab026-6b85-4015-9190-29d8ef94a58f" containerName="proxy-httpd" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.223893 4779 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="629705fe-cbdc-4366-96ef-045d9d3c44dd" containerName="dnsmasq-dns" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.223903 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="2caab026-6b85-4015-9190-29d8ef94a58f" containerName="sg-core" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.226092 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.235619 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.235882 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.236548 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.422177 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e1733660-4123-48e3-a604-7bd5cce44699-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e1733660-4123-48e3-a604-7bd5cce44699\") " pod="openstack/ceilometer-0" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.422260 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1733660-4123-48e3-a604-7bd5cce44699-config-data\") pod \"ceilometer-0\" (UID: \"e1733660-4123-48e3-a604-7bd5cce44699\") " pod="openstack/ceilometer-0" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.422280 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1733660-4123-48e3-a604-7bd5cce44699-scripts\") pod \"ceilometer-0\" (UID: \"e1733660-4123-48e3-a604-7bd5cce44699\") " 
pod="openstack/ceilometer-0" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.422306 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1733660-4123-48e3-a604-7bd5cce44699-log-httpd\") pod \"ceilometer-0\" (UID: \"e1733660-4123-48e3-a604-7bd5cce44699\") " pod="openstack/ceilometer-0" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.422332 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4smw\" (UniqueName: \"kubernetes.io/projected/e1733660-4123-48e3-a604-7bd5cce44699-kube-api-access-k4smw\") pod \"ceilometer-0\" (UID: \"e1733660-4123-48e3-a604-7bd5cce44699\") " pod="openstack/ceilometer-0" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.422375 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1733660-4123-48e3-a604-7bd5cce44699-run-httpd\") pod \"ceilometer-0\" (UID: \"e1733660-4123-48e3-a604-7bd5cce44699\") " pod="openstack/ceilometer-0" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.422401 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1733660-4123-48e3-a604-7bd5cce44699-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e1733660-4123-48e3-a604-7bd5cce44699\") " pod="openstack/ceilometer-0" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.524813 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1733660-4123-48e3-a604-7bd5cce44699-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e1733660-4123-48e3-a604-7bd5cce44699\") " pod="openstack/ceilometer-0" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.525389 4779 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e1733660-4123-48e3-a604-7bd5cce44699-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e1733660-4123-48e3-a604-7bd5cce44699\") " pod="openstack/ceilometer-0" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.525457 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1733660-4123-48e3-a604-7bd5cce44699-config-data\") pod \"ceilometer-0\" (UID: \"e1733660-4123-48e3-a604-7bd5cce44699\") " pod="openstack/ceilometer-0" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.525496 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1733660-4123-48e3-a604-7bd5cce44699-scripts\") pod \"ceilometer-0\" (UID: \"e1733660-4123-48e3-a604-7bd5cce44699\") " pod="openstack/ceilometer-0" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.525534 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1733660-4123-48e3-a604-7bd5cce44699-log-httpd\") pod \"ceilometer-0\" (UID: \"e1733660-4123-48e3-a604-7bd5cce44699\") " pod="openstack/ceilometer-0" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.525565 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4smw\" (UniqueName: \"kubernetes.io/projected/e1733660-4123-48e3-a604-7bd5cce44699-kube-api-access-k4smw\") pod \"ceilometer-0\" (UID: \"e1733660-4123-48e3-a604-7bd5cce44699\") " pod="openstack/ceilometer-0" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.525807 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1733660-4123-48e3-a604-7bd5cce44699-run-httpd\") pod \"ceilometer-0\" (UID: \"e1733660-4123-48e3-a604-7bd5cce44699\") " 
pod="openstack/ceilometer-0" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.526357 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1733660-4123-48e3-a604-7bd5cce44699-run-httpd\") pod \"ceilometer-0\" (UID: \"e1733660-4123-48e3-a604-7bd5cce44699\") " pod="openstack/ceilometer-0" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.526466 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1733660-4123-48e3-a604-7bd5cce44699-log-httpd\") pod \"ceilometer-0\" (UID: \"e1733660-4123-48e3-a604-7bd5cce44699\") " pod="openstack/ceilometer-0" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.532667 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1733660-4123-48e3-a604-7bd5cce44699-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e1733660-4123-48e3-a604-7bd5cce44699\") " pod="openstack/ceilometer-0" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.535783 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1733660-4123-48e3-a604-7bd5cce44699-config-data\") pod \"ceilometer-0\" (UID: \"e1733660-4123-48e3-a604-7bd5cce44699\") " pod="openstack/ceilometer-0" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.536161 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e1733660-4123-48e3-a604-7bd5cce44699-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e1733660-4123-48e3-a604-7bd5cce44699\") " pod="openstack/ceilometer-0" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.541802 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1733660-4123-48e3-a604-7bd5cce44699-scripts\") pod 
\"ceilometer-0\" (UID: \"e1733660-4123-48e3-a604-7bd5cce44699\") " pod="openstack/ceilometer-0" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.552840 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4smw\" (UniqueName: \"kubernetes.io/projected/e1733660-4123-48e3-a604-7bd5cce44699-kube-api-access-k4smw\") pod \"ceilometer-0\" (UID: \"e1733660-4123-48e3-a604-7bd5cce44699\") " pod="openstack/ceilometer-0" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.641421 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:45:57 crc kubenswrapper[4779]: I0320 15:45:57.838296 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2caab026-6b85-4015-9190-29d8ef94a58f" path="/var/lib/kubelet/pods/2caab026-6b85-4015-9190-29d8ef94a58f/volumes" Mar 20 15:45:58 crc kubenswrapper[4779]: I0320 15:45:58.179139 4779 generic.go:334] "Generic (PLEG): container finished" podID="84bd8c95-b533-46ea-b1e2-3a53ddaec70c" containerID="9b3010172e23c0e2fb22547ed52cdeebe05c5bb0ddf175c2f0118ece00124f45" exitCode=143 Mar 20 15:45:58 crc kubenswrapper[4779]: I0320 15:45:58.179203 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"84bd8c95-b533-46ea-b1e2-3a53ddaec70c","Type":"ContainerDied","Data":"9b3010172e23c0e2fb22547ed52cdeebe05c5bb0ddf175c2f0118ece00124f45"} Mar 20 15:45:58 crc kubenswrapper[4779]: I0320 15:45:58.181069 4779 generic.go:334] "Generic (PLEG): container finished" podID="f71f70d5-e85b-44d0-8a7b-7788ab4b0895" containerID="63c8266b0e308265b1f6b06b572a3391518afa19b93b4b152462979852b7db3f" exitCode=0 Mar 20 15:45:58 crc kubenswrapper[4779]: I0320 15:45:58.181084 4779 generic.go:334] "Generic (PLEG): container finished" podID="f71f70d5-e85b-44d0-8a7b-7788ab4b0895" containerID="2b61568ea5627a4e110f985611d31ae5f5cf80a99a33166769fead31146b23eb" exitCode=143 Mar 20 15:45:58 crc kubenswrapper[4779]: I0320 
15:45:58.181878 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f71f70d5-e85b-44d0-8a7b-7788ab4b0895","Type":"ContainerDied","Data":"63c8266b0e308265b1f6b06b572a3391518afa19b93b4b152462979852b7db3f"} Mar 20 15:45:58 crc kubenswrapper[4779]: I0320 15:45:58.181903 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f71f70d5-e85b-44d0-8a7b-7788ab4b0895","Type":"ContainerDied","Data":"2b61568ea5627a4e110f985611d31ae5f5cf80a99a33166769fead31146b23eb"} Mar 20 15:45:58 crc kubenswrapper[4779]: I0320 15:45:58.397460 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 15:45:58 crc kubenswrapper[4779]: W0320 15:45:58.537361 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1733660_4123_48e3_a604_7bd5cce44699.slice/crio-ad1d0d92a8eaeb7f25dc0163502e6733e3a42983398c2af9b346457e3578b34b WatchSource:0}: Error finding container ad1d0d92a8eaeb7f25dc0163502e6733e3a42983398c2af9b346457e3578b34b: Status 404 returned error can't find the container with id ad1d0d92a8eaeb7f25dc0163502e6733e3a42983398c2af9b346457e3578b34b Mar 20 15:45:58 crc kubenswrapper[4779]: I0320 15:45:58.538637 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:45:58 crc kubenswrapper[4779]: I0320 15:45:58.550958 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-config-data-custom\") pod \"f71f70d5-e85b-44d0-8a7b-7788ab4b0895\" (UID: \"f71f70d5-e85b-44d0-8a7b-7788ab4b0895\") " Mar 20 15:45:58 crc kubenswrapper[4779]: I0320 15:45:58.551055 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-etc-machine-id\") pod \"f71f70d5-e85b-44d0-8a7b-7788ab4b0895\" (UID: \"f71f70d5-e85b-44d0-8a7b-7788ab4b0895\") " Mar 20 15:45:58 crc kubenswrapper[4779]: I0320 15:45:58.551176 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-logs\") pod \"f71f70d5-e85b-44d0-8a7b-7788ab4b0895\" (UID: \"f71f70d5-e85b-44d0-8a7b-7788ab4b0895\") " Mar 20 15:45:58 crc kubenswrapper[4779]: I0320 15:45:58.551208 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-scripts\") pod \"f71f70d5-e85b-44d0-8a7b-7788ab4b0895\" (UID: \"f71f70d5-e85b-44d0-8a7b-7788ab4b0895\") " Mar 20 15:45:58 crc kubenswrapper[4779]: I0320 15:45:58.551225 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-config-data\") pod \"f71f70d5-e85b-44d0-8a7b-7788ab4b0895\" (UID: \"f71f70d5-e85b-44d0-8a7b-7788ab4b0895\") " Mar 20 15:45:58 crc kubenswrapper[4779]: I0320 15:45:58.551311 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-combined-ca-bundle\") pod \"f71f70d5-e85b-44d0-8a7b-7788ab4b0895\" (UID: \"f71f70d5-e85b-44d0-8a7b-7788ab4b0895\") " Mar 20 15:45:58 crc kubenswrapper[4779]: I0320 15:45:58.551359 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tc9s\" (UniqueName: \"kubernetes.io/projected/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-kube-api-access-4tc9s\") pod \"f71f70d5-e85b-44d0-8a7b-7788ab4b0895\" (UID: \"f71f70d5-e85b-44d0-8a7b-7788ab4b0895\") " Mar 20 15:45:58 crc kubenswrapper[4779]: I0320 15:45:58.551447 4779 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f71f70d5-e85b-44d0-8a7b-7788ab4b0895" (UID: "f71f70d5-e85b-44d0-8a7b-7788ab4b0895"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:45:58 crc kubenswrapper[4779]: I0320 15:45:58.551760 4779 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:58 crc kubenswrapper[4779]: I0320 15:45:58.551865 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-logs" (OuterVolumeSpecName: "logs") pod "f71f70d5-e85b-44d0-8a7b-7788ab4b0895" (UID: "f71f70d5-e85b-44d0-8a7b-7788ab4b0895"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:45:58 crc kubenswrapper[4779]: I0320 15:45:58.573436 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-kube-api-access-4tc9s" (OuterVolumeSpecName: "kube-api-access-4tc9s") pod "f71f70d5-e85b-44d0-8a7b-7788ab4b0895" (UID: "f71f70d5-e85b-44d0-8a7b-7788ab4b0895"). InnerVolumeSpecName "kube-api-access-4tc9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:45:58 crc kubenswrapper[4779]: I0320 15:45:58.579355 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f71f70d5-e85b-44d0-8a7b-7788ab4b0895" (UID: "f71f70d5-e85b-44d0-8a7b-7788ab4b0895"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:45:58 crc kubenswrapper[4779]: I0320 15:45:58.590509 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-scripts" (OuterVolumeSpecName: "scripts") pod "f71f70d5-e85b-44d0-8a7b-7788ab4b0895" (UID: "f71f70d5-e85b-44d0-8a7b-7788ab4b0895"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:45:58 crc kubenswrapper[4779]: I0320 15:45:58.600241 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f71f70d5-e85b-44d0-8a7b-7788ab4b0895" (UID: "f71f70d5-e85b-44d0-8a7b-7788ab4b0895"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:45:58 crc kubenswrapper[4779]: I0320 15:45:58.632310 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-config-data" (OuterVolumeSpecName: "config-data") pod "f71f70d5-e85b-44d0-8a7b-7788ab4b0895" (UID: "f71f70d5-e85b-44d0-8a7b-7788ab4b0895"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:45:58 crc kubenswrapper[4779]: I0320 15:45:58.657948 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:58 crc kubenswrapper[4779]: I0320 15:45:58.657990 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tc9s\" (UniqueName: \"kubernetes.io/projected/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-kube-api-access-4tc9s\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:58 crc kubenswrapper[4779]: I0320 15:45:58.658000 4779 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:58 crc kubenswrapper[4779]: I0320 15:45:58.658010 4779 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-logs\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:58 crc kubenswrapper[4779]: I0320 15:45:58.658019 4779 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:58 crc kubenswrapper[4779]: I0320 15:45:58.658026 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f71f70d5-e85b-44d0-8a7b-7788ab4b0895-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:58 crc kubenswrapper[4779]: I0320 15:45:58.764066 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7f86b78896-7tfsm" Mar 20 15:45:58 crc kubenswrapper[4779]: I0320 15:45:58.996774 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-dc78779bd-shkrw" Mar 
20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.189590 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1733660-4123-48e3-a604-7bd5cce44699","Type":"ContainerStarted","Data":"ad1d0d92a8eaeb7f25dc0163502e6733e3a42983398c2af9b346457e3578b34b"} Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.192200 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f71f70d5-e85b-44d0-8a7b-7788ab4b0895","Type":"ContainerDied","Data":"50345eeaaa47248ac042eb214a69586be1f0793ae7d189fd9c376352757cf1e6"} Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.192257 4779 scope.go:117] "RemoveContainer" containerID="63c8266b0e308265b1f6b06b572a3391518afa19b93b4b152462979852b7db3f" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.192405 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.228793 4779 scope.go:117] "RemoveContainer" containerID="2b61568ea5627a4e110f985611d31ae5f5cf80a99a33166769fead31146b23eb" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.251081 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.259331 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.294679 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 20 15:45:59 crc kubenswrapper[4779]: E0320 15:45:59.295349 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f71f70d5-e85b-44d0-8a7b-7788ab4b0895" containerName="cinder-api" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.295422 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="f71f70d5-e85b-44d0-8a7b-7788ab4b0895" containerName="cinder-api" Mar 20 15:45:59 crc kubenswrapper[4779]: E0320 
15:45:59.295504 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f71f70d5-e85b-44d0-8a7b-7788ab4b0895" containerName="cinder-api-log" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.295606 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="f71f70d5-e85b-44d0-8a7b-7788ab4b0895" containerName="cinder-api-log" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.295853 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="f71f70d5-e85b-44d0-8a7b-7788ab4b0895" containerName="cinder-api" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.295932 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="f71f70d5-e85b-44d0-8a7b-7788ab4b0895" containerName="cinder-api-log" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.297101 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.300539 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.300549 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.300942 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.321134 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.373218 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee20d921-f263-4089-b0c5-06ebfed15478-logs\") pod \"cinder-api-0\" (UID: \"ee20d921-f263-4089-b0c5-06ebfed15478\") " pod="openstack/cinder-api-0" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.373268 4779 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee20d921-f263-4089-b0c5-06ebfed15478-config-data-custom\") pod \"cinder-api-0\" (UID: \"ee20d921-f263-4089-b0c5-06ebfed15478\") " pod="openstack/cinder-api-0" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.373312 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ee20d921-f263-4089-b0c5-06ebfed15478-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ee20d921-f263-4089-b0c5-06ebfed15478\") " pod="openstack/cinder-api-0" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.373344 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee20d921-f263-4089-b0c5-06ebfed15478-config-data\") pod \"cinder-api-0\" (UID: \"ee20d921-f263-4089-b0c5-06ebfed15478\") " pod="openstack/cinder-api-0" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.373372 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxxht\" (UniqueName: \"kubernetes.io/projected/ee20d921-f263-4089-b0c5-06ebfed15478-kube-api-access-gxxht\") pod \"cinder-api-0\" (UID: \"ee20d921-f263-4089-b0c5-06ebfed15478\") " pod="openstack/cinder-api-0" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.373406 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee20d921-f263-4089-b0c5-06ebfed15478-scripts\") pod \"cinder-api-0\" (UID: \"ee20d921-f263-4089-b0c5-06ebfed15478\") " pod="openstack/cinder-api-0" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.373441 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/ee20d921-f263-4089-b0c5-06ebfed15478-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ee20d921-f263-4089-b0c5-06ebfed15478\") " pod="openstack/cinder-api-0" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.373459 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee20d921-f263-4089-b0c5-06ebfed15478-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ee20d921-f263-4089-b0c5-06ebfed15478\") " pod="openstack/cinder-api-0" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.373479 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee20d921-f263-4089-b0c5-06ebfed15478-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ee20d921-f263-4089-b0c5-06ebfed15478\") " pod="openstack/cinder-api-0" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.475715 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee20d921-f263-4089-b0c5-06ebfed15478-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ee20d921-f263-4089-b0c5-06ebfed15478\") " pod="openstack/cinder-api-0" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.475800 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee20d921-f263-4089-b0c5-06ebfed15478-logs\") pod \"cinder-api-0\" (UID: \"ee20d921-f263-4089-b0c5-06ebfed15478\") " pod="openstack/cinder-api-0" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.475816 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee20d921-f263-4089-b0c5-06ebfed15478-config-data-custom\") pod \"cinder-api-0\" (UID: \"ee20d921-f263-4089-b0c5-06ebfed15478\") " 
pod="openstack/cinder-api-0" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.475854 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ee20d921-f263-4089-b0c5-06ebfed15478-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ee20d921-f263-4089-b0c5-06ebfed15478\") " pod="openstack/cinder-api-0" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.475887 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee20d921-f263-4089-b0c5-06ebfed15478-config-data\") pod \"cinder-api-0\" (UID: \"ee20d921-f263-4089-b0c5-06ebfed15478\") " pod="openstack/cinder-api-0" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.475915 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxxht\" (UniqueName: \"kubernetes.io/projected/ee20d921-f263-4089-b0c5-06ebfed15478-kube-api-access-gxxht\") pod \"cinder-api-0\" (UID: \"ee20d921-f263-4089-b0c5-06ebfed15478\") " pod="openstack/cinder-api-0" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.475951 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee20d921-f263-4089-b0c5-06ebfed15478-scripts\") pod \"cinder-api-0\" (UID: \"ee20d921-f263-4089-b0c5-06ebfed15478\") " pod="openstack/cinder-api-0" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.475979 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee20d921-f263-4089-b0c5-06ebfed15478-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ee20d921-f263-4089-b0c5-06ebfed15478\") " pod="openstack/cinder-api-0" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.475994 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ee20d921-f263-4089-b0c5-06ebfed15478-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ee20d921-f263-4089-b0c5-06ebfed15478\") " pod="openstack/cinder-api-0" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.477575 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ee20d921-f263-4089-b0c5-06ebfed15478-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ee20d921-f263-4089-b0c5-06ebfed15478\") " pod="openstack/cinder-api-0" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.479510 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee20d921-f263-4089-b0c5-06ebfed15478-logs\") pod \"cinder-api-0\" (UID: \"ee20d921-f263-4089-b0c5-06ebfed15478\") " pod="openstack/cinder-api-0" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.489923 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee20d921-f263-4089-b0c5-06ebfed15478-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ee20d921-f263-4089-b0c5-06ebfed15478\") " pod="openstack/cinder-api-0" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.494838 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee20d921-f263-4089-b0c5-06ebfed15478-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ee20d921-f263-4089-b0c5-06ebfed15478\") " pod="openstack/cinder-api-0" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.495619 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee20d921-f263-4089-b0c5-06ebfed15478-scripts\") pod \"cinder-api-0\" (UID: \"ee20d921-f263-4089-b0c5-06ebfed15478\") " pod="openstack/cinder-api-0" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.496126 4779 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee20d921-f263-4089-b0c5-06ebfed15478-config-data-custom\") pod \"cinder-api-0\" (UID: \"ee20d921-f263-4089-b0c5-06ebfed15478\") " pod="openstack/cinder-api-0" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.507048 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee20d921-f263-4089-b0c5-06ebfed15478-config-data\") pod \"cinder-api-0\" (UID: \"ee20d921-f263-4089-b0c5-06ebfed15478\") " pod="openstack/cinder-api-0" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.507875 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee20d921-f263-4089-b0c5-06ebfed15478-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ee20d921-f263-4089-b0c5-06ebfed15478\") " pod="openstack/cinder-api-0" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.525746 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxxht\" (UniqueName: \"kubernetes.io/projected/ee20d921-f263-4089-b0c5-06ebfed15478-kube-api-access-gxxht\") pod \"cinder-api-0\" (UID: \"ee20d921-f263-4089-b0c5-06ebfed15478\") " pod="openstack/cinder-api-0" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.619572 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.847026 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f71f70d5-e85b-44d0-8a7b-7788ab4b0895" path="/var/lib/kubelet/pods/f71f70d5-e85b-44d0-8a7b-7788ab4b0895/volumes" Mar 20 15:45:59 crc kubenswrapper[4779]: I0320 15:45:59.942593 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-854c6767b7-nlddt" Mar 20 15:46:00 crc kubenswrapper[4779]: I0320 15:46:00.011958 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-86f74cdcdf-tmrx6"] Mar 20 15:46:00 crc kubenswrapper[4779]: I0320 15:46:00.013276 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-86f74cdcdf-tmrx6" podUID="9752eda4-2b46-4251-bec7-5cbca2f6fd57" containerName="neutron-api" containerID="cri-o://8e8c1bfb539dfe38c7d7a0504be76c3ab9f7a02428095eac04289b172bc61d31" gracePeriod=30 Mar 20 15:46:00 crc kubenswrapper[4779]: I0320 15:46:00.013718 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-86f74cdcdf-tmrx6" podUID="9752eda4-2b46-4251-bec7-5cbca2f6fd57" containerName="neutron-httpd" containerID="cri-o://e1bd8b8b7b0f4aa151ce9453673b2139df9175d3b89e268569546efe778b1ca3" gracePeriod=30 Mar 20 15:46:00 crc kubenswrapper[4779]: I0320 15:46:00.150314 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567026-95m4d"] Mar 20 15:46:00 crc kubenswrapper[4779]: I0320 15:46:00.171801 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567026-95m4d" Mar 20 15:46:00 crc kubenswrapper[4779]: I0320 15:46:00.175967 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:46:00 crc kubenswrapper[4779]: I0320 15:46:00.176892 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:46:00 crc kubenswrapper[4779]: I0320 15:46:00.177285 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-92t9d" Mar 20 15:46:00 crc kubenswrapper[4779]: I0320 15:46:00.206000 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qqr5\" (UniqueName: \"kubernetes.io/projected/05082fa7-f09c-4834-a745-aae20aad01c2-kube-api-access-8qqr5\") pod \"auto-csr-approver-29567026-95m4d\" (UID: \"05082fa7-f09c-4834-a745-aae20aad01c2\") " pod="openshift-infra/auto-csr-approver-29567026-95m4d" Mar 20 15:46:00 crc kubenswrapper[4779]: I0320 15:46:00.243684 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b659b97bb-hbsns" event={"ID":"0aae8b0d-a776-48ba-beae-55f8d81a4074","Type":"ContainerDied","Data":"c1d701c197831000a7f429507bdefa2e82c1e3aaa842b48b1d7d44070fabd766"} Mar 20 15:46:00 crc kubenswrapper[4779]: I0320 15:46:00.243725 4779 generic.go:334] "Generic (PLEG): container finished" podID="0aae8b0d-a776-48ba-beae-55f8d81a4074" containerID="c1d701c197831000a7f429507bdefa2e82c1e3aaa842b48b1d7d44070fabd766" exitCode=0 Mar 20 15:46:00 crc kubenswrapper[4779]: I0320 15:46:00.244823 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1733660-4123-48e3-a604-7bd5cce44699","Type":"ContainerStarted","Data":"a2c4d40badf7dd445975d9694cf7736dc929029711427667c4fbb33ffa03be89"} Mar 20 15:46:00 crc kubenswrapper[4779]: I0320 15:46:00.249935 4779 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567026-95m4d"] Mar 20 15:46:00 crc kubenswrapper[4779]: I0320 15:46:00.303191 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 15:46:00 crc kubenswrapper[4779]: I0320 15:46:00.309691 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qqr5\" (UniqueName: \"kubernetes.io/projected/05082fa7-f09c-4834-a745-aae20aad01c2-kube-api-access-8qqr5\") pod \"auto-csr-approver-29567026-95m4d\" (UID: \"05082fa7-f09c-4834-a745-aae20aad01c2\") " pod="openshift-infra/auto-csr-approver-29567026-95m4d" Mar 20 15:46:00 crc kubenswrapper[4779]: I0320 15:46:00.356567 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qqr5\" (UniqueName: \"kubernetes.io/projected/05082fa7-f09c-4834-a745-aae20aad01c2-kube-api-access-8qqr5\") pod \"auto-csr-approver-29567026-95m4d\" (UID: \"05082fa7-f09c-4834-a745-aae20aad01c2\") " pod="openshift-infra/auto-csr-approver-29567026-95m4d" Mar 20 15:46:00 crc kubenswrapper[4779]: I0320 15:46:00.522332 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567026-95m4d" Mar 20 15:46:00 crc kubenswrapper[4779]: I0320 15:46:00.564309 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="84bd8c95-b533-46ea-b1e2-3a53ddaec70c" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.173:9322/\": read tcp 10.217.0.2:56558->10.217.0.173:9322: read: connection reset by peer" Mar 20 15:46:00 crc kubenswrapper[4779]: I0320 15:46:00.564635 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="84bd8c95-b533-46ea-b1e2-3a53ddaec70c" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.173:9322/\": read tcp 10.217.0.2:56570->10.217.0.173:9322: read: connection reset by peer" Mar 20 15:46:00 crc kubenswrapper[4779]: I0320 15:46:00.799039 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5b659b97bb-hbsns" Mar 20 15:46:00 crc kubenswrapper[4779]: I0320 15:46:00.936685 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aae8b0d-a776-48ba-beae-55f8d81a4074-combined-ca-bundle\") pod \"0aae8b0d-a776-48ba-beae-55f8d81a4074\" (UID: \"0aae8b0d-a776-48ba-beae-55f8d81a4074\") " Mar 20 15:46:00 crc kubenswrapper[4779]: I0320 15:46:00.936789 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0aae8b0d-a776-48ba-beae-55f8d81a4074-logs\") pod \"0aae8b0d-a776-48ba-beae-55f8d81a4074\" (UID: \"0aae8b0d-a776-48ba-beae-55f8d81a4074\") " Mar 20 15:46:00 crc kubenswrapper[4779]: I0320 15:46:00.936832 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0aae8b0d-a776-48ba-beae-55f8d81a4074-config-data-custom\") pod 
\"0aae8b0d-a776-48ba-beae-55f8d81a4074\" (UID: \"0aae8b0d-a776-48ba-beae-55f8d81a4074\") " Mar 20 15:46:00 crc kubenswrapper[4779]: I0320 15:46:00.936989 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b68b9\" (UniqueName: \"kubernetes.io/projected/0aae8b0d-a776-48ba-beae-55f8d81a4074-kube-api-access-b68b9\") pod \"0aae8b0d-a776-48ba-beae-55f8d81a4074\" (UID: \"0aae8b0d-a776-48ba-beae-55f8d81a4074\") " Mar 20 15:46:00 crc kubenswrapper[4779]: I0320 15:46:00.937022 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0aae8b0d-a776-48ba-beae-55f8d81a4074-config-data\") pod \"0aae8b0d-a776-48ba-beae-55f8d81a4074\" (UID: \"0aae8b0d-a776-48ba-beae-55f8d81a4074\") " Mar 20 15:46:00 crc kubenswrapper[4779]: I0320 15:46:00.939910 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0aae8b0d-a776-48ba-beae-55f8d81a4074-logs" (OuterVolumeSpecName: "logs") pod "0aae8b0d-a776-48ba-beae-55f8d81a4074" (UID: "0aae8b0d-a776-48ba-beae-55f8d81a4074"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:46:00 crc kubenswrapper[4779]: I0320 15:46:00.954136 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aae8b0d-a776-48ba-beae-55f8d81a4074-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0aae8b0d-a776-48ba-beae-55f8d81a4074" (UID: "0aae8b0d-a776-48ba-beae-55f8d81a4074"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:00 crc kubenswrapper[4779]: I0320 15:46:00.962370 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aae8b0d-a776-48ba-beae-55f8d81a4074-kube-api-access-b68b9" (OuterVolumeSpecName: "kube-api-access-b68b9") pod "0aae8b0d-a776-48ba-beae-55f8d81a4074" (UID: "0aae8b0d-a776-48ba-beae-55f8d81a4074"). InnerVolumeSpecName "kube-api-access-b68b9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.022418 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aae8b0d-a776-48ba-beae-55f8d81a4074-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0aae8b0d-a776-48ba-beae-55f8d81a4074" (UID: "0aae8b0d-a776-48ba-beae-55f8d81a4074"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.040350 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aae8b0d-a776-48ba-beae-55f8d81a4074-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.040381 4779 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0aae8b0d-a776-48ba-beae-55f8d81a4074-logs\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.040390 4779 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0aae8b0d-a776-48ba-beae-55f8d81a4074-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.040399 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b68b9\" (UniqueName: 
\"kubernetes.io/projected/0aae8b0d-a776-48ba-beae-55f8d81a4074-kube-api-access-b68b9\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.076357 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aae8b0d-a776-48ba-beae-55f8d81a4074-config-data" (OuterVolumeSpecName: "config-data") pod "0aae8b0d-a776-48ba-beae-55f8d81a4074" (UID: "0aae8b0d-a776-48ba-beae-55f8d81a4074"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.147471 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0aae8b0d-a776-48ba-beae-55f8d81a4074-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.268394 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b659b97bb-hbsns" event={"ID":"0aae8b0d-a776-48ba-beae-55f8d81a4074","Type":"ContainerDied","Data":"b58b351893728b118dc226119b4efb47f8db7cabc15ae765707840ed282bda2a"} Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.268445 4779 scope.go:117] "RemoveContainer" containerID="c1d701c197831000a7f429507bdefa2e82c1e3aaa842b48b1d7d44070fabd766" Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.268572 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5b659b97bb-hbsns" Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.285515 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1733660-4123-48e3-a604-7bd5cce44699","Type":"ContainerStarted","Data":"eaf6d73c6a0a605deee22e07d7dc61acddfb0883fc75267d49f92175f91e3c38"} Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.298416 4779 generic.go:334] "Generic (PLEG): container finished" podID="84bd8c95-b533-46ea-b1e2-3a53ddaec70c" containerID="256c73fa67ac5fd0452c6071ac8471ab9f080a3ab4ea4409a79d0c323ee82509" exitCode=0 Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.298488 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"84bd8c95-b533-46ea-b1e2-3a53ddaec70c","Type":"ContainerDied","Data":"256c73fa67ac5fd0452c6071ac8471ab9f080a3ab4ea4409a79d0c323ee82509"} Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.298515 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"84bd8c95-b533-46ea-b1e2-3a53ddaec70c","Type":"ContainerDied","Data":"7179437bb61661cb2e564cf0023b826b5822280cca16a6261fe6b67d6a87a83e"} Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.298526 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7179437bb61661cb2e564cf0023b826b5822280cca16a6261fe6b67d6a87a83e" Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.315359 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.327904 4779 generic.go:334] "Generic (PLEG): container finished" podID="9752eda4-2b46-4251-bec7-5cbca2f6fd57" containerID="e1bd8b8b7b0f4aa151ce9453673b2139df9175d3b89e268569546efe778b1ca3" exitCode=0 Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.327991 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86f74cdcdf-tmrx6" event={"ID":"9752eda4-2b46-4251-bec7-5cbca2f6fd57","Type":"ContainerDied","Data":"e1bd8b8b7b0f4aa151ce9453673b2139df9175d3b89e268569546efe778b1ca3"} Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.330475 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5b659b97bb-hbsns"] Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.342271 4779 scope.go:117] "RemoveContainer" containerID="08bfedecd9168cabe8870c47d041b6276c23b1aef2227e86484d55db2845502e" Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.351225 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5b659b97bb-hbsns"] Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.351828 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4sv8\" (UniqueName: \"kubernetes.io/projected/84bd8c95-b533-46ea-b1e2-3a53ddaec70c-kube-api-access-v4sv8\") pod \"84bd8c95-b533-46ea-b1e2-3a53ddaec70c\" (UID: \"84bd8c95-b533-46ea-b1e2-3a53ddaec70c\") " Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.351983 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84bd8c95-b533-46ea-b1e2-3a53ddaec70c-logs\") pod \"84bd8c95-b533-46ea-b1e2-3a53ddaec70c\" (UID: \"84bd8c95-b533-46ea-b1e2-3a53ddaec70c\") " Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.352075 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/84bd8c95-b533-46ea-b1e2-3a53ddaec70c-config-data\") pod \"84bd8c95-b533-46ea-b1e2-3a53ddaec70c\" (UID: \"84bd8c95-b533-46ea-b1e2-3a53ddaec70c\") " Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.352097 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84bd8c95-b533-46ea-b1e2-3a53ddaec70c-combined-ca-bundle\") pod \"84bd8c95-b533-46ea-b1e2-3a53ddaec70c\" (UID: \"84bd8c95-b533-46ea-b1e2-3a53ddaec70c\") " Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.352137 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/84bd8c95-b533-46ea-b1e2-3a53ddaec70c-custom-prometheus-ca\") pod \"84bd8c95-b533-46ea-b1e2-3a53ddaec70c\" (UID: \"84bd8c95-b533-46ea-b1e2-3a53ddaec70c\") " Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.352773 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ee20d921-f263-4089-b0c5-06ebfed15478","Type":"ContainerStarted","Data":"1e8d299a3ea0d60b9c62224d21d724e126f5965457946ba30ed19b1ac04f0759"} Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.356725 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84bd8c95-b533-46ea-b1e2-3a53ddaec70c-logs" (OuterVolumeSpecName: "logs") pod "84bd8c95-b533-46ea-b1e2-3a53ddaec70c" (UID: "84bd8c95-b533-46ea-b1e2-3a53ddaec70c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.380564 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567026-95m4d"] Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.415866 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84bd8c95-b533-46ea-b1e2-3a53ddaec70c-kube-api-access-v4sv8" (OuterVolumeSpecName: "kube-api-access-v4sv8") pod "84bd8c95-b533-46ea-b1e2-3a53ddaec70c" (UID: "84bd8c95-b533-46ea-b1e2-3a53ddaec70c"). InnerVolumeSpecName "kube-api-access-v4sv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.453164 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4sv8\" (UniqueName: \"kubernetes.io/projected/84bd8c95-b533-46ea-b1e2-3a53ddaec70c-kube-api-access-v4sv8\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.453454 4779 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84bd8c95-b533-46ea-b1e2-3a53ddaec70c-logs\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.530265 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84bd8c95-b533-46ea-b1e2-3a53ddaec70c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84bd8c95-b533-46ea-b1e2-3a53ddaec70c" (UID: "84bd8c95-b533-46ea-b1e2-3a53ddaec70c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.556486 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84bd8c95-b533-46ea-b1e2-3a53ddaec70c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.611520 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84bd8c95-b533-46ea-b1e2-3a53ddaec70c-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "84bd8c95-b533-46ea-b1e2-3a53ddaec70c" (UID: "84bd8c95-b533-46ea-b1e2-3a53ddaec70c"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.663084 4779 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/84bd8c95-b533-46ea-b1e2-3a53ddaec70c-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.692384 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84bd8c95-b533-46ea-b1e2-3a53ddaec70c-config-data" (OuterVolumeSpecName: "config-data") pod "84bd8c95-b533-46ea-b1e2-3a53ddaec70c" (UID: "84bd8c95-b533-46ea-b1e2-3a53ddaec70c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.734273 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5784cf869f-w6689" Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.766453 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84bd8c95-b533-46ea-b1e2-3a53ddaec70c-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.853790 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0aae8b0d-a776-48ba-beae-55f8d81a4074" path="/var/lib/kubelet/pods/0aae8b0d-a776-48ba-beae-55f8d81a4074/volumes" Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.858691 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-lg5dh"] Mar 20 15:46:01 crc kubenswrapper[4779]: I0320 15:46:01.858990 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75c8ddd69c-lg5dh" podUID="aafd9b1e-a336-4bc8-a389-512294752f12" containerName="dnsmasq-dns" containerID="cri-o://775ff2c5bd3d2668c7b33f8e6157fa7b1c4d10a5fa684118844646e0091a7c0e" gracePeriod=10 Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.136790 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.209530 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.269192 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6d97b75b9b-swgzv" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.272273 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6d97b75b9b-swgzv" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 
15:46:02.399873 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1733660-4123-48e3-a604-7bd5cce44699","Type":"ContainerStarted","Data":"98664af4d2e8ce4b269c1fb27c7b04c426a5a694708cf4027c200df3a2e34448"} Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.402238 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567026-95m4d" event={"ID":"05082fa7-f09c-4834-a745-aae20aad01c2","Type":"ContainerStarted","Data":"831e32d80a9ea67d7a053720db35dfad5c56f3d9b65641e6e03c501117d10c51"} Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.410649 4779 generic.go:334] "Generic (PLEG): container finished" podID="aafd9b1e-a336-4bc8-a389-512294752f12" containerID="775ff2c5bd3d2668c7b33f8e6157fa7b1c4d10a5fa684118844646e0091a7c0e" exitCode=0 Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.410734 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-lg5dh" event={"ID":"aafd9b1e-a336-4bc8-a389-512294752f12","Type":"ContainerDied","Data":"775ff2c5bd3d2668c7b33f8e6157fa7b1c4d10a5fa684118844646e0091a7c0e"} Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.422838 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ee20d921-f263-4089-b0c5-06ebfed15478","Type":"ContainerStarted","Data":"9b14b1764066c00067c6fc3ae8a72de34df46bd394d704f6373bdbd6a9d8a9e7"} Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.425318 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="fc79e3f4-3c5e-49d3-a560-3a67366b7fcf" containerName="cinder-scheduler" containerID="cri-o://c817eb9312c63cb5b9cb410b4377295c6b8fd369c17cde070946823579f7f2fe" gracePeriod=30 Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.425803 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" 
podUID="fc79e3f4-3c5e-49d3-a560-3a67366b7fcf" containerName="probe" containerID="cri-o://13f919ad0c0d8fcccdb7c3c08f43c24c7942e2c5dab3f7d24b2a0c9bfdfccf32" gracePeriod=30 Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.426048 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.482040 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.511748 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-dc78779bd-shkrw" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.521582 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-lg5dh" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.553820 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.645617 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Mar 20 15:46:02 crc kubenswrapper[4779]: E0320 15:46:02.646160 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aafd9b1e-a336-4bc8-a389-512294752f12" containerName="init" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.646174 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="aafd9b1e-a336-4bc8-a389-512294752f12" containerName="init" Mar 20 15:46:02 crc kubenswrapper[4779]: E0320 15:46:02.646205 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aafd9b1e-a336-4bc8-a389-512294752f12" containerName="dnsmasq-dns" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.646211 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="aafd9b1e-a336-4bc8-a389-512294752f12" containerName="dnsmasq-dns" Mar 20 15:46:02 crc kubenswrapper[4779]: E0320 15:46:02.646227 4779 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aae8b0d-a776-48ba-beae-55f8d81a4074" containerName="barbican-api-log" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.646233 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aae8b0d-a776-48ba-beae-55f8d81a4074" containerName="barbican-api-log" Mar 20 15:46:02 crc kubenswrapper[4779]: E0320 15:46:02.646246 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84bd8c95-b533-46ea-b1e2-3a53ddaec70c" containerName="watcher-api-log" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.646252 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="84bd8c95-b533-46ea-b1e2-3a53ddaec70c" containerName="watcher-api-log" Mar 20 15:46:02 crc kubenswrapper[4779]: E0320 15:46:02.646259 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84bd8c95-b533-46ea-b1e2-3a53ddaec70c" containerName="watcher-api" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.646264 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="84bd8c95-b533-46ea-b1e2-3a53ddaec70c" containerName="watcher-api" Mar 20 15:46:02 crc kubenswrapper[4779]: E0320 15:46:02.646304 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aae8b0d-a776-48ba-beae-55f8d81a4074" containerName="barbican-api" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.646311 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aae8b0d-a776-48ba-beae-55f8d81a4074" containerName="barbican-api" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.646489 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aae8b0d-a776-48ba-beae-55f8d81a4074" containerName="barbican-api-log" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.646499 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aae8b0d-a776-48ba-beae-55f8d81a4074" containerName="barbican-api" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.646515 4779 
memory_manager.go:354] "RemoveStaleState removing state" podUID="84bd8c95-b533-46ea-b1e2-3a53ddaec70c" containerName="watcher-api" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.646527 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="aafd9b1e-a336-4bc8-a389-512294752f12" containerName="dnsmasq-dns" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.646540 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="84bd8c95-b533-46ea-b1e2-3a53ddaec70c" containerName="watcher-api-log" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.647963 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.652675 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.652870 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.653701 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.695754 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.707003 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aafd9b1e-a336-4bc8-a389-512294752f12-ovsdbserver-nb\") pod \"aafd9b1e-a336-4bc8-a389-512294752f12\" (UID: \"aafd9b1e-a336-4bc8-a389-512294752f12\") " Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.708275 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aafd9b1e-a336-4bc8-a389-512294752f12-dns-swift-storage-0\") pod 
\"aafd9b1e-a336-4bc8-a389-512294752f12\" (UID: \"aafd9b1e-a336-4bc8-a389-512294752f12\") " Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.708341 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwrzf\" (UniqueName: \"kubernetes.io/projected/aafd9b1e-a336-4bc8-a389-512294752f12-kube-api-access-qwrzf\") pod \"aafd9b1e-a336-4bc8-a389-512294752f12\" (UID: \"aafd9b1e-a336-4bc8-a389-512294752f12\") " Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.708390 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aafd9b1e-a336-4bc8-a389-512294752f12-config\") pod \"aafd9b1e-a336-4bc8-a389-512294752f12\" (UID: \"aafd9b1e-a336-4bc8-a389-512294752f12\") " Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.708407 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aafd9b1e-a336-4bc8-a389-512294752f12-dns-svc\") pod \"aafd9b1e-a336-4bc8-a389-512294752f12\" (UID: \"aafd9b1e-a336-4bc8-a389-512294752f12\") " Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.708495 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aafd9b1e-a336-4bc8-a389-512294752f12-ovsdbserver-sb\") pod \"aafd9b1e-a336-4bc8-a389-512294752f12\" (UID: \"aafd9b1e-a336-4bc8-a389-512294752f12\") " Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.708847 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad68dcd5-048d-44fe-a6d7-583068d7b361-config-data\") pod \"watcher-api-0\" (UID: \"ad68dcd5-048d-44fe-a6d7-583068d7b361\") " pod="openstack/watcher-api-0" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.708955 4779 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl6ng\" (UniqueName: \"kubernetes.io/projected/ad68dcd5-048d-44fe-a6d7-583068d7b361-kube-api-access-fl6ng\") pod \"watcher-api-0\" (UID: \"ad68dcd5-048d-44fe-a6d7-583068d7b361\") " pod="openstack/watcher-api-0" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.709020 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad68dcd5-048d-44fe-a6d7-583068d7b361-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"ad68dcd5-048d-44fe-a6d7-583068d7b361\") " pod="openstack/watcher-api-0" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.709053 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad68dcd5-048d-44fe-a6d7-583068d7b361-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"ad68dcd5-048d-44fe-a6d7-583068d7b361\") " pod="openstack/watcher-api-0" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.709131 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad68dcd5-048d-44fe-a6d7-583068d7b361-public-tls-certs\") pod \"watcher-api-0\" (UID: \"ad68dcd5-048d-44fe-a6d7-583068d7b361\") " pod="openstack/watcher-api-0" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.710507 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ad68dcd5-048d-44fe-a6d7-583068d7b361-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"ad68dcd5-048d-44fe-a6d7-583068d7b361\") " pod="openstack/watcher-api-0" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.710535 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad68dcd5-048d-44fe-a6d7-583068d7b361-logs\") pod \"watcher-api-0\" (UID: \"ad68dcd5-048d-44fe-a6d7-583068d7b361\") " pod="openstack/watcher-api-0" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.716427 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aafd9b1e-a336-4bc8-a389-512294752f12-kube-api-access-qwrzf" (OuterVolumeSpecName: "kube-api-access-qwrzf") pod "aafd9b1e-a336-4bc8-a389-512294752f12" (UID: "aafd9b1e-a336-4bc8-a389-512294752f12"). InnerVolumeSpecName "kube-api-access-qwrzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.815084 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad68dcd5-048d-44fe-a6d7-583068d7b361-config-data\") pod \"watcher-api-0\" (UID: \"ad68dcd5-048d-44fe-a6d7-583068d7b361\") " pod="openstack/watcher-api-0" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.815217 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl6ng\" (UniqueName: \"kubernetes.io/projected/ad68dcd5-048d-44fe-a6d7-583068d7b361-kube-api-access-fl6ng\") pod \"watcher-api-0\" (UID: \"ad68dcd5-048d-44fe-a6d7-583068d7b361\") " pod="openstack/watcher-api-0" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.815282 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad68dcd5-048d-44fe-a6d7-583068d7b361-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"ad68dcd5-048d-44fe-a6d7-583068d7b361\") " pod="openstack/watcher-api-0" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.815313 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ad68dcd5-048d-44fe-a6d7-583068d7b361-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"ad68dcd5-048d-44fe-a6d7-583068d7b361\") " pod="openstack/watcher-api-0" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.815367 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad68dcd5-048d-44fe-a6d7-583068d7b361-public-tls-certs\") pod \"watcher-api-0\" (UID: \"ad68dcd5-048d-44fe-a6d7-583068d7b361\") " pod="openstack/watcher-api-0" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.815483 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ad68dcd5-048d-44fe-a6d7-583068d7b361-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"ad68dcd5-048d-44fe-a6d7-583068d7b361\") " pod="openstack/watcher-api-0" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.815508 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad68dcd5-048d-44fe-a6d7-583068d7b361-logs\") pod \"watcher-api-0\" (UID: \"ad68dcd5-048d-44fe-a6d7-583068d7b361\") " pod="openstack/watcher-api-0" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.815588 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwrzf\" (UniqueName: \"kubernetes.io/projected/aafd9b1e-a336-4bc8-a389-512294752f12-kube-api-access-qwrzf\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.815975 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad68dcd5-048d-44fe-a6d7-583068d7b361-logs\") pod \"watcher-api-0\" (UID: \"ad68dcd5-048d-44fe-a6d7-583068d7b361\") " pod="openstack/watcher-api-0" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.833420 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/horizon-7f86b78896-7tfsm"] Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.833677 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7f86b78896-7tfsm" podUID="f6de60e1-9be1-4bb6-9d26-0b496117ed20" containerName="horizon-log" containerID="cri-o://d7fe244e02765ceb3be959118ca1a70a4ac93f217d62f041833169123707116a" gracePeriod=30 Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.835483 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7f86b78896-7tfsm" podUID="f6de60e1-9be1-4bb6-9d26-0b496117ed20" containerName="horizon" containerID="cri-o://7ff747c74330698ccefe1a731244c171b039473c2090a2067e21822cd9b07277" gracePeriod=30 Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.845958 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad68dcd5-048d-44fe-a6d7-583068d7b361-public-tls-certs\") pod \"watcher-api-0\" (UID: \"ad68dcd5-048d-44fe-a6d7-583068d7b361\") " pod="openstack/watcher-api-0" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.848443 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad68dcd5-048d-44fe-a6d7-583068d7b361-config-data\") pod \"watcher-api-0\" (UID: \"ad68dcd5-048d-44fe-a6d7-583068d7b361\") " pod="openstack/watcher-api-0" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.853969 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad68dcd5-048d-44fe-a6d7-583068d7b361-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"ad68dcd5-048d-44fe-a6d7-583068d7b361\") " pod="openstack/watcher-api-0" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.854553 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ad68dcd5-048d-44fe-a6d7-583068d7b361-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"ad68dcd5-048d-44fe-a6d7-583068d7b361\") " pod="openstack/watcher-api-0" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.860362 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7f86b78896-7tfsm" podUID="f6de60e1-9be1-4bb6-9d26-0b496117ed20" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.161:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.869020 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ad68dcd5-048d-44fe-a6d7-583068d7b361-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"ad68dcd5-048d-44fe-a6d7-583068d7b361\") " pod="openstack/watcher-api-0" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.870013 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl6ng\" (UniqueName: \"kubernetes.io/projected/ad68dcd5-048d-44fe-a6d7-583068d7b361-kube-api-access-fl6ng\") pod \"watcher-api-0\" (UID: \"ad68dcd5-048d-44fe-a6d7-583068d7b361\") " pod="openstack/watcher-api-0" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.902591 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aafd9b1e-a336-4bc8-a389-512294752f12-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aafd9b1e-a336-4bc8-a389-512294752f12" (UID: "aafd9b1e-a336-4bc8-a389-512294752f12"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.928377 4779 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aafd9b1e-a336-4bc8-a389-512294752f12-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.975723 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Mar 20 15:46:02 crc kubenswrapper[4779]: I0320 15:46:02.993638 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aafd9b1e-a336-4bc8-a389-512294752f12-config" (OuterVolumeSpecName: "config") pod "aafd9b1e-a336-4bc8-a389-512294752f12" (UID: "aafd9b1e-a336-4bc8-a389-512294752f12"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:46:03 crc kubenswrapper[4779]: I0320 15:46:03.029884 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aafd9b1e-a336-4bc8-a389-512294752f12-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:03 crc kubenswrapper[4779]: I0320 15:46:03.033624 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aafd9b1e-a336-4bc8-a389-512294752f12-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aafd9b1e-a336-4bc8-a389-512294752f12" (UID: "aafd9b1e-a336-4bc8-a389-512294752f12"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:46:03 crc kubenswrapper[4779]: I0320 15:46:03.064649 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aafd9b1e-a336-4bc8-a389-512294752f12-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aafd9b1e-a336-4bc8-a389-512294752f12" (UID: "aafd9b1e-a336-4bc8-a389-512294752f12"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:46:03 crc kubenswrapper[4779]: I0320 15:46:03.131688 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aafd9b1e-a336-4bc8-a389-512294752f12-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "aafd9b1e-a336-4bc8-a389-512294752f12" (UID: "aafd9b1e-a336-4bc8-a389-512294752f12"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:46:03 crc kubenswrapper[4779]: I0320 15:46:03.152464 4779 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aafd9b1e-a336-4bc8-a389-512294752f12-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:03 crc kubenswrapper[4779]: I0320 15:46:03.152496 4779 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aafd9b1e-a336-4bc8-a389-512294752f12-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:03 crc kubenswrapper[4779]: I0320 15:46:03.152508 4779 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aafd9b1e-a336-4bc8-a389-512294752f12-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:03 crc kubenswrapper[4779]: I0320 15:46:03.482918 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-lg5dh" Mar 20 15:46:03 crc kubenswrapper[4779]: I0320 15:46:03.484232 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-lg5dh" event={"ID":"aafd9b1e-a336-4bc8-a389-512294752f12","Type":"ContainerDied","Data":"ac6ab576ed9e54f9adb12631f5159d2f134cff2a2f2cad204e42453ef851b0b6"} Mar 20 15:46:03 crc kubenswrapper[4779]: I0320 15:46:03.484324 4779 scope.go:117] "RemoveContainer" containerID="775ff2c5bd3d2668c7b33f8e6157fa7b1c4d10a5fa684118844646e0091a7c0e" Mar 20 15:46:03 crc kubenswrapper[4779]: I0320 15:46:03.494372 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ee20d921-f263-4089-b0c5-06ebfed15478","Type":"ContainerStarted","Data":"67b9e9fba987158ca5be0f1947b664bf027ff14b0a264dfa2454f21e81f69de6"} Mar 20 15:46:03 crc kubenswrapper[4779]: I0320 15:46:03.495547 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 20 15:46:03 crc kubenswrapper[4779]: I0320 15:46:03.519724 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6878c9db6b-r5g2t" Mar 20 15:46:03 crc kubenswrapper[4779]: I0320 15:46:03.547974 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.547955709 podStartE2EDuration="4.547955709s" podCreationTimestamp="2026-03-20 15:45:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:46:03.528607681 +0000 UTC m=+1380.491123471" watchObservedRunningTime="2026-03-20 15:46:03.547955709 +0000 UTC m=+1380.510471509" Mar 20 15:46:03 crc kubenswrapper[4779]: I0320 15:46:03.614463 4779 scope.go:117] "RemoveContainer" containerID="34db14e2a71e2f1f08c9e109e8a5bc3d5f6d27acc151e51f03a04005f555cb83" Mar 20 15:46:03 crc kubenswrapper[4779]: I0320 15:46:03.617763 
4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-lg5dh"] Mar 20 15:46:03 crc kubenswrapper[4779]: I0320 15:46:03.635962 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-lg5dh"] Mar 20 15:46:03 crc kubenswrapper[4779]: I0320 15:46:03.745338 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Mar 20 15:46:03 crc kubenswrapper[4779]: I0320 15:46:03.849594 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84bd8c95-b533-46ea-b1e2-3a53ddaec70c" path="/var/lib/kubelet/pods/84bd8c95-b533-46ea-b1e2-3a53ddaec70c/volumes" Mar 20 15:46:03 crc kubenswrapper[4779]: I0320 15:46:03.883895 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aafd9b1e-a336-4bc8-a389-512294752f12" path="/var/lib/kubelet/pods/aafd9b1e-a336-4bc8-a389-512294752f12/volumes" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.099586 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-c69fcdb9d-ntwph" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.291742 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-c69fcdb9d-ntwph" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.345051 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.377761 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6d97b75b9b-swgzv"] Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.378024 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6d97b75b9b-swgzv" podUID="83d44457-fd0e-462a-b293-65eec2238d7b" containerName="placement-log" containerID="cri-o://9c787247cd4be3389dc5f475c53ce3577e3f1cca73903a0cf4c7409e281b1b03" gracePeriod=30 Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.378099 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6d97b75b9b-swgzv" podUID="83d44457-fd0e-462a-b293-65eec2238d7b" containerName="placement-api" containerID="cri-o://0da158073c717f6594644c6e922de7c5ab2e60fc1422e809a7fb8fd76bea1b1c" gracePeriod=30 Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.455264 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc79e3f4-3c5e-49d3-a560-3a67366b7fcf-scripts\") pod \"fc79e3f4-3c5e-49d3-a560-3a67366b7fcf\" (UID: \"fc79e3f4-3c5e-49d3-a560-3a67366b7fcf\") " Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.455326 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc79e3f4-3c5e-49d3-a560-3a67366b7fcf-combined-ca-bundle\") pod \"fc79e3f4-3c5e-49d3-a560-3a67366b7fcf\" (UID: \"fc79e3f4-3c5e-49d3-a560-3a67366b7fcf\") " Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.455471 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc79e3f4-3c5e-49d3-a560-3a67366b7fcf-config-data-custom\") pod \"fc79e3f4-3c5e-49d3-a560-3a67366b7fcf\" (UID: \"fc79e3f4-3c5e-49d3-a560-3a67366b7fcf\") " Mar 
20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.455556 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc79e3f4-3c5e-49d3-a560-3a67366b7fcf-config-data\") pod \"fc79e3f4-3c5e-49d3-a560-3a67366b7fcf\" (UID: \"fc79e3f4-3c5e-49d3-a560-3a67366b7fcf\") " Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.455586 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxmjj\" (UniqueName: \"kubernetes.io/projected/fc79e3f4-3c5e-49d3-a560-3a67366b7fcf-kube-api-access-pxmjj\") pod \"fc79e3f4-3c5e-49d3-a560-3a67366b7fcf\" (UID: \"fc79e3f4-3c5e-49d3-a560-3a67366b7fcf\") " Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.455612 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc79e3f4-3c5e-49d3-a560-3a67366b7fcf-etc-machine-id\") pod \"fc79e3f4-3c5e-49d3-a560-3a67366b7fcf\" (UID: \"fc79e3f4-3c5e-49d3-a560-3a67366b7fcf\") " Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.455970 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc79e3f4-3c5e-49d3-a560-3a67366b7fcf-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "fc79e3f4-3c5e-49d3-a560-3a67366b7fcf" (UID: "fc79e3f4-3c5e-49d3-a560-3a67366b7fcf"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.456176 4779 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc79e3f4-3c5e-49d3-a560-3a67366b7fcf-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.462347 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc79e3f4-3c5e-49d3-a560-3a67366b7fcf-scripts" (OuterVolumeSpecName: "scripts") pod "fc79e3f4-3c5e-49d3-a560-3a67366b7fcf" (UID: "fc79e3f4-3c5e-49d3-a560-3a67366b7fcf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.471095 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc79e3f4-3c5e-49d3-a560-3a67366b7fcf-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "fc79e3f4-3c5e-49d3-a560-3a67366b7fcf" (UID: "fc79e3f4-3c5e-49d3-a560-3a67366b7fcf"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.473089 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc79e3f4-3c5e-49d3-a560-3a67366b7fcf-kube-api-access-pxmjj" (OuterVolumeSpecName: "kube-api-access-pxmjj") pod "fc79e3f4-3c5e-49d3-a560-3a67366b7fcf" (UID: "fc79e3f4-3c5e-49d3-a560-3a67366b7fcf"). InnerVolumeSpecName "kube-api-access-pxmjj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.528948 4779 generic.go:334] "Generic (PLEG): container finished" podID="83d44457-fd0e-462a-b293-65eec2238d7b" containerID="9c787247cd4be3389dc5f475c53ce3577e3f1cca73903a0cf4c7409e281b1b03" exitCode=143 Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.529034 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d97b75b9b-swgzv" event={"ID":"83d44457-fd0e-462a-b293-65eec2238d7b","Type":"ContainerDied","Data":"9c787247cd4be3389dc5f475c53ce3577e3f1cca73903a0cf4c7409e281b1b03"} Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.549516 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1733660-4123-48e3-a604-7bd5cce44699","Type":"ContainerStarted","Data":"8d05b28a51c5679160b367dde9fb8cbb6989e511c30d1f673cf752a56206ec7a"} Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.550965 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.558447 4779 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc79e3f4-3c5e-49d3-a560-3a67366b7fcf-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.558482 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxmjj\" (UniqueName: \"kubernetes.io/projected/fc79e3f4-3c5e-49d3-a560-3a67366b7fcf-kube-api-access-pxmjj\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.558496 4779 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc79e3f4-3c5e-49d3-a560-3a67366b7fcf-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.566073 4779 generic.go:334] "Generic (PLEG): 
container finished" podID="fc79e3f4-3c5e-49d3-a560-3a67366b7fcf" containerID="13f919ad0c0d8fcccdb7c3c08f43c24c7942e2c5dab3f7d24b2a0c9bfdfccf32" exitCode=0 Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.566161 4779 generic.go:334] "Generic (PLEG): container finished" podID="fc79e3f4-3c5e-49d3-a560-3a67366b7fcf" containerID="c817eb9312c63cb5b9cb410b4377295c6b8fd369c17cde070946823579f7f2fe" exitCode=0 Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.566215 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fc79e3f4-3c5e-49d3-a560-3a67366b7fcf","Type":"ContainerDied","Data":"13f919ad0c0d8fcccdb7c3c08f43c24c7942e2c5dab3f7d24b2a0c9bfdfccf32"} Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.566251 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fc79e3f4-3c5e-49d3-a560-3a67366b7fcf","Type":"ContainerDied","Data":"c817eb9312c63cb5b9cb410b4377295c6b8fd369c17cde070946823579f7f2fe"} Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.566262 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fc79e3f4-3c5e-49d3-a560-3a67366b7fcf","Type":"ContainerDied","Data":"d0797c3aeab313228c8ba5857cda6322bc07bda33a32e03840595dfbbfd765c9"} Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.566278 4779 scope.go:117] "RemoveContainer" containerID="13f919ad0c0d8fcccdb7c3c08f43c24c7942e2c5dab3f7d24b2a0c9bfdfccf32" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.566410 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.592903 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.6288272790000002 podStartE2EDuration="7.592881929s" podCreationTimestamp="2026-03-20 15:45:57 +0000 UTC" firstStartedPulling="2026-03-20 15:45:58.539516922 +0000 UTC m=+1375.502032722" lastFinishedPulling="2026-03-20 15:46:03.503571572 +0000 UTC m=+1380.466087372" observedRunningTime="2026-03-20 15:46:04.57551735 +0000 UTC m=+1381.538033150" watchObservedRunningTime="2026-03-20 15:46:04.592881929 +0000 UTC m=+1381.555397729" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.594256 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"ad68dcd5-048d-44fe-a6d7-583068d7b361","Type":"ContainerStarted","Data":"57dfe195e5197b3a45c48d3de7e43dd0ef4617cf0149ed1ed11000a2156ddd02"} Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.594302 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"ad68dcd5-048d-44fe-a6d7-583068d7b361","Type":"ContainerStarted","Data":"327bd1b48ce3425c779637a3d0db1465ea623389eb28cbb48a9e4529df261b8b"} Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.594311 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"ad68dcd5-048d-44fe-a6d7-583068d7b361","Type":"ContainerStarted","Data":"f23fe3691c522a7544e2648841c611588005f7e5f1612c95d2c2e5b99bacb8a4"} Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.594798 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.605298 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc79e3f4-3c5e-49d3-a560-3a67366b7fcf-combined-ca-bundle" (OuterVolumeSpecName: 
"combined-ca-bundle") pod "fc79e3f4-3c5e-49d3-a560-3a67366b7fcf" (UID: "fc79e3f4-3c5e-49d3-a560-3a67366b7fcf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.609695 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567026-95m4d" event={"ID":"05082fa7-f09c-4834-a745-aae20aad01c2","Type":"ContainerStarted","Data":"a0297bcf430d9ce16e4c26f3b558f8f63d3533d0e0b3c7fde55a9b7c06babdb3"} Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.626660 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="ad68dcd5-048d-44fe-a6d7-583068d7b361" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.188:9322/\": dial tcp 10.217.0.188:9322: connect: connection refused" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.627419 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=2.627400262 podStartE2EDuration="2.627400262s" podCreationTimestamp="2026-03-20 15:46:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:46:04.625660529 +0000 UTC m=+1381.588176329" watchObservedRunningTime="2026-03-20 15:46:04.627400262 +0000 UTC m=+1381.589916052" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.633774 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc79e3f4-3c5e-49d3-a560-3a67366b7fcf-config-data" (OuterVolumeSpecName: "config-data") pod "fc79e3f4-3c5e-49d3-a560-3a67366b7fcf" (UID: "fc79e3f4-3c5e-49d3-a560-3a67366b7fcf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.668056 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567026-95m4d" podStartSLOduration=3.279479775 podStartE2EDuration="4.668033616s" podCreationTimestamp="2026-03-20 15:46:00 +0000 UTC" firstStartedPulling="2026-03-20 15:46:01.462316923 +0000 UTC m=+1378.424832723" lastFinishedPulling="2026-03-20 15:46:02.850870764 +0000 UTC m=+1379.813386564" observedRunningTime="2026-03-20 15:46:04.647718023 +0000 UTC m=+1381.610233823" watchObservedRunningTime="2026-03-20 15:46:04.668033616 +0000 UTC m=+1381.630549416" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.671624 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc79e3f4-3c5e-49d3-a560-3a67366b7fcf-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.672186 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc79e3f4-3c5e-49d3-a560-3a67366b7fcf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.752873 4779 scope.go:117] "RemoveContainer" containerID="c817eb9312c63cb5b9cb410b4377295c6b8fd369c17cde070946823579f7f2fe" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.783382 4779 scope.go:117] "RemoveContainer" containerID="13f919ad0c0d8fcccdb7c3c08f43c24c7942e2c5dab3f7d24b2a0c9bfdfccf32" Mar 20 15:46:04 crc kubenswrapper[4779]: E0320 15:46:04.784190 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13f919ad0c0d8fcccdb7c3c08f43c24c7942e2c5dab3f7d24b2a0c9bfdfccf32\": container with ID starting with 13f919ad0c0d8fcccdb7c3c08f43c24c7942e2c5dab3f7d24b2a0c9bfdfccf32 not found: ID does not exist" 
containerID="13f919ad0c0d8fcccdb7c3c08f43c24c7942e2c5dab3f7d24b2a0c9bfdfccf32" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.784235 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13f919ad0c0d8fcccdb7c3c08f43c24c7942e2c5dab3f7d24b2a0c9bfdfccf32"} err="failed to get container status \"13f919ad0c0d8fcccdb7c3c08f43c24c7942e2c5dab3f7d24b2a0c9bfdfccf32\": rpc error: code = NotFound desc = could not find container \"13f919ad0c0d8fcccdb7c3c08f43c24c7942e2c5dab3f7d24b2a0c9bfdfccf32\": container with ID starting with 13f919ad0c0d8fcccdb7c3c08f43c24c7942e2c5dab3f7d24b2a0c9bfdfccf32 not found: ID does not exist" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.784279 4779 scope.go:117] "RemoveContainer" containerID="c817eb9312c63cb5b9cb410b4377295c6b8fd369c17cde070946823579f7f2fe" Mar 20 15:46:04 crc kubenswrapper[4779]: E0320 15:46:04.784603 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c817eb9312c63cb5b9cb410b4377295c6b8fd369c17cde070946823579f7f2fe\": container with ID starting with c817eb9312c63cb5b9cb410b4377295c6b8fd369c17cde070946823579f7f2fe not found: ID does not exist" containerID="c817eb9312c63cb5b9cb410b4377295c6b8fd369c17cde070946823579f7f2fe" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.784623 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c817eb9312c63cb5b9cb410b4377295c6b8fd369c17cde070946823579f7f2fe"} err="failed to get container status \"c817eb9312c63cb5b9cb410b4377295c6b8fd369c17cde070946823579f7f2fe\": rpc error: code = NotFound desc = could not find container \"c817eb9312c63cb5b9cb410b4377295c6b8fd369c17cde070946823579f7f2fe\": container with ID starting with c817eb9312c63cb5b9cb410b4377295c6b8fd369c17cde070946823579f7f2fe not found: ID does not exist" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.784639 4779 scope.go:117] 
"RemoveContainer" containerID="13f919ad0c0d8fcccdb7c3c08f43c24c7942e2c5dab3f7d24b2a0c9bfdfccf32" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.784850 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13f919ad0c0d8fcccdb7c3c08f43c24c7942e2c5dab3f7d24b2a0c9bfdfccf32"} err="failed to get container status \"13f919ad0c0d8fcccdb7c3c08f43c24c7942e2c5dab3f7d24b2a0c9bfdfccf32\": rpc error: code = NotFound desc = could not find container \"13f919ad0c0d8fcccdb7c3c08f43c24c7942e2c5dab3f7d24b2a0c9bfdfccf32\": container with ID starting with 13f919ad0c0d8fcccdb7c3c08f43c24c7942e2c5dab3f7d24b2a0c9bfdfccf32 not found: ID does not exist" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.784868 4779 scope.go:117] "RemoveContainer" containerID="c817eb9312c63cb5b9cb410b4377295c6b8fd369c17cde070946823579f7f2fe" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.785083 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c817eb9312c63cb5b9cb410b4377295c6b8fd369c17cde070946823579f7f2fe"} err="failed to get container status \"c817eb9312c63cb5b9cb410b4377295c6b8fd369c17cde070946823579f7f2fe\": rpc error: code = NotFound desc = could not find container \"c817eb9312c63cb5b9cb410b4377295c6b8fd369c17cde070946823579f7f2fe\": container with ID starting with c817eb9312c63cb5b9cb410b4377295c6b8fd369c17cde070946823579f7f2fe not found: ID does not exist" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.896863 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.908418 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.915269 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 15:46:04 crc kubenswrapper[4779]: E0320 15:46:04.915626 4779 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc79e3f4-3c5e-49d3-a560-3a67366b7fcf" containerName="cinder-scheduler" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.915642 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc79e3f4-3c5e-49d3-a560-3a67366b7fcf" containerName="cinder-scheduler" Mar 20 15:46:04 crc kubenswrapper[4779]: E0320 15:46:04.915666 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc79e3f4-3c5e-49d3-a560-3a67366b7fcf" containerName="probe" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.915672 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc79e3f4-3c5e-49d3-a560-3a67366b7fcf" containerName="probe" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.915847 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc79e3f4-3c5e-49d3-a560-3a67366b7fcf" containerName="probe" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.915880 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc79e3f4-3c5e-49d3-a560-3a67366b7fcf" containerName="cinder-scheduler" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.916760 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.931174 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.934833 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.977427 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7409303d-4c67-4f93-b2c8-c6be3336df88-scripts\") pod \"cinder-scheduler-0\" (UID: \"7409303d-4c67-4f93-b2c8-c6be3336df88\") " pod="openstack/cinder-scheduler-0" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.977519 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7409303d-4c67-4f93-b2c8-c6be3336df88-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7409303d-4c67-4f93-b2c8-c6be3336df88\") " pod="openstack/cinder-scheduler-0" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.977673 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7409303d-4c67-4f93-b2c8-c6be3336df88-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7409303d-4c67-4f93-b2c8-c6be3336df88\") " pod="openstack/cinder-scheduler-0" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.977808 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzd2w\" (UniqueName: \"kubernetes.io/projected/7409303d-4c67-4f93-b2c8-c6be3336df88-kube-api-access-bzd2w\") pod \"cinder-scheduler-0\" (UID: \"7409303d-4c67-4f93-b2c8-c6be3336df88\") " pod="openstack/cinder-scheduler-0" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 
15:46:04.977942 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7409303d-4c67-4f93-b2c8-c6be3336df88-config-data\") pod \"cinder-scheduler-0\" (UID: \"7409303d-4c67-4f93-b2c8-c6be3336df88\") " pod="openstack/cinder-scheduler-0" Mar 20 15:46:04 crc kubenswrapper[4779]: I0320 15:46:04.978059 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7409303d-4c67-4f93-b2c8-c6be3336df88-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7409303d-4c67-4f93-b2c8-c6be3336df88\") " pod="openstack/cinder-scheduler-0" Mar 20 15:46:05 crc kubenswrapper[4779]: I0320 15:46:05.080149 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7409303d-4c67-4f93-b2c8-c6be3336df88-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7409303d-4c67-4f93-b2c8-c6be3336df88\") " pod="openstack/cinder-scheduler-0" Mar 20 15:46:05 crc kubenswrapper[4779]: I0320 15:46:05.080226 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7409303d-4c67-4f93-b2c8-c6be3336df88-scripts\") pod \"cinder-scheduler-0\" (UID: \"7409303d-4c67-4f93-b2c8-c6be3336df88\") " pod="openstack/cinder-scheduler-0" Mar 20 15:46:05 crc kubenswrapper[4779]: I0320 15:46:05.080305 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7409303d-4c67-4f93-b2c8-c6be3336df88-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7409303d-4c67-4f93-b2c8-c6be3336df88\") " pod="openstack/cinder-scheduler-0" Mar 20 15:46:05 crc kubenswrapper[4779]: I0320 15:46:05.080334 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/7409303d-4c67-4f93-b2c8-c6be3336df88-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7409303d-4c67-4f93-b2c8-c6be3336df88\") " pod="openstack/cinder-scheduler-0" Mar 20 15:46:05 crc kubenswrapper[4779]: I0320 15:46:05.080357 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7409303d-4c67-4f93-b2c8-c6be3336df88-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7409303d-4c67-4f93-b2c8-c6be3336df88\") " pod="openstack/cinder-scheduler-0" Mar 20 15:46:05 crc kubenswrapper[4779]: I0320 15:46:05.080543 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzd2w\" (UniqueName: \"kubernetes.io/projected/7409303d-4c67-4f93-b2c8-c6be3336df88-kube-api-access-bzd2w\") pod \"cinder-scheduler-0\" (UID: \"7409303d-4c67-4f93-b2c8-c6be3336df88\") " pod="openstack/cinder-scheduler-0" Mar 20 15:46:05 crc kubenswrapper[4779]: I0320 15:46:05.080677 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7409303d-4c67-4f93-b2c8-c6be3336df88-config-data\") pod \"cinder-scheduler-0\" (UID: \"7409303d-4c67-4f93-b2c8-c6be3336df88\") " pod="openstack/cinder-scheduler-0" Mar 20 15:46:05 crc kubenswrapper[4779]: I0320 15:46:05.085591 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7409303d-4c67-4f93-b2c8-c6be3336df88-scripts\") pod \"cinder-scheduler-0\" (UID: \"7409303d-4c67-4f93-b2c8-c6be3336df88\") " pod="openstack/cinder-scheduler-0" Mar 20 15:46:05 crc kubenswrapper[4779]: I0320 15:46:05.087317 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7409303d-4c67-4f93-b2c8-c6be3336df88-config-data\") pod \"cinder-scheduler-0\" (UID: \"7409303d-4c67-4f93-b2c8-c6be3336df88\") " 
pod="openstack/cinder-scheduler-0" Mar 20 15:46:05 crc kubenswrapper[4779]: I0320 15:46:05.089735 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7409303d-4c67-4f93-b2c8-c6be3336df88-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7409303d-4c67-4f93-b2c8-c6be3336df88\") " pod="openstack/cinder-scheduler-0" Mar 20 15:46:05 crc kubenswrapper[4779]: I0320 15:46:05.096443 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzd2w\" (UniqueName: \"kubernetes.io/projected/7409303d-4c67-4f93-b2c8-c6be3336df88-kube-api-access-bzd2w\") pod \"cinder-scheduler-0\" (UID: \"7409303d-4c67-4f93-b2c8-c6be3336df88\") " pod="openstack/cinder-scheduler-0" Mar 20 15:46:05 crc kubenswrapper[4779]: I0320 15:46:05.107661 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7409303d-4c67-4f93-b2c8-c6be3336df88-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7409303d-4c67-4f93-b2c8-c6be3336df88\") " pod="openstack/cinder-scheduler-0" Mar 20 15:46:05 crc kubenswrapper[4779]: I0320 15:46:05.237348 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 15:46:05 crc kubenswrapper[4779]: I0320 15:46:05.600395 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5b659b97bb-hbsns" podUID="0aae8b0d-a776-48ba-beae-55f8d81a4074" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.180:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 15:46:05 crc kubenswrapper[4779]: I0320 15:46:05.600399 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5b659b97bb-hbsns" podUID="0aae8b0d-a776-48ba-beae-55f8d81a4074" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.180:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 15:46:05 crc kubenswrapper[4779]: I0320 15:46:05.639262 4779 generic.go:334] "Generic (PLEG): container finished" podID="05082fa7-f09c-4834-a745-aae20aad01c2" containerID="a0297bcf430d9ce16e4c26f3b558f8f63d3533d0e0b3c7fde55a9b7c06babdb3" exitCode=0 Mar 20 15:46:05 crc kubenswrapper[4779]: I0320 15:46:05.639380 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567026-95m4d" event={"ID":"05082fa7-f09c-4834-a745-aae20aad01c2","Type":"ContainerDied","Data":"a0297bcf430d9ce16e4c26f3b558f8f63d3533d0e0b3c7fde55a9b7c06babdb3"} Mar 20 15:46:05 crc kubenswrapper[4779]: I0320 15:46:05.753431 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 15:46:05 crc kubenswrapper[4779]: W0320 15:46:05.764058 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7409303d_4c67_4f93_b2c8_c6be3336df88.slice/crio-1340a8a67e9ce422bf1576fc82be236eba3ba568dd473ab9a8a68cc929aaa247 WatchSource:0}: Error finding container 1340a8a67e9ce422bf1576fc82be236eba3ba568dd473ab9a8a68cc929aaa247: 
Status 404 returned error can't find the container with id 1340a8a67e9ce422bf1576fc82be236eba3ba568dd473ab9a8a68cc929aaa247 Mar 20 15:46:05 crc kubenswrapper[4779]: I0320 15:46:05.870863 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc79e3f4-3c5e-49d3-a560-3a67366b7fcf" path="/var/lib/kubelet/pods/fc79e3f4-3c5e-49d3-a560-3a67366b7fcf/volumes" Mar 20 15:46:06 crc kubenswrapper[4779]: I0320 15:46:06.565325 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7f86b78896-7tfsm" podUID="f6de60e1-9be1-4bb6-9d26-0b496117ed20" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.161:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:38570->10.217.0.161:8443: read: connection reset by peer" Mar 20 15:46:06 crc kubenswrapper[4779]: I0320 15:46:06.566250 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7f86b78896-7tfsm" podUID="f6de60e1-9be1-4bb6-9d26-0b496117ed20" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.161:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.161:8443: connect: connection refused" Mar 20 15:46:06 crc kubenswrapper[4779]: I0320 15:46:06.654443 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7409303d-4c67-4f93-b2c8-c6be3336df88","Type":"ContainerStarted","Data":"f50c0c9e7f324cfdb3b8be8de4daa70d90cb3ac751b58a39beba05e6c15a8806"} Mar 20 15:46:06 crc kubenswrapper[4779]: I0320 15:46:06.654790 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7409303d-4c67-4f93-b2c8-c6be3336df88","Type":"ContainerStarted","Data":"1340a8a67e9ce422bf1576fc82be236eba3ba568dd473ab9a8a68cc929aaa247"} Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.208847 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567026-95m4d" Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.366161 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qqr5\" (UniqueName: \"kubernetes.io/projected/05082fa7-f09c-4834-a745-aae20aad01c2-kube-api-access-8qqr5\") pod \"05082fa7-f09c-4834-a745-aae20aad01c2\" (UID: \"05082fa7-f09c-4834-a745-aae20aad01c2\") " Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.372317 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05082fa7-f09c-4834-a745-aae20aad01c2-kube-api-access-8qqr5" (OuterVolumeSpecName: "kube-api-access-8qqr5") pod "05082fa7-f09c-4834-a745-aae20aad01c2" (UID: "05082fa7-f09c-4834-a745-aae20aad01c2"). InnerVolumeSpecName "kube-api-access-8qqr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.469384 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qqr5\" (UniqueName: \"kubernetes.io/projected/05082fa7-f09c-4834-a745-aae20aad01c2-kube-api-access-8qqr5\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.534485 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 20 15:46:07 crc kubenswrapper[4779]: E0320 15:46:07.534863 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05082fa7-f09c-4834-a745-aae20aad01c2" containerName="oc" Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.534880 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="05082fa7-f09c-4834-a745-aae20aad01c2" containerName="oc" Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.535121 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="05082fa7-f09c-4834-a745-aae20aad01c2" containerName="oc" Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.535747 4779 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.539215 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.539228 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.539641 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-mhfx6" Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.552741 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.574785 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0f9d7d13-ddb9-4aa6-8041-48f65d646169-openstack-config-secret\") pod \"openstackclient\" (UID: \"0f9d7d13-ddb9-4aa6-8041-48f65d646169\") " pod="openstack/openstackclient" Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.575041 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jzp5\" (UniqueName: \"kubernetes.io/projected/0f9d7d13-ddb9-4aa6-8041-48f65d646169-kube-api-access-5jzp5\") pod \"openstackclient\" (UID: \"0f9d7d13-ddb9-4aa6-8041-48f65d646169\") " pod="openstack/openstackclient" Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.575061 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f9d7d13-ddb9-4aa6-8041-48f65d646169-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0f9d7d13-ddb9-4aa6-8041-48f65d646169\") " pod="openstack/openstackclient" Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.575258 4779 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0f9d7d13-ddb9-4aa6-8041-48f65d646169-openstack-config\") pod \"openstackclient\" (UID: \"0f9d7d13-ddb9-4aa6-8041-48f65d646169\") " pod="openstack/openstackclient" Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.677483 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0f9d7d13-ddb9-4aa6-8041-48f65d646169-openstack-config\") pod \"openstackclient\" (UID: \"0f9d7d13-ddb9-4aa6-8041-48f65d646169\") " pod="openstack/openstackclient" Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.677566 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0f9d7d13-ddb9-4aa6-8041-48f65d646169-openstack-config-secret\") pod \"openstackclient\" (UID: \"0f9d7d13-ddb9-4aa6-8041-48f65d646169\") " pod="openstack/openstackclient" Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.677594 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jzp5\" (UniqueName: \"kubernetes.io/projected/0f9d7d13-ddb9-4aa6-8041-48f65d646169-kube-api-access-5jzp5\") pod \"openstackclient\" (UID: \"0f9d7d13-ddb9-4aa6-8041-48f65d646169\") " pod="openstack/openstackclient" Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.677611 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f9d7d13-ddb9-4aa6-8041-48f65d646169-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0f9d7d13-ddb9-4aa6-8041-48f65d646169\") " pod="openstack/openstackclient" Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.678269 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/0f9d7d13-ddb9-4aa6-8041-48f65d646169-openstack-config\") pod \"openstackclient\" (UID: \"0f9d7d13-ddb9-4aa6-8041-48f65d646169\") " pod="openstack/openstackclient" Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.686896 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0f9d7d13-ddb9-4aa6-8041-48f65d646169-openstack-config-secret\") pod \"openstackclient\" (UID: \"0f9d7d13-ddb9-4aa6-8041-48f65d646169\") " pod="openstack/openstackclient" Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.688431 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7409303d-4c67-4f93-b2c8-c6be3336df88","Type":"ContainerStarted","Data":"d12ae1ee6453535a11a6bf129279594899ed590d8f01a7b9c63cd554cd306a07"} Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.694722 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f9d7d13-ddb9-4aa6-8041-48f65d646169-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0f9d7d13-ddb9-4aa6-8041-48f65d646169\") " pod="openstack/openstackclient" Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.711697 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jzp5\" (UniqueName: \"kubernetes.io/projected/0f9d7d13-ddb9-4aa6-8041-48f65d646169-kube-api-access-5jzp5\") pod \"openstackclient\" (UID: \"0f9d7d13-ddb9-4aa6-8041-48f65d646169\") " pod="openstack/openstackclient" Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.724684 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.724664454 podStartE2EDuration="3.724664454s" podCreationTimestamp="2026-03-20 15:46:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 15:46:07.714062013 +0000 UTC m=+1384.676577823" watchObservedRunningTime="2026-03-20 15:46:07.724664454 +0000 UTC m=+1384.687180254" Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.730445 4779 generic.go:334] "Generic (PLEG): container finished" podID="9752eda4-2b46-4251-bec7-5cbca2f6fd57" containerID="8e8c1bfb539dfe38c7d7a0504be76c3ab9f7a02428095eac04289b172bc61d31" exitCode=0 Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.730536 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86f74cdcdf-tmrx6" event={"ID":"9752eda4-2b46-4251-bec7-5cbca2f6fd57","Type":"ContainerDied","Data":"8e8c1bfb539dfe38c7d7a0504be76c3ab9f7a02428095eac04289b172bc61d31"} Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.737929 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567026-95m4d" event={"ID":"05082fa7-f09c-4834-a745-aae20aad01c2","Type":"ContainerDied","Data":"831e32d80a9ea67d7a053720db35dfad5c56f3d9b65641e6e03c501117d10c51"} Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.737989 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="831e32d80a9ea67d7a053720db35dfad5c56f3d9b65641e6e03c501117d10c51" Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.738065 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567026-95m4d" Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.743934 4779 generic.go:334] "Generic (PLEG): container finished" podID="f6de60e1-9be1-4bb6-9d26-0b496117ed20" containerID="7ff747c74330698ccefe1a731244c171b039473c2090a2067e21822cd9b07277" exitCode=0 Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.743976 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f86b78896-7tfsm" event={"ID":"f6de60e1-9be1-4bb6-9d26-0b496117ed20","Type":"ContainerDied","Data":"7ff747c74330698ccefe1a731244c171b039473c2090a2067e21822cd9b07277"} Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.799064 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.799872 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.843877 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.843937 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.849693 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.856805 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.978545 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.985739 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5191a987-7d86-4738-b0d2-e56edcff519e-openstack-config-secret\") pod \"openstackclient\" (UID: \"5191a987-7d86-4738-b0d2-e56edcff519e\") " pod="openstack/openstackclient" Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.985928 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5191a987-7d86-4738-b0d2-e56edcff519e-openstack-config\") pod \"openstackclient\" (UID: \"5191a987-7d86-4738-b0d2-e56edcff519e\") " pod="openstack/openstackclient" Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.986135 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5191a987-7d86-4738-b0d2-e56edcff519e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5191a987-7d86-4738-b0d2-e56edcff519e\") " pod="openstack/openstackclient" Mar 20 15:46:07 crc kubenswrapper[4779]: I0320 15:46:07.986163 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6qgq\" (UniqueName: \"kubernetes.io/projected/5191a987-7d86-4738-b0d2-e56edcff519e-kube-api-access-p6qgq\") pod \"openstackclient\" (UID: \"5191a987-7d86-4738-b0d2-e56edcff519e\") " pod="openstack/openstackclient" Mar 20 15:46:08 crc kubenswrapper[4779]: E0320 15:46:08.072287 
4779 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 20 15:46:08 crc kubenswrapper[4779]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_0f9d7d13-ddb9-4aa6-8041-48f65d646169_0(7e6529d7b14bd50825af798dfb1d6d5e518197211d66e8c68c6fd45ee31e6e38): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"7e6529d7b14bd50825af798dfb1d6d5e518197211d66e8c68c6fd45ee31e6e38" Netns:"/var/run/netns/ba3340f8-a4fd-4757-97e7-008291cd98a8" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=7e6529d7b14bd50825af798dfb1d6d5e518197211d66e8c68c6fd45ee31e6e38;K8S_POD_UID=0f9d7d13-ddb9-4aa6-8041-48f65d646169" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/0f9d7d13-ddb9-4aa6-8041-48f65d646169]: expected pod UID "0f9d7d13-ddb9-4aa6-8041-48f65d646169" but got "5191a987-7d86-4738-b0d2-e56edcff519e" from Kube API Mar 20 15:46:08 crc kubenswrapper[4779]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 15:46:08 crc kubenswrapper[4779]: > Mar 20 15:46:08 crc kubenswrapper[4779]: E0320 15:46:08.072381 4779 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 20 15:46:08 crc kubenswrapper[4779]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_openstackclient_openstack_0f9d7d13-ddb9-4aa6-8041-48f65d646169_0(7e6529d7b14bd50825af798dfb1d6d5e518197211d66e8c68c6fd45ee31e6e38): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"7e6529d7b14bd50825af798dfb1d6d5e518197211d66e8c68c6fd45ee31e6e38" Netns:"/var/run/netns/ba3340f8-a4fd-4757-97e7-008291cd98a8" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=7e6529d7b14bd50825af798dfb1d6d5e518197211d66e8c68c6fd45ee31e6e38;K8S_POD_UID=0f9d7d13-ddb9-4aa6-8041-48f65d646169" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/0f9d7d13-ddb9-4aa6-8041-48f65d646169]: expected pod UID "0f9d7d13-ddb9-4aa6-8041-48f65d646169" but got "5191a987-7d86-4738-b0d2-e56edcff519e" from Kube API Mar 20 15:46:08 crc kubenswrapper[4779]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 15:46:08 crc kubenswrapper[4779]: > pod="openstack/openstackclient" Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.090391 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5191a987-7d86-4738-b0d2-e56edcff519e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5191a987-7d86-4738-b0d2-e56edcff519e\") " pod="openstack/openstackclient" Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.090441 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-p6qgq\" (UniqueName: \"kubernetes.io/projected/5191a987-7d86-4738-b0d2-e56edcff519e-kube-api-access-p6qgq\") pod \"openstackclient\" (UID: \"5191a987-7d86-4738-b0d2-e56edcff519e\") " pod="openstack/openstackclient" Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.090492 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5191a987-7d86-4738-b0d2-e56edcff519e-openstack-config-secret\") pod \"openstackclient\" (UID: \"5191a987-7d86-4738-b0d2-e56edcff519e\") " pod="openstack/openstackclient" Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.090570 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5191a987-7d86-4738-b0d2-e56edcff519e-openstack-config\") pod \"openstackclient\" (UID: \"5191a987-7d86-4738-b0d2-e56edcff519e\") " pod="openstack/openstackclient" Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.091531 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5191a987-7d86-4738-b0d2-e56edcff519e-openstack-config\") pod \"openstackclient\" (UID: \"5191a987-7d86-4738-b0d2-e56edcff519e\") " pod="openstack/openstackclient" Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.100551 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5191a987-7d86-4738-b0d2-e56edcff519e-openstack-config-secret\") pod \"openstackclient\" (UID: \"5191a987-7d86-4738-b0d2-e56edcff519e\") " pod="openstack/openstackclient" Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.104698 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5191a987-7d86-4738-b0d2-e56edcff519e-combined-ca-bundle\") pod \"openstackclient\" (UID: 
\"5191a987-7d86-4738-b0d2-e56edcff519e\") " pod="openstack/openstackclient" Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.117265 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6qgq\" (UniqueName: \"kubernetes.io/projected/5191a987-7d86-4738-b0d2-e56edcff519e-kube-api-access-p6qgq\") pod \"openstackclient\" (UID: \"5191a987-7d86-4738-b0d2-e56edcff519e\") " pod="openstack/openstackclient" Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.252583 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.324175 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567020-n5xz4"] Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.389710 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567020-n5xz4"] Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.418301 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-86f74cdcdf-tmrx6" Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.464753 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6d97b75b9b-swgzv" Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.607713 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83d44457-fd0e-462a-b293-65eec2238d7b-logs\") pod \"83d44457-fd0e-462a-b293-65eec2238d7b\" (UID: \"83d44457-fd0e-462a-b293-65eec2238d7b\") " Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.607823 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6fb8\" (UniqueName: \"kubernetes.io/projected/83d44457-fd0e-462a-b293-65eec2238d7b-kube-api-access-d6fb8\") pod \"83d44457-fd0e-462a-b293-65eec2238d7b\" (UID: \"83d44457-fd0e-462a-b293-65eec2238d7b\") " Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.607862 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9752eda4-2b46-4251-bec7-5cbca2f6fd57-config\") pod \"9752eda4-2b46-4251-bec7-5cbca2f6fd57\" (UID: \"9752eda4-2b46-4251-bec7-5cbca2f6fd57\") " Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.607891 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9752eda4-2b46-4251-bec7-5cbca2f6fd57-public-tls-certs\") pod \"9752eda4-2b46-4251-bec7-5cbca2f6fd57\" (UID: \"9752eda4-2b46-4251-bec7-5cbca2f6fd57\") " Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.607962 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83d44457-fd0e-462a-b293-65eec2238d7b-config-data\") pod \"83d44457-fd0e-462a-b293-65eec2238d7b\" (UID: \"83d44457-fd0e-462a-b293-65eec2238d7b\") " Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.608036 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9752eda4-2b46-4251-bec7-5cbca2f6fd57-internal-tls-certs\") pod \"9752eda4-2b46-4251-bec7-5cbca2f6fd57\" (UID: \"9752eda4-2b46-4251-bec7-5cbca2f6fd57\") " Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.608078 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83d44457-fd0e-462a-b293-65eec2238d7b-public-tls-certs\") pod \"83d44457-fd0e-462a-b293-65eec2238d7b\" (UID: \"83d44457-fd0e-462a-b293-65eec2238d7b\") " Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.608121 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9752eda4-2b46-4251-bec7-5cbca2f6fd57-combined-ca-bundle\") pod \"9752eda4-2b46-4251-bec7-5cbca2f6fd57\" (UID: \"9752eda4-2b46-4251-bec7-5cbca2f6fd57\") " Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.608141 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggwcl\" (UniqueName: \"kubernetes.io/projected/9752eda4-2b46-4251-bec7-5cbca2f6fd57-kube-api-access-ggwcl\") pod \"9752eda4-2b46-4251-bec7-5cbca2f6fd57\" (UID: \"9752eda4-2b46-4251-bec7-5cbca2f6fd57\") " Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.608271 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9752eda4-2b46-4251-bec7-5cbca2f6fd57-httpd-config\") pod \"9752eda4-2b46-4251-bec7-5cbca2f6fd57\" (UID: \"9752eda4-2b46-4251-bec7-5cbca2f6fd57\") " Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.608335 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9752eda4-2b46-4251-bec7-5cbca2f6fd57-ovndb-tls-certs\") pod \"9752eda4-2b46-4251-bec7-5cbca2f6fd57\" (UID: \"9752eda4-2b46-4251-bec7-5cbca2f6fd57\") " Mar 20 15:46:08 crc kubenswrapper[4779]: 
I0320 15:46:08.608352 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83d44457-fd0e-462a-b293-65eec2238d7b-combined-ca-bundle\") pod \"83d44457-fd0e-462a-b293-65eec2238d7b\" (UID: \"83d44457-fd0e-462a-b293-65eec2238d7b\") " Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.608378 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83d44457-fd0e-462a-b293-65eec2238d7b-internal-tls-certs\") pod \"83d44457-fd0e-462a-b293-65eec2238d7b\" (UID: \"83d44457-fd0e-462a-b293-65eec2238d7b\") " Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.608413 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83d44457-fd0e-462a-b293-65eec2238d7b-scripts\") pod \"83d44457-fd0e-462a-b293-65eec2238d7b\" (UID: \"83d44457-fd0e-462a-b293-65eec2238d7b\") " Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.609862 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83d44457-fd0e-462a-b293-65eec2238d7b-logs" (OuterVolumeSpecName: "logs") pod "83d44457-fd0e-462a-b293-65eec2238d7b" (UID: "83d44457-fd0e-462a-b293-65eec2238d7b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.628400 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83d44457-fd0e-462a-b293-65eec2238d7b-kube-api-access-d6fb8" (OuterVolumeSpecName: "kube-api-access-d6fb8") pod "83d44457-fd0e-462a-b293-65eec2238d7b" (UID: "83d44457-fd0e-462a-b293-65eec2238d7b"). InnerVolumeSpecName "kube-api-access-d6fb8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.640745 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9752eda4-2b46-4251-bec7-5cbca2f6fd57-kube-api-access-ggwcl" (OuterVolumeSpecName: "kube-api-access-ggwcl") pod "9752eda4-2b46-4251-bec7-5cbca2f6fd57" (UID: "9752eda4-2b46-4251-bec7-5cbca2f6fd57"). InnerVolumeSpecName "kube-api-access-ggwcl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.640833 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83d44457-fd0e-462a-b293-65eec2238d7b-scripts" (OuterVolumeSpecName: "scripts") pod "83d44457-fd0e-462a-b293-65eec2238d7b" (UID: "83d44457-fd0e-462a-b293-65eec2238d7b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.640880 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9752eda4-2b46-4251-bec7-5cbca2f6fd57-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "9752eda4-2b46-4251-bec7-5cbca2f6fd57" (UID: "9752eda4-2b46-4251-bec7-5cbca2f6fd57"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.711088 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6fb8\" (UniqueName: \"kubernetes.io/projected/83d44457-fd0e-462a-b293-65eec2238d7b-kube-api-access-d6fb8\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.711152 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggwcl\" (UniqueName: \"kubernetes.io/projected/9752eda4-2b46-4251-bec7-5cbca2f6fd57-kube-api-access-ggwcl\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.711165 4779 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9752eda4-2b46-4251-bec7-5cbca2f6fd57-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.711175 4779 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83d44457-fd0e-462a-b293-65eec2238d7b-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.711187 4779 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83d44457-fd0e-462a-b293-65eec2238d7b-logs\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.719347 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9752eda4-2b46-4251-bec7-5cbca2f6fd57-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9752eda4-2b46-4251-bec7-5cbca2f6fd57" (UID: "9752eda4-2b46-4251-bec7-5cbca2f6fd57"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.787714 4779 generic.go:334] "Generic (PLEG): container finished" podID="83d44457-fd0e-462a-b293-65eec2238d7b" containerID="0da158073c717f6594644c6e922de7c5ab2e60fc1422e809a7fb8fd76bea1b1c" exitCode=0 Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.787790 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d97b75b9b-swgzv" event={"ID":"83d44457-fd0e-462a-b293-65eec2238d7b","Type":"ContainerDied","Data":"0da158073c717f6594644c6e922de7c5ab2e60fc1422e809a7fb8fd76bea1b1c"} Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.787850 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d97b75b9b-swgzv" event={"ID":"83d44457-fd0e-462a-b293-65eec2238d7b","Type":"ContainerDied","Data":"d35173b68721f0b66623fe0581f162d2c79a6bd02df601e086ffb3d100373535"} Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.787870 4779 scope.go:117] "RemoveContainer" containerID="0da158073c717f6594644c6e922de7c5ab2e60fc1422e809a7fb8fd76bea1b1c" Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.788019 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6d97b75b9b-swgzv" Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.801443 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9752eda4-2b46-4251-bec7-5cbca2f6fd57-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9752eda4-2b46-4251-bec7-5cbca2f6fd57" (UID: "9752eda4-2b46-4251-bec7-5cbca2f6fd57"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.804801 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.805583 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-86f74cdcdf-tmrx6" Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.806049 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86f74cdcdf-tmrx6" event={"ID":"9752eda4-2b46-4251-bec7-5cbca2f6fd57","Type":"ContainerDied","Data":"69c87555657377b80c29de56354670bb12217637843c097d42fa2fda2851052e"} Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.810061 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.814844 4779 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9752eda4-2b46-4251-bec7-5cbca2f6fd57-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.814888 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9752eda4-2b46-4251-bec7-5cbca2f6fd57-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.853951 4779 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="0f9d7d13-ddb9-4aa6-8041-48f65d646169" podUID="5191a987-7d86-4738-b0d2-e56edcff519e" Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.869745 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9752eda4-2b46-4251-bec7-5cbca2f6fd57-config" (OuterVolumeSpecName: "config") pod "9752eda4-2b46-4251-bec7-5cbca2f6fd57" (UID: "9752eda4-2b46-4251-bec7-5cbca2f6fd57"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.893289 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83d44457-fd0e-462a-b293-65eec2238d7b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "83d44457-fd0e-462a-b293-65eec2238d7b" (UID: "83d44457-fd0e-462a-b293-65eec2238d7b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.918314 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83d44457-fd0e-462a-b293-65eec2238d7b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.918341 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/9752eda4-2b46-4251-bec7-5cbca2f6fd57-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.938243 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9752eda4-2b46-4251-bec7-5cbca2f6fd57-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "9752eda4-2b46-4251-bec7-5cbca2f6fd57" (UID: "9752eda4-2b46-4251-bec7-5cbca2f6fd57"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:08 crc kubenswrapper[4779]: I0320 15:46:08.964214 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9752eda4-2b46-4251-bec7-5cbca2f6fd57-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9752eda4-2b46-4251-bec7-5cbca2f6fd57" (UID: "9752eda4-2b46-4251-bec7-5cbca2f6fd57"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.000208 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83d44457-fd0e-462a-b293-65eec2238d7b-config-data" (OuterVolumeSpecName: "config-data") pod "83d44457-fd0e-462a-b293-65eec2238d7b" (UID: "83d44457-fd0e-462a-b293-65eec2238d7b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.020461 4779 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9752eda4-2b46-4251-bec7-5cbca2f6fd57-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.020497 4779 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9752eda4-2b46-4251-bec7-5cbca2f6fd57-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.020507 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83d44457-fd0e-462a-b293-65eec2238d7b-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.023350 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83d44457-fd0e-462a-b293-65eec2238d7b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "83d44457-fd0e-462a-b293-65eec2238d7b" (UID: "83d44457-fd0e-462a-b293-65eec2238d7b"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.023368 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83d44457-fd0e-462a-b293-65eec2238d7b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "83d44457-fd0e-462a-b293-65eec2238d7b" (UID: "83d44457-fd0e-462a-b293-65eec2238d7b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.051224 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.066039 4779 scope.go:117] "RemoveContainer" containerID="9c787247cd4be3389dc5f475c53ce3577e3f1cca73903a0cf4c7409e281b1b03" Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.093540 4779 scope.go:117] "RemoveContainer" containerID="0da158073c717f6594644c6e922de7c5ab2e60fc1422e809a7fb8fd76bea1b1c" Mar 20 15:46:09 crc kubenswrapper[4779]: E0320 15:46:09.094987 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0da158073c717f6594644c6e922de7c5ab2e60fc1422e809a7fb8fd76bea1b1c\": container with ID starting with 0da158073c717f6594644c6e922de7c5ab2e60fc1422e809a7fb8fd76bea1b1c not found: ID does not exist" containerID="0da158073c717f6594644c6e922de7c5ab2e60fc1422e809a7fb8fd76bea1b1c" Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.095024 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0da158073c717f6594644c6e922de7c5ab2e60fc1422e809a7fb8fd76bea1b1c"} err="failed to get container status \"0da158073c717f6594644c6e922de7c5ab2e60fc1422e809a7fb8fd76bea1b1c\": rpc error: code = NotFound desc = could not find container \"0da158073c717f6594644c6e922de7c5ab2e60fc1422e809a7fb8fd76bea1b1c\": container with ID starting with 
0da158073c717f6594644c6e922de7c5ab2e60fc1422e809a7fb8fd76bea1b1c not found: ID does not exist" Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.095048 4779 scope.go:117] "RemoveContainer" containerID="9c787247cd4be3389dc5f475c53ce3577e3f1cca73903a0cf4c7409e281b1b03" Mar 20 15:46:09 crc kubenswrapper[4779]: E0320 15:46:09.104263 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c787247cd4be3389dc5f475c53ce3577e3f1cca73903a0cf4c7409e281b1b03\": container with ID starting with 9c787247cd4be3389dc5f475c53ce3577e3f1cca73903a0cf4c7409e281b1b03 not found: ID does not exist" containerID="9c787247cd4be3389dc5f475c53ce3577e3f1cca73903a0cf4c7409e281b1b03" Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.104327 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c787247cd4be3389dc5f475c53ce3577e3f1cca73903a0cf4c7409e281b1b03"} err="failed to get container status \"9c787247cd4be3389dc5f475c53ce3577e3f1cca73903a0cf4c7409e281b1b03\": rpc error: code = NotFound desc = could not find container \"9c787247cd4be3389dc5f475c53ce3577e3f1cca73903a0cf4c7409e281b1b03\": container with ID starting with 9c787247cd4be3389dc5f475c53ce3577e3f1cca73903a0cf4c7409e281b1b03 not found: ID does not exist" Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.104357 4779 scope.go:117] "RemoveContainer" containerID="e1bd8b8b7b0f4aa151ce9453673b2139df9175d3b89e268569546efe778b1ca3" Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.134254 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6d97b75b9b-swgzv"] Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.136297 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f9d7d13-ddb9-4aa6-8041-48f65d646169-combined-ca-bundle\") pod \"0f9d7d13-ddb9-4aa6-8041-48f65d646169\" (UID: 
\"0f9d7d13-ddb9-4aa6-8041-48f65d646169\") " Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.136371 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jzp5\" (UniqueName: \"kubernetes.io/projected/0f9d7d13-ddb9-4aa6-8041-48f65d646169-kube-api-access-5jzp5\") pod \"0f9d7d13-ddb9-4aa6-8041-48f65d646169\" (UID: \"0f9d7d13-ddb9-4aa6-8041-48f65d646169\") " Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.136477 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0f9d7d13-ddb9-4aa6-8041-48f65d646169-openstack-config-secret\") pod \"0f9d7d13-ddb9-4aa6-8041-48f65d646169\" (UID: \"0f9d7d13-ddb9-4aa6-8041-48f65d646169\") " Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.136557 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0f9d7d13-ddb9-4aa6-8041-48f65d646169-openstack-config\") pod \"0f9d7d13-ddb9-4aa6-8041-48f65d646169\" (UID: \"0f9d7d13-ddb9-4aa6-8041-48f65d646169\") " Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.136830 4779 scope.go:117] "RemoveContainer" containerID="8e8c1bfb539dfe38c7d7a0504be76c3ab9f7a02428095eac04289b172bc61d31" Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.140005 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f9d7d13-ddb9-4aa6-8041-48f65d646169-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "0f9d7d13-ddb9-4aa6-8041-48f65d646169" (UID: "0f9d7d13-ddb9-4aa6-8041-48f65d646169"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.141862 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f9d7d13-ddb9-4aa6-8041-48f65d646169-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "0f9d7d13-ddb9-4aa6-8041-48f65d646169" (UID: "0f9d7d13-ddb9-4aa6-8041-48f65d646169"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.142345 4779 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0f9d7d13-ddb9-4aa6-8041-48f65d646169-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.142373 4779 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83d44457-fd0e-462a-b293-65eec2238d7b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.142384 4779 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0f9d7d13-ddb9-4aa6-8041-48f65d646169-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.142397 4779 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83d44457-fd0e-462a-b293-65eec2238d7b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.146315 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f9d7d13-ddb9-4aa6-8041-48f65d646169-kube-api-access-5jzp5" (OuterVolumeSpecName: "kube-api-access-5jzp5") pod "0f9d7d13-ddb9-4aa6-8041-48f65d646169" (UID: "0f9d7d13-ddb9-4aa6-8041-48f65d646169"). 
InnerVolumeSpecName "kube-api-access-5jzp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.147640 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f9d7d13-ddb9-4aa6-8041-48f65d646169-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f9d7d13-ddb9-4aa6-8041-48f65d646169" (UID: "0f9d7d13-ddb9-4aa6-8041-48f65d646169"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.155562 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6d97b75b9b-swgzv"] Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.169463 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-86f74cdcdf-tmrx6"] Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.195019 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-86f74cdcdf-tmrx6"] Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.244675 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f9d7d13-ddb9-4aa6-8041-48f65d646169-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.244709 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jzp5\" (UniqueName: \"kubernetes.io/projected/0f9d7d13-ddb9-4aa6-8041-48f65d646169-kube-api-access-5jzp5\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.696352 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.823224 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f9d7d13-ddb9-4aa6-8041-48f65d646169" 
path="/var/lib/kubelet/pods/0f9d7d13-ddb9-4aa6-8041-48f65d646169/volumes" Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.823712 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83d44457-fd0e-462a-b293-65eec2238d7b" path="/var/lib/kubelet/pods/83d44457-fd0e-462a-b293-65eec2238d7b/volumes" Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.824527 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9752eda4-2b46-4251-bec7-5cbca2f6fd57" path="/var/lib/kubelet/pods/9752eda4-2b46-4251-bec7-5cbca2f6fd57/volumes" Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.825851 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e34ee163-b675-44d0-bb6f-25c122afd2fc" path="/var/lib/kubelet/pods/e34ee163-b675-44d0-bb6f-25c122afd2fc/volumes" Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.828939 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.829196 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"5191a987-7d86-4738-b0d2-e56edcff519e","Type":"ContainerStarted","Data":"1b3574980fe8d93b1cd8e11400f78a10baced64d4f074cfe0489b5718e431449"} Mar 20 15:46:09 crc kubenswrapper[4779]: I0320 15:46:09.845846 4779 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="0f9d7d13-ddb9-4aa6-8041-48f65d646169" podUID="5191a987-7d86-4738-b0d2-e56edcff519e" Mar 20 15:46:10 crc kubenswrapper[4779]: I0320 15:46:10.238361 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 20 15:46:12 crc kubenswrapper[4779]: I0320 15:46:12.150591 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 20 15:46:12 crc kubenswrapper[4779]: I0320 15:46:12.977695 4779 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Mar 20 15:46:12 crc kubenswrapper[4779]: I0320 15:46:12.988943 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Mar 20 15:46:13 crc kubenswrapper[4779]: I0320 15:46:13.902611 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Mar 20 15:46:14 crc kubenswrapper[4779]: I0320 15:46:14.990889 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7f86b78896-7tfsm" podUID="f6de60e1-9be1-4bb6-9d26-0b496117ed20" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.161:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.161:8443: connect: connection refused" Mar 20 15:46:15 crc kubenswrapper[4779]: I0320 15:46:15.443170 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.720693 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5f8794546c-r7zls"] Mar 20 15:46:18 crc kubenswrapper[4779]: E0320 15:46:18.721464 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83d44457-fd0e-462a-b293-65eec2238d7b" containerName="placement-log" Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.721482 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="83d44457-fd0e-462a-b293-65eec2238d7b" containerName="placement-log" Mar 20 15:46:18 crc kubenswrapper[4779]: E0320 15:46:18.721506 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9752eda4-2b46-4251-bec7-5cbca2f6fd57" containerName="neutron-httpd" Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.721513 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="9752eda4-2b46-4251-bec7-5cbca2f6fd57" containerName="neutron-httpd" Mar 20 15:46:18 crc kubenswrapper[4779]: E0320 
15:46:18.721539 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9752eda4-2b46-4251-bec7-5cbca2f6fd57" containerName="neutron-api" Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.721548 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="9752eda4-2b46-4251-bec7-5cbca2f6fd57" containerName="neutron-api" Mar 20 15:46:18 crc kubenswrapper[4779]: E0320 15:46:18.721567 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83d44457-fd0e-462a-b293-65eec2238d7b" containerName="placement-api" Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.721575 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="83d44457-fd0e-462a-b293-65eec2238d7b" containerName="placement-api" Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.721760 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="83d44457-fd0e-462a-b293-65eec2238d7b" containerName="placement-api" Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.721768 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="9752eda4-2b46-4251-bec7-5cbca2f6fd57" containerName="neutron-httpd" Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.721779 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="9752eda4-2b46-4251-bec7-5cbca2f6fd57" containerName="neutron-api" Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.721792 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="83d44457-fd0e-462a-b293-65eec2238d7b" containerName="placement-log" Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.722825 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5f8794546c-r7zls" Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.725790 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.726693 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.748365 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.748584 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5f8794546c-r7zls"] Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.834179 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad4b673b-e65e-49d0-92bd-138da686c2eb-combined-ca-bundle\") pod \"swift-proxy-5f8794546c-r7zls\" (UID: \"ad4b673b-e65e-49d0-92bd-138da686c2eb\") " pod="openstack/swift-proxy-5f8794546c-r7zls" Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.834263 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad4b673b-e65e-49d0-92bd-138da686c2eb-internal-tls-certs\") pod \"swift-proxy-5f8794546c-r7zls\" (UID: \"ad4b673b-e65e-49d0-92bd-138da686c2eb\") " pod="openstack/swift-proxy-5f8794546c-r7zls" Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.834313 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad4b673b-e65e-49d0-92bd-138da686c2eb-public-tls-certs\") pod \"swift-proxy-5f8794546c-r7zls\" (UID: \"ad4b673b-e65e-49d0-92bd-138da686c2eb\") " pod="openstack/swift-proxy-5f8794546c-r7zls" Mar 20 15:46:18 crc 
kubenswrapper[4779]: I0320 15:46:18.834361 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad4b673b-e65e-49d0-92bd-138da686c2eb-log-httpd\") pod \"swift-proxy-5f8794546c-r7zls\" (UID: \"ad4b673b-e65e-49d0-92bd-138da686c2eb\") " pod="openstack/swift-proxy-5f8794546c-r7zls" Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.834510 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9m72\" (UniqueName: \"kubernetes.io/projected/ad4b673b-e65e-49d0-92bd-138da686c2eb-kube-api-access-v9m72\") pod \"swift-proxy-5f8794546c-r7zls\" (UID: \"ad4b673b-e65e-49d0-92bd-138da686c2eb\") " pod="openstack/swift-proxy-5f8794546c-r7zls" Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.834551 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad4b673b-e65e-49d0-92bd-138da686c2eb-config-data\") pod \"swift-proxy-5f8794546c-r7zls\" (UID: \"ad4b673b-e65e-49d0-92bd-138da686c2eb\") " pod="openstack/swift-proxy-5f8794546c-r7zls" Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.834596 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad4b673b-e65e-49d0-92bd-138da686c2eb-run-httpd\") pod \"swift-proxy-5f8794546c-r7zls\" (UID: \"ad4b673b-e65e-49d0-92bd-138da686c2eb\") " pod="openstack/swift-proxy-5f8794546c-r7zls" Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.834621 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ad4b673b-e65e-49d0-92bd-138da686c2eb-etc-swift\") pod \"swift-proxy-5f8794546c-r7zls\" (UID: \"ad4b673b-e65e-49d0-92bd-138da686c2eb\") " pod="openstack/swift-proxy-5f8794546c-r7zls" Mar 
20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.878024 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.878417 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e1733660-4123-48e3-a604-7bd5cce44699" containerName="ceilometer-central-agent" containerID="cri-o://a2c4d40badf7dd445975d9694cf7736dc929029711427667c4fbb33ffa03be89" gracePeriod=30 Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.879276 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e1733660-4123-48e3-a604-7bd5cce44699" containerName="proxy-httpd" containerID="cri-o://8d05b28a51c5679160b367dde9fb8cbb6989e511c30d1f673cf752a56206ec7a" gracePeriod=30 Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.879341 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e1733660-4123-48e3-a604-7bd5cce44699" containerName="sg-core" containerID="cri-o://98664af4d2e8ce4b269c1fb27c7b04c426a5a694708cf4027c200df3a2e34448" gracePeriod=30 Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.879387 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e1733660-4123-48e3-a604-7bd5cce44699" containerName="ceilometer-notification-agent" containerID="cri-o://eaf6d73c6a0a605deee22e07d7dc61acddfb0883fc75267d49f92175f91e3c38" gracePeriod=30 Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.927486 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="e1733660-4123-48e3-a604-7bd5cce44699" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.185:3000/\": EOF" Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.936139 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-v9m72\" (UniqueName: \"kubernetes.io/projected/ad4b673b-e65e-49d0-92bd-138da686c2eb-kube-api-access-v9m72\") pod \"swift-proxy-5f8794546c-r7zls\" (UID: \"ad4b673b-e65e-49d0-92bd-138da686c2eb\") " pod="openstack/swift-proxy-5f8794546c-r7zls" Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.936216 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad4b673b-e65e-49d0-92bd-138da686c2eb-config-data\") pod \"swift-proxy-5f8794546c-r7zls\" (UID: \"ad4b673b-e65e-49d0-92bd-138da686c2eb\") " pod="openstack/swift-proxy-5f8794546c-r7zls" Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.936256 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad4b673b-e65e-49d0-92bd-138da686c2eb-run-httpd\") pod \"swift-proxy-5f8794546c-r7zls\" (UID: \"ad4b673b-e65e-49d0-92bd-138da686c2eb\") " pod="openstack/swift-proxy-5f8794546c-r7zls" Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.936276 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ad4b673b-e65e-49d0-92bd-138da686c2eb-etc-swift\") pod \"swift-proxy-5f8794546c-r7zls\" (UID: \"ad4b673b-e65e-49d0-92bd-138da686c2eb\") " pod="openstack/swift-proxy-5f8794546c-r7zls" Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.936350 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad4b673b-e65e-49d0-92bd-138da686c2eb-combined-ca-bundle\") pod \"swift-proxy-5f8794546c-r7zls\" (UID: \"ad4b673b-e65e-49d0-92bd-138da686c2eb\") " pod="openstack/swift-proxy-5f8794546c-r7zls" Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.936412 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ad4b673b-e65e-49d0-92bd-138da686c2eb-internal-tls-certs\") pod \"swift-proxy-5f8794546c-r7zls\" (UID: \"ad4b673b-e65e-49d0-92bd-138da686c2eb\") " pod="openstack/swift-proxy-5f8794546c-r7zls" Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.936436 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad4b673b-e65e-49d0-92bd-138da686c2eb-public-tls-certs\") pod \"swift-proxy-5f8794546c-r7zls\" (UID: \"ad4b673b-e65e-49d0-92bd-138da686c2eb\") " pod="openstack/swift-proxy-5f8794546c-r7zls" Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.936476 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad4b673b-e65e-49d0-92bd-138da686c2eb-log-httpd\") pod \"swift-proxy-5f8794546c-r7zls\" (UID: \"ad4b673b-e65e-49d0-92bd-138da686c2eb\") " pod="openstack/swift-proxy-5f8794546c-r7zls" Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.937051 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad4b673b-e65e-49d0-92bd-138da686c2eb-log-httpd\") pod \"swift-proxy-5f8794546c-r7zls\" (UID: \"ad4b673b-e65e-49d0-92bd-138da686c2eb\") " pod="openstack/swift-proxy-5f8794546c-r7zls" Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.937786 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad4b673b-e65e-49d0-92bd-138da686c2eb-run-httpd\") pod \"swift-proxy-5f8794546c-r7zls\" (UID: \"ad4b673b-e65e-49d0-92bd-138da686c2eb\") " pod="openstack/swift-proxy-5f8794546c-r7zls" Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.944999 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad4b673b-e65e-49d0-92bd-138da686c2eb-combined-ca-bundle\") pod 
\"swift-proxy-5f8794546c-r7zls\" (UID: \"ad4b673b-e65e-49d0-92bd-138da686c2eb\") " pod="openstack/swift-proxy-5f8794546c-r7zls" Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.946072 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad4b673b-e65e-49d0-92bd-138da686c2eb-config-data\") pod \"swift-proxy-5f8794546c-r7zls\" (UID: \"ad4b673b-e65e-49d0-92bd-138da686c2eb\") " pod="openstack/swift-proxy-5f8794546c-r7zls" Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.949791 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad4b673b-e65e-49d0-92bd-138da686c2eb-internal-tls-certs\") pod \"swift-proxy-5f8794546c-r7zls\" (UID: \"ad4b673b-e65e-49d0-92bd-138da686c2eb\") " pod="openstack/swift-proxy-5f8794546c-r7zls" Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.965340 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9m72\" (UniqueName: \"kubernetes.io/projected/ad4b673b-e65e-49d0-92bd-138da686c2eb-kube-api-access-v9m72\") pod \"swift-proxy-5f8794546c-r7zls\" (UID: \"ad4b673b-e65e-49d0-92bd-138da686c2eb\") " pod="openstack/swift-proxy-5f8794546c-r7zls" Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.982807 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad4b673b-e65e-49d0-92bd-138da686c2eb-public-tls-certs\") pod \"swift-proxy-5f8794546c-r7zls\" (UID: \"ad4b673b-e65e-49d0-92bd-138da686c2eb\") " pod="openstack/swift-proxy-5f8794546c-r7zls" Mar 20 15:46:18 crc kubenswrapper[4779]: I0320 15:46:18.982820 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ad4b673b-e65e-49d0-92bd-138da686c2eb-etc-swift\") pod \"swift-proxy-5f8794546c-r7zls\" (UID: \"ad4b673b-e65e-49d0-92bd-138da686c2eb\") " 
pod="openstack/swift-proxy-5f8794546c-r7zls" Mar 20 15:46:19 crc kubenswrapper[4779]: I0320 15:46:19.093710 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5f8794546c-r7zls" Mar 20 15:46:19 crc kubenswrapper[4779]: I0320 15:46:19.976033 4779 generic.go:334] "Generic (PLEG): container finished" podID="e1733660-4123-48e3-a604-7bd5cce44699" containerID="8d05b28a51c5679160b367dde9fb8cbb6989e511c30d1f673cf752a56206ec7a" exitCode=0 Mar 20 15:46:19 crc kubenswrapper[4779]: I0320 15:46:19.976432 4779 generic.go:334] "Generic (PLEG): container finished" podID="e1733660-4123-48e3-a604-7bd5cce44699" containerID="98664af4d2e8ce4b269c1fb27c7b04c426a5a694708cf4027c200df3a2e34448" exitCode=2 Mar 20 15:46:19 crc kubenswrapper[4779]: I0320 15:46:19.976447 4779 generic.go:334] "Generic (PLEG): container finished" podID="e1733660-4123-48e3-a604-7bd5cce44699" containerID="eaf6d73c6a0a605deee22e07d7dc61acddfb0883fc75267d49f92175f91e3c38" exitCode=0 Mar 20 15:46:19 crc kubenswrapper[4779]: I0320 15:46:19.976454 4779 generic.go:334] "Generic (PLEG): container finished" podID="e1733660-4123-48e3-a604-7bd5cce44699" containerID="a2c4d40badf7dd445975d9694cf7736dc929029711427667c4fbb33ffa03be89" exitCode=0 Mar 20 15:46:19 crc kubenswrapper[4779]: I0320 15:46:19.976230 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1733660-4123-48e3-a604-7bd5cce44699","Type":"ContainerDied","Data":"8d05b28a51c5679160b367dde9fb8cbb6989e511c30d1f673cf752a56206ec7a"} Mar 20 15:46:19 crc kubenswrapper[4779]: I0320 15:46:19.976491 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1733660-4123-48e3-a604-7bd5cce44699","Type":"ContainerDied","Data":"98664af4d2e8ce4b269c1fb27c7b04c426a5a694708cf4027c200df3a2e34448"} Mar 20 15:46:19 crc kubenswrapper[4779]: I0320 15:46:19.976504 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e1733660-4123-48e3-a604-7bd5cce44699","Type":"ContainerDied","Data":"eaf6d73c6a0a605deee22e07d7dc61acddfb0883fc75267d49f92175f91e3c38"} Mar 20 15:46:19 crc kubenswrapper[4779]: I0320 15:46:19.976514 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1733660-4123-48e3-a604-7bd5cce44699","Type":"ContainerDied","Data":"a2c4d40badf7dd445975d9694cf7736dc929029711427667c4fbb33ffa03be89"} Mar 20 15:46:21 crc kubenswrapper[4779]: I0320 15:46:21.969127 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.007986 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"5191a987-7d86-4738-b0d2-e56edcff519e","Type":"ContainerStarted","Data":"03576ebb5bcc02ae72e50c07f28d4566732e294873d5f7c2abba6628cbec0a87"} Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.028218 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1733660-4123-48e3-a604-7bd5cce44699","Type":"ContainerDied","Data":"ad1d0d92a8eaeb7f25dc0163502e6733e3a42983398c2af9b346457e3578b34b"} Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.028296 4779 scope.go:117] "RemoveContainer" containerID="8d05b28a51c5679160b367dde9fb8cbb6989e511c30d1f673cf752a56206ec7a" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.028529 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.060327 4779 scope.go:117] "RemoveContainer" containerID="98664af4d2e8ce4b269c1fb27c7b04c426a5a694708cf4027c200df3a2e34448" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.064442 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.244140142 podStartE2EDuration="15.064419047s" podCreationTimestamp="2026-03-20 15:46:07 +0000 UTC" firstStartedPulling="2026-03-20 15:46:08.860760358 +0000 UTC m=+1385.823276158" lastFinishedPulling="2026-03-20 15:46:21.681039263 +0000 UTC m=+1398.643555063" observedRunningTime="2026-03-20 15:46:22.054158472 +0000 UTC m=+1399.016674272" watchObservedRunningTime="2026-03-20 15:46:22.064419047 +0000 UTC m=+1399.026934847" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.088296 4779 scope.go:117] "RemoveContainer" containerID="eaf6d73c6a0a605deee22e07d7dc61acddfb0883fc75267d49f92175f91e3c38" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.094010 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1733660-4123-48e3-a604-7bd5cce44699-config-data\") pod \"e1733660-4123-48e3-a604-7bd5cce44699\" (UID: \"e1733660-4123-48e3-a604-7bd5cce44699\") " Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.094084 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1733660-4123-48e3-a604-7bd5cce44699-log-httpd\") pod \"e1733660-4123-48e3-a604-7bd5cce44699\" (UID: \"e1733660-4123-48e3-a604-7bd5cce44699\") " Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.094137 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e1733660-4123-48e3-a604-7bd5cce44699-sg-core-conf-yaml\") pod 
\"e1733660-4123-48e3-a604-7bd5cce44699\" (UID: \"e1733660-4123-48e3-a604-7bd5cce44699\") " Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.094186 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1733660-4123-48e3-a604-7bd5cce44699-run-httpd\") pod \"e1733660-4123-48e3-a604-7bd5cce44699\" (UID: \"e1733660-4123-48e3-a604-7bd5cce44699\") " Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.094265 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1733660-4123-48e3-a604-7bd5cce44699-combined-ca-bundle\") pod \"e1733660-4123-48e3-a604-7bd5cce44699\" (UID: \"e1733660-4123-48e3-a604-7bd5cce44699\") " Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.094338 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4smw\" (UniqueName: \"kubernetes.io/projected/e1733660-4123-48e3-a604-7bd5cce44699-kube-api-access-k4smw\") pod \"e1733660-4123-48e3-a604-7bd5cce44699\" (UID: \"e1733660-4123-48e3-a604-7bd5cce44699\") " Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.097396 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1733660-4123-48e3-a604-7bd5cce44699-scripts\") pod \"e1733660-4123-48e3-a604-7bd5cce44699\" (UID: \"e1733660-4123-48e3-a604-7bd5cce44699\") " Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.099654 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1733660-4123-48e3-a604-7bd5cce44699-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e1733660-4123-48e3-a604-7bd5cce44699" (UID: "e1733660-4123-48e3-a604-7bd5cce44699"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.100243 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1733660-4123-48e3-a604-7bd5cce44699-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e1733660-4123-48e3-a604-7bd5cce44699" (UID: "e1733660-4123-48e3-a604-7bd5cce44699"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.114362 4779 scope.go:117] "RemoveContainer" containerID="a2c4d40badf7dd445975d9694cf7736dc929029711427667c4fbb33ffa03be89" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.115352 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1733660-4123-48e3-a604-7bd5cce44699-kube-api-access-k4smw" (OuterVolumeSpecName: "kube-api-access-k4smw") pod "e1733660-4123-48e3-a604-7bd5cce44699" (UID: "e1733660-4123-48e3-a604-7bd5cce44699"). InnerVolumeSpecName "kube-api-access-k4smw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.122837 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1733660-4123-48e3-a604-7bd5cce44699-scripts" (OuterVolumeSpecName: "scripts") pod "e1733660-4123-48e3-a604-7bd5cce44699" (UID: "e1733660-4123-48e3-a604-7bd5cce44699"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.152291 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1733660-4123-48e3-a604-7bd5cce44699-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e1733660-4123-48e3-a604-7bd5cce44699" (UID: "e1733660-4123-48e3-a604-7bd5cce44699"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.192392 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1733660-4123-48e3-a604-7bd5cce44699-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1733660-4123-48e3-a604-7bd5cce44699" (UID: "e1733660-4123-48e3-a604-7bd5cce44699"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.200230 4779 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1733660-4123-48e3-a604-7bd5cce44699-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.200269 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1733660-4123-48e3-a604-7bd5cce44699-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.200284 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4smw\" (UniqueName: \"kubernetes.io/projected/e1733660-4123-48e3-a604-7bd5cce44699-kube-api-access-k4smw\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.200296 4779 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1733660-4123-48e3-a604-7bd5cce44699-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.200310 4779 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1733660-4123-48e3-a604-7bd5cce44699-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.200323 4779 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/e1733660-4123-48e3-a604-7bd5cce44699-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.226089 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1733660-4123-48e3-a604-7bd5cce44699-config-data" (OuterVolumeSpecName: "config-data") pod "e1733660-4123-48e3-a604-7bd5cce44699" (UID: "e1733660-4123-48e3-a604-7bd5cce44699"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.302013 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1733660-4123-48e3-a604-7bd5cce44699-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.362497 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5f8794546c-r7zls"] Mar 20 15:46:22 crc kubenswrapper[4779]: W0320 15:46:22.366272 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad4b673b_e65e_49d0_92bd_138da686c2eb.slice/crio-1535d8c7079cff82790196b2e80df6fc21c4f435339754c5bbb5994606be9744 WatchSource:0}: Error finding container 1535d8c7079cff82790196b2e80df6fc21c4f435339754c5bbb5994606be9744: Status 404 returned error can't find the container with id 1535d8c7079cff82790196b2e80df6fc21c4f435339754c5bbb5994606be9744 Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.377353 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.386345 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.395077 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:46:22 crc kubenswrapper[4779]: E0320 
15:46:22.395480 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1733660-4123-48e3-a604-7bd5cce44699" containerName="proxy-httpd" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.395498 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1733660-4123-48e3-a604-7bd5cce44699" containerName="proxy-httpd" Mar 20 15:46:22 crc kubenswrapper[4779]: E0320 15:46:22.395526 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1733660-4123-48e3-a604-7bd5cce44699" containerName="ceilometer-central-agent" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.395533 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1733660-4123-48e3-a604-7bd5cce44699" containerName="ceilometer-central-agent" Mar 20 15:46:22 crc kubenswrapper[4779]: E0320 15:46:22.395549 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1733660-4123-48e3-a604-7bd5cce44699" containerName="ceilometer-notification-agent" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.395555 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1733660-4123-48e3-a604-7bd5cce44699" containerName="ceilometer-notification-agent" Mar 20 15:46:22 crc kubenswrapper[4779]: E0320 15:46:22.395566 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1733660-4123-48e3-a604-7bd5cce44699" containerName="sg-core" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.395572 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1733660-4123-48e3-a604-7bd5cce44699" containerName="sg-core" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.395733 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1733660-4123-48e3-a604-7bd5cce44699" containerName="ceilometer-central-agent" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.395763 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1733660-4123-48e3-a604-7bd5cce44699" containerName="ceilometer-notification-agent" Mar 20 
15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.395773 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1733660-4123-48e3-a604-7bd5cce44699" containerName="sg-core" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.395782 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1733660-4123-48e3-a604-7bd5cce44699" containerName="proxy-httpd" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.398817 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.403710 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.404030 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.419577 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.607934 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/117f36dc-cf3e-402d-9303-859b0ff0ac93-log-httpd\") pod \"ceilometer-0\" (UID: \"117f36dc-cf3e-402d-9303-859b0ff0ac93\") " pod="openstack/ceilometer-0" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.607996 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2cs9\" (UniqueName: \"kubernetes.io/projected/117f36dc-cf3e-402d-9303-859b0ff0ac93-kube-api-access-t2cs9\") pod \"ceilometer-0\" (UID: \"117f36dc-cf3e-402d-9303-859b0ff0ac93\") " pod="openstack/ceilometer-0" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.608024 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/117f36dc-cf3e-402d-9303-859b0ff0ac93-scripts\") pod \"ceilometer-0\" (UID: \"117f36dc-cf3e-402d-9303-859b0ff0ac93\") " pod="openstack/ceilometer-0" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.608054 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/117f36dc-cf3e-402d-9303-859b0ff0ac93-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"117f36dc-cf3e-402d-9303-859b0ff0ac93\") " pod="openstack/ceilometer-0" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.608166 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/117f36dc-cf3e-402d-9303-859b0ff0ac93-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"117f36dc-cf3e-402d-9303-859b0ff0ac93\") " pod="openstack/ceilometer-0" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.608228 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/117f36dc-cf3e-402d-9303-859b0ff0ac93-config-data\") pod \"ceilometer-0\" (UID: \"117f36dc-cf3e-402d-9303-859b0ff0ac93\") " pod="openstack/ceilometer-0" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.608257 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/117f36dc-cf3e-402d-9303-859b0ff0ac93-run-httpd\") pod \"ceilometer-0\" (UID: \"117f36dc-cf3e-402d-9303-859b0ff0ac93\") " pod="openstack/ceilometer-0" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.709994 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/117f36dc-cf3e-402d-9303-859b0ff0ac93-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"117f36dc-cf3e-402d-9303-859b0ff0ac93\") " 
pod="openstack/ceilometer-0" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.710872 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/117f36dc-cf3e-402d-9303-859b0ff0ac93-config-data\") pod \"ceilometer-0\" (UID: \"117f36dc-cf3e-402d-9303-859b0ff0ac93\") " pod="openstack/ceilometer-0" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.711021 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/117f36dc-cf3e-402d-9303-859b0ff0ac93-run-httpd\") pod \"ceilometer-0\" (UID: \"117f36dc-cf3e-402d-9303-859b0ff0ac93\") " pod="openstack/ceilometer-0" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.711165 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/117f36dc-cf3e-402d-9303-859b0ff0ac93-log-httpd\") pod \"ceilometer-0\" (UID: \"117f36dc-cf3e-402d-9303-859b0ff0ac93\") " pod="openstack/ceilometer-0" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.711249 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2cs9\" (UniqueName: \"kubernetes.io/projected/117f36dc-cf3e-402d-9303-859b0ff0ac93-kube-api-access-t2cs9\") pod \"ceilometer-0\" (UID: \"117f36dc-cf3e-402d-9303-859b0ff0ac93\") " pod="openstack/ceilometer-0" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.711329 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/117f36dc-cf3e-402d-9303-859b0ff0ac93-scripts\") pod \"ceilometer-0\" (UID: \"117f36dc-cf3e-402d-9303-859b0ff0ac93\") " pod="openstack/ceilometer-0" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.711406 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/117f36dc-cf3e-402d-9303-859b0ff0ac93-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"117f36dc-cf3e-402d-9303-859b0ff0ac93\") " pod="openstack/ceilometer-0" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.711584 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/117f36dc-cf3e-402d-9303-859b0ff0ac93-log-httpd\") pod \"ceilometer-0\" (UID: \"117f36dc-cf3e-402d-9303-859b0ff0ac93\") " pod="openstack/ceilometer-0" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.711822 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/117f36dc-cf3e-402d-9303-859b0ff0ac93-run-httpd\") pod \"ceilometer-0\" (UID: \"117f36dc-cf3e-402d-9303-859b0ff0ac93\") " pod="openstack/ceilometer-0" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.714549 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/117f36dc-cf3e-402d-9303-859b0ff0ac93-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"117f36dc-cf3e-402d-9303-859b0ff0ac93\") " pod="openstack/ceilometer-0" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.715041 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/117f36dc-cf3e-402d-9303-859b0ff0ac93-config-data\") pod \"ceilometer-0\" (UID: \"117f36dc-cf3e-402d-9303-859b0ff0ac93\") " pod="openstack/ceilometer-0" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.715316 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/117f36dc-cf3e-402d-9303-859b0ff0ac93-scripts\") pod \"ceilometer-0\" (UID: \"117f36dc-cf3e-402d-9303-859b0ff0ac93\") " pod="openstack/ceilometer-0" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.716813 4779 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/117f36dc-cf3e-402d-9303-859b0ff0ac93-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"117f36dc-cf3e-402d-9303-859b0ff0ac93\") " pod="openstack/ceilometer-0" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.729225 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2cs9\" (UniqueName: \"kubernetes.io/projected/117f36dc-cf3e-402d-9303-859b0ff0ac93-kube-api-access-t2cs9\") pod \"ceilometer-0\" (UID: \"117f36dc-cf3e-402d-9303-859b0ff0ac93\") " pod="openstack/ceilometer-0" Mar 20 15:46:22 crc kubenswrapper[4779]: I0320 15:46:22.797526 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:46:23 crc kubenswrapper[4779]: I0320 15:46:23.249286 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:46:23 crc kubenswrapper[4779]: I0320 15:46:23.292755 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5f8794546c-r7zls" event={"ID":"ad4b673b-e65e-49d0-92bd-138da686c2eb","Type":"ContainerStarted","Data":"a4349723c3236afa9f86b5893d0fe4c5f43ba3dcc4fd087d973c286e8c6d5117"} Mar 20 15:46:23 crc kubenswrapper[4779]: I0320 15:46:23.292793 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5f8794546c-r7zls" event={"ID":"ad4b673b-e65e-49d0-92bd-138da686c2eb","Type":"ContainerStarted","Data":"1535d8c7079cff82790196b2e80df6fc21c4f435339754c5bbb5994606be9744"} Mar 20 15:46:23 crc kubenswrapper[4779]: W0320 15:46:23.562000 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod117f36dc_cf3e_402d_9303_859b0ff0ac93.slice/crio-65eefaf77c71611b2354a81e15f7a61f8cc9621fd801627cbcb6c84a4b7a4053 WatchSource:0}: Error finding container 65eefaf77c71611b2354a81e15f7a61f8cc9621fd801627cbcb6c84a4b7a4053: Status 404 returned error can't 
find the container with id 65eefaf77c71611b2354a81e15f7a61f8cc9621fd801627cbcb6c84a4b7a4053 Mar 20 15:46:23 crc kubenswrapper[4779]: I0320 15:46:23.821342 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1733660-4123-48e3-a604-7bd5cce44699" path="/var/lib/kubelet/pods/e1733660-4123-48e3-a604-7bd5cce44699/volumes" Mar 20 15:46:24 crc kubenswrapper[4779]: I0320 15:46:24.049917 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:46:24 crc kubenswrapper[4779]: I0320 15:46:24.301895 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5f8794546c-r7zls" event={"ID":"ad4b673b-e65e-49d0-92bd-138da686c2eb","Type":"ContainerStarted","Data":"24defdd5b78a8b9968d61c06510dae866dafcac11907eaafc1ef6da78f85d4c3"} Mar 20 15:46:24 crc kubenswrapper[4779]: I0320 15:46:24.303061 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5f8794546c-r7zls" Mar 20 15:46:24 crc kubenswrapper[4779]: I0320 15:46:24.303090 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5f8794546c-r7zls" Mar 20 15:46:24 crc kubenswrapper[4779]: I0320 15:46:24.304566 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"117f36dc-cf3e-402d-9303-859b0ff0ac93","Type":"ContainerStarted","Data":"5781851f820eff5756b56ad36ca6907c6b4410a90c48cff9fbc4e0e2b3c96d1d"} Mar 20 15:46:24 crc kubenswrapper[4779]: I0320 15:46:24.304593 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"117f36dc-cf3e-402d-9303-859b0ff0ac93","Type":"ContainerStarted","Data":"65eefaf77c71611b2354a81e15f7a61f8cc9621fd801627cbcb6c84a4b7a4053"} Mar 20 15:46:24 crc kubenswrapper[4779]: I0320 15:46:24.324474 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5f8794546c-r7zls" podStartSLOduration=6.324450301 
podStartE2EDuration="6.324450301s" podCreationTimestamp="2026-03-20 15:46:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:46:24.319787456 +0000 UTC m=+1401.282303276" watchObservedRunningTime="2026-03-20 15:46:24.324450301 +0000 UTC m=+1401.286966101" Mar 20 15:46:24 crc kubenswrapper[4779]: I0320 15:46:24.991457 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7f86b78896-7tfsm" podUID="f6de60e1-9be1-4bb6-9d26-0b496117ed20" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.161:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.161:8443: connect: connection refused" Mar 20 15:46:25 crc kubenswrapper[4779]: I0320 15:46:25.149675 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:46:25 crc kubenswrapper[4779]: I0320 15:46:25.149743 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:46:25 crc kubenswrapper[4779]: I0320 15:46:25.316363 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"117f36dc-cf3e-402d-9303-859b0ff0ac93","Type":"ContainerStarted","Data":"bd79185c80630947cf0d5f234d506e0159841409fcb8c7ba6b0e54e6e59358c5"} Mar 20 15:46:26 crc kubenswrapper[4779]: I0320 15:46:26.328132 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"117f36dc-cf3e-402d-9303-859b0ff0ac93","Type":"ContainerStarted","Data":"ddfb5ca2d9e911e06097f8d72a765077634e294cf351bacd1a8956a1050c5bd3"} Mar 20 15:46:28 crc kubenswrapper[4779]: I0320 15:46:28.352018 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"117f36dc-cf3e-402d-9303-859b0ff0ac93","Type":"ContainerStarted","Data":"d8e50ba182429bb63ae92b4e21d3c4889a09e7585ce22dc8086955b94392329e"} Mar 20 15:46:28 crc kubenswrapper[4779]: I0320 15:46:28.352222 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="117f36dc-cf3e-402d-9303-859b0ff0ac93" containerName="ceilometer-central-agent" containerID="cri-o://5781851f820eff5756b56ad36ca6907c6b4410a90c48cff9fbc4e0e2b3c96d1d" gracePeriod=30 Mar 20 15:46:28 crc kubenswrapper[4779]: I0320 15:46:28.352328 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="117f36dc-cf3e-402d-9303-859b0ff0ac93" containerName="ceilometer-notification-agent" containerID="cri-o://bd79185c80630947cf0d5f234d506e0159841409fcb8c7ba6b0e54e6e59358c5" gracePeriod=30 Mar 20 15:46:28 crc kubenswrapper[4779]: I0320 15:46:28.352282 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="117f36dc-cf3e-402d-9303-859b0ff0ac93" containerName="sg-core" containerID="cri-o://ddfb5ca2d9e911e06097f8d72a765077634e294cf351bacd1a8956a1050c5bd3" gracePeriod=30 Mar 20 15:46:28 crc kubenswrapper[4779]: I0320 15:46:28.352326 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="117f36dc-cf3e-402d-9303-859b0ff0ac93" containerName="proxy-httpd" containerID="cri-o://d8e50ba182429bb63ae92b4e21d3c4889a09e7585ce22dc8086955b94392329e" gracePeriod=30 Mar 20 15:46:28 crc kubenswrapper[4779]: I0320 15:46:28.352523 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ceilometer-0" Mar 20 15:46:28 crc kubenswrapper[4779]: I0320 15:46:28.390533 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.540719455 podStartE2EDuration="6.390513402s" podCreationTimestamp="2026-03-20 15:46:22 +0000 UTC" firstStartedPulling="2026-03-20 15:46:23.565234011 +0000 UTC m=+1400.527749811" lastFinishedPulling="2026-03-20 15:46:27.415027958 +0000 UTC m=+1404.377543758" observedRunningTime="2026-03-20 15:46:28.383892989 +0000 UTC m=+1405.346408799" watchObservedRunningTime="2026-03-20 15:46:28.390513402 +0000 UTC m=+1405.353029202" Mar 20 15:46:29 crc kubenswrapper[4779]: I0320 15:46:29.102061 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5f8794546c-r7zls" Mar 20 15:46:29 crc kubenswrapper[4779]: I0320 15:46:29.103680 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5f8794546c-r7zls" Mar 20 15:46:29 crc kubenswrapper[4779]: I0320 15:46:29.388360 4779 generic.go:334] "Generic (PLEG): container finished" podID="117f36dc-cf3e-402d-9303-859b0ff0ac93" containerID="d8e50ba182429bb63ae92b4e21d3c4889a09e7585ce22dc8086955b94392329e" exitCode=0 Mar 20 15:46:29 crc kubenswrapper[4779]: I0320 15:46:29.388389 4779 generic.go:334] "Generic (PLEG): container finished" podID="117f36dc-cf3e-402d-9303-859b0ff0ac93" containerID="ddfb5ca2d9e911e06097f8d72a765077634e294cf351bacd1a8956a1050c5bd3" exitCode=2 Mar 20 15:46:29 crc kubenswrapper[4779]: I0320 15:46:29.389300 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"117f36dc-cf3e-402d-9303-859b0ff0ac93","Type":"ContainerDied","Data":"d8e50ba182429bb63ae92b4e21d3c4889a09e7585ce22dc8086955b94392329e"} Mar 20 15:46:29 crc kubenswrapper[4779]: I0320 15:46:29.389325 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"117f36dc-cf3e-402d-9303-859b0ff0ac93","Type":"ContainerDied","Data":"ddfb5ca2d9e911e06097f8d72a765077634e294cf351bacd1a8956a1050c5bd3"} Mar 20 15:46:30 crc kubenswrapper[4779]: I0320 15:46:30.398697 4779 generic.go:334] "Generic (PLEG): container finished" podID="117f36dc-cf3e-402d-9303-859b0ff0ac93" containerID="bd79185c80630947cf0d5f234d506e0159841409fcb8c7ba6b0e54e6e59358c5" exitCode=0 Mar 20 15:46:30 crc kubenswrapper[4779]: I0320 15:46:30.398761 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"117f36dc-cf3e-402d-9303-859b0ff0ac93","Type":"ContainerDied","Data":"bd79185c80630947cf0d5f234d506e0159841409fcb8c7ba6b0e54e6e59358c5"} Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.412279 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.417372 4779 generic.go:334] "Generic (PLEG): container finished" podID="117f36dc-cf3e-402d-9303-859b0ff0ac93" containerID="5781851f820eff5756b56ad36ca6907c6b4410a90c48cff9fbc4e0e2b3c96d1d" exitCode=0 Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.417416 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"117f36dc-cf3e-402d-9303-859b0ff0ac93","Type":"ContainerDied","Data":"5781851f820eff5756b56ad36ca6907c6b4410a90c48cff9fbc4e0e2b3c96d1d"} Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.417463 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.417506 4779 scope.go:117] "RemoveContainer" containerID="d8e50ba182429bb63ae92b4e21d3c4889a09e7585ce22dc8086955b94392329e" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.417476 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"117f36dc-cf3e-402d-9303-859b0ff0ac93","Type":"ContainerDied","Data":"65eefaf77c71611b2354a81e15f7a61f8cc9621fd801627cbcb6c84a4b7a4053"} Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.435452 4779 scope.go:117] "RemoveContainer" containerID="ddfb5ca2d9e911e06097f8d72a765077634e294cf351bacd1a8956a1050c5bd3" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.457665 4779 scope.go:117] "RemoveContainer" containerID="bd79185c80630947cf0d5f234d506e0159841409fcb8c7ba6b0e54e6e59358c5" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.479601 4779 scope.go:117] "RemoveContainer" containerID="5781851f820eff5756b56ad36ca6907c6b4410a90c48cff9fbc4e0e2b3c96d1d" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.485808 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/117f36dc-cf3e-402d-9303-859b0ff0ac93-combined-ca-bundle\") pod \"117f36dc-cf3e-402d-9303-859b0ff0ac93\" (UID: \"117f36dc-cf3e-402d-9303-859b0ff0ac93\") " Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.485914 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/117f36dc-cf3e-402d-9303-859b0ff0ac93-sg-core-conf-yaml\") pod \"117f36dc-cf3e-402d-9303-859b0ff0ac93\" (UID: \"117f36dc-cf3e-402d-9303-859b0ff0ac93\") " Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.486061 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/117f36dc-cf3e-402d-9303-859b0ff0ac93-run-httpd\") pod \"117f36dc-cf3e-402d-9303-859b0ff0ac93\" (UID: \"117f36dc-cf3e-402d-9303-859b0ff0ac93\") " Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.486091 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/117f36dc-cf3e-402d-9303-859b0ff0ac93-scripts\") pod \"117f36dc-cf3e-402d-9303-859b0ff0ac93\" (UID: \"117f36dc-cf3e-402d-9303-859b0ff0ac93\") " Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.486148 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2cs9\" (UniqueName: \"kubernetes.io/projected/117f36dc-cf3e-402d-9303-859b0ff0ac93-kube-api-access-t2cs9\") pod \"117f36dc-cf3e-402d-9303-859b0ff0ac93\" (UID: \"117f36dc-cf3e-402d-9303-859b0ff0ac93\") " Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.486168 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/117f36dc-cf3e-402d-9303-859b0ff0ac93-config-data\") pod \"117f36dc-cf3e-402d-9303-859b0ff0ac93\" (UID: \"117f36dc-cf3e-402d-9303-859b0ff0ac93\") " Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.486221 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/117f36dc-cf3e-402d-9303-859b0ff0ac93-log-httpd\") pod \"117f36dc-cf3e-402d-9303-859b0ff0ac93\" (UID: \"117f36dc-cf3e-402d-9303-859b0ff0ac93\") " Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.486887 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/117f36dc-cf3e-402d-9303-859b0ff0ac93-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "117f36dc-cf3e-402d-9303-859b0ff0ac93" (UID: "117f36dc-cf3e-402d-9303-859b0ff0ac93"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.487347 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/117f36dc-cf3e-402d-9303-859b0ff0ac93-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "117f36dc-cf3e-402d-9303-859b0ff0ac93" (UID: "117f36dc-cf3e-402d-9303-859b0ff0ac93"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.495529 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/117f36dc-cf3e-402d-9303-859b0ff0ac93-kube-api-access-t2cs9" (OuterVolumeSpecName: "kube-api-access-t2cs9") pod "117f36dc-cf3e-402d-9303-859b0ff0ac93" (UID: "117f36dc-cf3e-402d-9303-859b0ff0ac93"). InnerVolumeSpecName "kube-api-access-t2cs9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.500054 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/117f36dc-cf3e-402d-9303-859b0ff0ac93-scripts" (OuterVolumeSpecName: "scripts") pod "117f36dc-cf3e-402d-9303-859b0ff0ac93" (UID: "117f36dc-cf3e-402d-9303-859b0ff0ac93"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.506738 4779 scope.go:117] "RemoveContainer" containerID="d8e50ba182429bb63ae92b4e21d3c4889a09e7585ce22dc8086955b94392329e" Mar 20 15:46:32 crc kubenswrapper[4779]: E0320 15:46:32.512709 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8e50ba182429bb63ae92b4e21d3c4889a09e7585ce22dc8086955b94392329e\": container with ID starting with d8e50ba182429bb63ae92b4e21d3c4889a09e7585ce22dc8086955b94392329e not found: ID does not exist" containerID="d8e50ba182429bb63ae92b4e21d3c4889a09e7585ce22dc8086955b94392329e" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.512762 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8e50ba182429bb63ae92b4e21d3c4889a09e7585ce22dc8086955b94392329e"} err="failed to get container status \"d8e50ba182429bb63ae92b4e21d3c4889a09e7585ce22dc8086955b94392329e\": rpc error: code = NotFound desc = could not find container \"d8e50ba182429bb63ae92b4e21d3c4889a09e7585ce22dc8086955b94392329e\": container with ID starting with d8e50ba182429bb63ae92b4e21d3c4889a09e7585ce22dc8086955b94392329e not found: ID does not exist" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.512794 4779 scope.go:117] "RemoveContainer" containerID="ddfb5ca2d9e911e06097f8d72a765077634e294cf351bacd1a8956a1050c5bd3" Mar 20 15:46:32 crc kubenswrapper[4779]: E0320 15:46:32.515373 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddfb5ca2d9e911e06097f8d72a765077634e294cf351bacd1a8956a1050c5bd3\": container with ID starting with ddfb5ca2d9e911e06097f8d72a765077634e294cf351bacd1a8956a1050c5bd3 not found: ID does not exist" containerID="ddfb5ca2d9e911e06097f8d72a765077634e294cf351bacd1a8956a1050c5bd3" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.515412 
4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddfb5ca2d9e911e06097f8d72a765077634e294cf351bacd1a8956a1050c5bd3"} err="failed to get container status \"ddfb5ca2d9e911e06097f8d72a765077634e294cf351bacd1a8956a1050c5bd3\": rpc error: code = NotFound desc = could not find container \"ddfb5ca2d9e911e06097f8d72a765077634e294cf351bacd1a8956a1050c5bd3\": container with ID starting with ddfb5ca2d9e911e06097f8d72a765077634e294cf351bacd1a8956a1050c5bd3 not found: ID does not exist" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.515436 4779 scope.go:117] "RemoveContainer" containerID="bd79185c80630947cf0d5f234d506e0159841409fcb8c7ba6b0e54e6e59358c5" Mar 20 15:46:32 crc kubenswrapper[4779]: E0320 15:46:32.516913 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd79185c80630947cf0d5f234d506e0159841409fcb8c7ba6b0e54e6e59358c5\": container with ID starting with bd79185c80630947cf0d5f234d506e0159841409fcb8c7ba6b0e54e6e59358c5 not found: ID does not exist" containerID="bd79185c80630947cf0d5f234d506e0159841409fcb8c7ba6b0e54e6e59358c5" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.516948 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd79185c80630947cf0d5f234d506e0159841409fcb8c7ba6b0e54e6e59358c5"} err="failed to get container status \"bd79185c80630947cf0d5f234d506e0159841409fcb8c7ba6b0e54e6e59358c5\": rpc error: code = NotFound desc = could not find container \"bd79185c80630947cf0d5f234d506e0159841409fcb8c7ba6b0e54e6e59358c5\": container with ID starting with bd79185c80630947cf0d5f234d506e0159841409fcb8c7ba6b0e54e6e59358c5 not found: ID does not exist" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.516965 4779 scope.go:117] "RemoveContainer" containerID="5781851f820eff5756b56ad36ca6907c6b4410a90c48cff9fbc4e0e2b3c96d1d" Mar 20 15:46:32 crc kubenswrapper[4779]: E0320 
15:46:32.517591 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5781851f820eff5756b56ad36ca6907c6b4410a90c48cff9fbc4e0e2b3c96d1d\": container with ID starting with 5781851f820eff5756b56ad36ca6907c6b4410a90c48cff9fbc4e0e2b3c96d1d not found: ID does not exist" containerID="5781851f820eff5756b56ad36ca6907c6b4410a90c48cff9fbc4e0e2b3c96d1d" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.517637 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5781851f820eff5756b56ad36ca6907c6b4410a90c48cff9fbc4e0e2b3c96d1d"} err="failed to get container status \"5781851f820eff5756b56ad36ca6907c6b4410a90c48cff9fbc4e0e2b3c96d1d\": rpc error: code = NotFound desc = could not find container \"5781851f820eff5756b56ad36ca6907c6b4410a90c48cff9fbc4e0e2b3c96d1d\": container with ID starting with 5781851f820eff5756b56ad36ca6907c6b4410a90c48cff9fbc4e0e2b3c96d1d not found: ID does not exist" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.575220 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/117f36dc-cf3e-402d-9303-859b0ff0ac93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "117f36dc-cf3e-402d-9303-859b0ff0ac93" (UID: "117f36dc-cf3e-402d-9303-859b0ff0ac93"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.580257 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/117f36dc-cf3e-402d-9303-859b0ff0ac93-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "117f36dc-cf3e-402d-9303-859b0ff0ac93" (UID: "117f36dc-cf3e-402d-9303-859b0ff0ac93"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.588161 4779 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/117f36dc-cf3e-402d-9303-859b0ff0ac93-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.588190 4779 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/117f36dc-cf3e-402d-9303-859b0ff0ac93-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.588200 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2cs9\" (UniqueName: \"kubernetes.io/projected/117f36dc-cf3e-402d-9303-859b0ff0ac93-kube-api-access-t2cs9\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.588209 4779 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/117f36dc-cf3e-402d-9303-859b0ff0ac93-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.588219 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/117f36dc-cf3e-402d-9303-859b0ff0ac93-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.588228 4779 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/117f36dc-cf3e-402d-9303-859b0ff0ac93-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.606898 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/117f36dc-cf3e-402d-9303-859b0ff0ac93-config-data" (OuterVolumeSpecName: "config-data") pod "117f36dc-cf3e-402d-9303-859b0ff0ac93" (UID: "117f36dc-cf3e-402d-9303-859b0ff0ac93"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.689718 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/117f36dc-cf3e-402d-9303-859b0ff0ac93-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.753726 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.766781 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.786316 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:46:32 crc kubenswrapper[4779]: E0320 15:46:32.786762 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="117f36dc-cf3e-402d-9303-859b0ff0ac93" containerName="sg-core" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.786782 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="117f36dc-cf3e-402d-9303-859b0ff0ac93" containerName="sg-core" Mar 20 15:46:32 crc kubenswrapper[4779]: E0320 15:46:32.786803 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="117f36dc-cf3e-402d-9303-859b0ff0ac93" containerName="ceilometer-central-agent" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.786809 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="117f36dc-cf3e-402d-9303-859b0ff0ac93" containerName="ceilometer-central-agent" Mar 20 15:46:32 crc kubenswrapper[4779]: E0320 15:46:32.786826 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="117f36dc-cf3e-402d-9303-859b0ff0ac93" containerName="ceilometer-notification-agent" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.786832 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="117f36dc-cf3e-402d-9303-859b0ff0ac93" 
containerName="ceilometer-notification-agent" Mar 20 15:46:32 crc kubenswrapper[4779]: E0320 15:46:32.786845 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="117f36dc-cf3e-402d-9303-859b0ff0ac93" containerName="proxy-httpd" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.786852 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="117f36dc-cf3e-402d-9303-859b0ff0ac93" containerName="proxy-httpd" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.787019 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="117f36dc-cf3e-402d-9303-859b0ff0ac93" containerName="proxy-httpd" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.787033 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="117f36dc-cf3e-402d-9303-859b0ff0ac93" containerName="ceilometer-central-agent" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.787052 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="117f36dc-cf3e-402d-9303-859b0ff0ac93" containerName="ceilometer-notification-agent" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.787064 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="117f36dc-cf3e-402d-9303-859b0ff0ac93" containerName="sg-core" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.788715 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.791894 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.794956 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.817619 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.892756 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8phnt\" (UniqueName: \"kubernetes.io/projected/e0b1db76-6b60-4826-a1df-324e9a87d0d8-kube-api-access-8phnt\") pod \"ceilometer-0\" (UID: \"e0b1db76-6b60-4826-a1df-324e9a87d0d8\") " pod="openstack/ceilometer-0" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.892999 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0b1db76-6b60-4826-a1df-324e9a87d0d8-log-httpd\") pod \"ceilometer-0\" (UID: \"e0b1db76-6b60-4826-a1df-324e9a87d0d8\") " pod="openstack/ceilometer-0" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.893204 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0b1db76-6b60-4826-a1df-324e9a87d0d8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e0b1db76-6b60-4826-a1df-324e9a87d0d8\") " pod="openstack/ceilometer-0" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.893285 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0b1db76-6b60-4826-a1df-324e9a87d0d8-config-data\") pod \"ceilometer-0\" (UID: 
\"e0b1db76-6b60-4826-a1df-324e9a87d0d8\") " pod="openstack/ceilometer-0" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.893329 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e0b1db76-6b60-4826-a1df-324e9a87d0d8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e0b1db76-6b60-4826-a1df-324e9a87d0d8\") " pod="openstack/ceilometer-0" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.893344 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0b1db76-6b60-4826-a1df-324e9a87d0d8-run-httpd\") pod \"ceilometer-0\" (UID: \"e0b1db76-6b60-4826-a1df-324e9a87d0d8\") " pod="openstack/ceilometer-0" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.893380 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0b1db76-6b60-4826-a1df-324e9a87d0d8-scripts\") pod \"ceilometer-0\" (UID: \"e0b1db76-6b60-4826-a1df-324e9a87d0d8\") " pod="openstack/ceilometer-0" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.995672 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0b1db76-6b60-4826-a1df-324e9a87d0d8-scripts\") pod \"ceilometer-0\" (UID: \"e0b1db76-6b60-4826-a1df-324e9a87d0d8\") " pod="openstack/ceilometer-0" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.995796 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8phnt\" (UniqueName: \"kubernetes.io/projected/e0b1db76-6b60-4826-a1df-324e9a87d0d8-kube-api-access-8phnt\") pod \"ceilometer-0\" (UID: \"e0b1db76-6b60-4826-a1df-324e9a87d0d8\") " pod="openstack/ceilometer-0" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.995887 4779 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0b1db76-6b60-4826-a1df-324e9a87d0d8-log-httpd\") pod \"ceilometer-0\" (UID: \"e0b1db76-6b60-4826-a1df-324e9a87d0d8\") " pod="openstack/ceilometer-0" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.995947 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0b1db76-6b60-4826-a1df-324e9a87d0d8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e0b1db76-6b60-4826-a1df-324e9a87d0d8\") " pod="openstack/ceilometer-0" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.995992 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0b1db76-6b60-4826-a1df-324e9a87d0d8-config-data\") pod \"ceilometer-0\" (UID: \"e0b1db76-6b60-4826-a1df-324e9a87d0d8\") " pod="openstack/ceilometer-0" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.996037 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e0b1db76-6b60-4826-a1df-324e9a87d0d8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e0b1db76-6b60-4826-a1df-324e9a87d0d8\") " pod="openstack/ceilometer-0" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.996057 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0b1db76-6b60-4826-a1df-324e9a87d0d8-run-httpd\") pod \"ceilometer-0\" (UID: \"e0b1db76-6b60-4826-a1df-324e9a87d0d8\") " pod="openstack/ceilometer-0" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.996479 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0b1db76-6b60-4826-a1df-324e9a87d0d8-run-httpd\") pod \"ceilometer-0\" (UID: \"e0b1db76-6b60-4826-a1df-324e9a87d0d8\") " 
pod="openstack/ceilometer-0" Mar 20 15:46:32 crc kubenswrapper[4779]: I0320 15:46:32.996943 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0b1db76-6b60-4826-a1df-324e9a87d0d8-log-httpd\") pod \"ceilometer-0\" (UID: \"e0b1db76-6b60-4826-a1df-324e9a87d0d8\") " pod="openstack/ceilometer-0" Mar 20 15:46:33 crc kubenswrapper[4779]: I0320 15:46:33.001080 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0b1db76-6b60-4826-a1df-324e9a87d0d8-scripts\") pod \"ceilometer-0\" (UID: \"e0b1db76-6b60-4826-a1df-324e9a87d0d8\") " pod="openstack/ceilometer-0" Mar 20 15:46:33 crc kubenswrapper[4779]: I0320 15:46:33.001533 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0b1db76-6b60-4826-a1df-324e9a87d0d8-config-data\") pod \"ceilometer-0\" (UID: \"e0b1db76-6b60-4826-a1df-324e9a87d0d8\") " pod="openstack/ceilometer-0" Mar 20 15:46:33 crc kubenswrapper[4779]: I0320 15:46:33.001717 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0b1db76-6b60-4826-a1df-324e9a87d0d8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e0b1db76-6b60-4826-a1df-324e9a87d0d8\") " pod="openstack/ceilometer-0" Mar 20 15:46:33 crc kubenswrapper[4779]: I0320 15:46:33.008774 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e0b1db76-6b60-4826-a1df-324e9a87d0d8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e0b1db76-6b60-4826-a1df-324e9a87d0d8\") " pod="openstack/ceilometer-0" Mar 20 15:46:33 crc kubenswrapper[4779]: I0320 15:46:33.014212 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8phnt\" (UniqueName: 
\"kubernetes.io/projected/e0b1db76-6b60-4826-a1df-324e9a87d0d8-kube-api-access-8phnt\") pod \"ceilometer-0\" (UID: \"e0b1db76-6b60-4826-a1df-324e9a87d0d8\") " pod="openstack/ceilometer-0" Mar 20 15:46:33 crc kubenswrapper[4779]: I0320 15:46:33.062804 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:46:33 crc kubenswrapper[4779]: I0320 15:46:33.063662 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:46:33 crc kubenswrapper[4779]: I0320 15:46:33.433431 4779 generic.go:334] "Generic (PLEG): container finished" podID="f6de60e1-9be1-4bb6-9d26-0b496117ed20" containerID="d7fe244e02765ceb3be959118ca1a70a4ac93f217d62f041833169123707116a" exitCode=137 Mar 20 15:46:33 crc kubenswrapper[4779]: I0320 15:46:33.433716 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f86b78896-7tfsm" event={"ID":"f6de60e1-9be1-4bb6-9d26-0b496117ed20","Type":"ContainerDied","Data":"d7fe244e02765ceb3be959118ca1a70a4ac93f217d62f041833169123707116a"} Mar 20 15:46:33 crc kubenswrapper[4779]: I0320 15:46:33.588069 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:46:33 crc kubenswrapper[4779]: I0320 15:46:33.849279 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="117f36dc-cf3e-402d-9303-859b0ff0ac93" path="/var/lib/kubelet/pods/117f36dc-cf3e-402d-9303-859b0ff0ac93/volumes" Mar 20 15:46:33 crc kubenswrapper[4779]: I0320 15:46:33.872370 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7f86b78896-7tfsm" Mar 20 15:46:33 crc kubenswrapper[4779]: I0320 15:46:33.952200 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6de60e1-9be1-4bb6-9d26-0b496117ed20-combined-ca-bundle\") pod \"f6de60e1-9be1-4bb6-9d26-0b496117ed20\" (UID: \"f6de60e1-9be1-4bb6-9d26-0b496117ed20\") " Mar 20 15:46:33 crc kubenswrapper[4779]: I0320 15:46:33.952261 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f6de60e1-9be1-4bb6-9d26-0b496117ed20-horizon-secret-key\") pod \"f6de60e1-9be1-4bb6-9d26-0b496117ed20\" (UID: \"f6de60e1-9be1-4bb6-9d26-0b496117ed20\") " Mar 20 15:46:33 crc kubenswrapper[4779]: I0320 15:46:33.952321 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6de60e1-9be1-4bb6-9d26-0b496117ed20-config-data\") pod \"f6de60e1-9be1-4bb6-9d26-0b496117ed20\" (UID: \"f6de60e1-9be1-4bb6-9d26-0b496117ed20\") " Mar 20 15:46:33 crc kubenswrapper[4779]: I0320 15:46:33.952488 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6de60e1-9be1-4bb6-9d26-0b496117ed20-scripts\") pod \"f6de60e1-9be1-4bb6-9d26-0b496117ed20\" (UID: \"f6de60e1-9be1-4bb6-9d26-0b496117ed20\") " Mar 20 15:46:33 crc kubenswrapper[4779]: I0320 15:46:33.952523 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6de60e1-9be1-4bb6-9d26-0b496117ed20-horizon-tls-certs\") pod \"f6de60e1-9be1-4bb6-9d26-0b496117ed20\" (UID: \"f6de60e1-9be1-4bb6-9d26-0b496117ed20\") " Mar 20 15:46:33 crc kubenswrapper[4779]: I0320 15:46:33.952550 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f6de60e1-9be1-4bb6-9d26-0b496117ed20-logs\") pod \"f6de60e1-9be1-4bb6-9d26-0b496117ed20\" (UID: \"f6de60e1-9be1-4bb6-9d26-0b496117ed20\") " Mar 20 15:46:33 crc kubenswrapper[4779]: I0320 15:46:33.952631 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bt5mm\" (UniqueName: \"kubernetes.io/projected/f6de60e1-9be1-4bb6-9d26-0b496117ed20-kube-api-access-bt5mm\") pod \"f6de60e1-9be1-4bb6-9d26-0b496117ed20\" (UID: \"f6de60e1-9be1-4bb6-9d26-0b496117ed20\") " Mar 20 15:46:33 crc kubenswrapper[4779]: I0320 15:46:33.954498 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6de60e1-9be1-4bb6-9d26-0b496117ed20-logs" (OuterVolumeSpecName: "logs") pod "f6de60e1-9be1-4bb6-9d26-0b496117ed20" (UID: "f6de60e1-9be1-4bb6-9d26-0b496117ed20"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:46:33 crc kubenswrapper[4779]: I0320 15:46:33.959593 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6de60e1-9be1-4bb6-9d26-0b496117ed20-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f6de60e1-9be1-4bb6-9d26-0b496117ed20" (UID: "f6de60e1-9be1-4bb6-9d26-0b496117ed20"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:33 crc kubenswrapper[4779]: I0320 15:46:33.959600 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6de60e1-9be1-4bb6-9d26-0b496117ed20-kube-api-access-bt5mm" (OuterVolumeSpecName: "kube-api-access-bt5mm") pod "f6de60e1-9be1-4bb6-9d26-0b496117ed20" (UID: "f6de60e1-9be1-4bb6-9d26-0b496117ed20"). InnerVolumeSpecName "kube-api-access-bt5mm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:46:33 crc kubenswrapper[4779]: I0320 15:46:33.984763 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6de60e1-9be1-4bb6-9d26-0b496117ed20-scripts" (OuterVolumeSpecName: "scripts") pod "f6de60e1-9be1-4bb6-9d26-0b496117ed20" (UID: "f6de60e1-9be1-4bb6-9d26-0b496117ed20"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:46:33 crc kubenswrapper[4779]: I0320 15:46:33.991061 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6de60e1-9be1-4bb6-9d26-0b496117ed20-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f6de60e1-9be1-4bb6-9d26-0b496117ed20" (UID: "f6de60e1-9be1-4bb6-9d26-0b496117ed20"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:33 crc kubenswrapper[4779]: I0320 15:46:33.993270 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6de60e1-9be1-4bb6-9d26-0b496117ed20-config-data" (OuterVolumeSpecName: "config-data") pod "f6de60e1-9be1-4bb6-9d26-0b496117ed20" (UID: "f6de60e1-9be1-4bb6-9d26-0b496117ed20"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:46:34 crc kubenswrapper[4779]: I0320 15:46:34.014302 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6de60e1-9be1-4bb6-9d26-0b496117ed20-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "f6de60e1-9be1-4bb6-9d26-0b496117ed20" (UID: "f6de60e1-9be1-4bb6-9d26-0b496117ed20"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:34 crc kubenswrapper[4779]: I0320 15:46:34.055638 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6de60e1-9be1-4bb6-9d26-0b496117ed20-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:34 crc kubenswrapper[4779]: I0320 15:46:34.055668 4779 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6de60e1-9be1-4bb6-9d26-0b496117ed20-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:34 crc kubenswrapper[4779]: I0320 15:46:34.055678 4779 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6de60e1-9be1-4bb6-9d26-0b496117ed20-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:34 crc kubenswrapper[4779]: I0320 15:46:34.055688 4779 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6de60e1-9be1-4bb6-9d26-0b496117ed20-logs\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:34 crc kubenswrapper[4779]: I0320 15:46:34.055698 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bt5mm\" (UniqueName: \"kubernetes.io/projected/f6de60e1-9be1-4bb6-9d26-0b496117ed20-kube-api-access-bt5mm\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:34 crc kubenswrapper[4779]: I0320 15:46:34.055709 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6de60e1-9be1-4bb6-9d26-0b496117ed20-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:34 crc kubenswrapper[4779]: I0320 15:46:34.055717 4779 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f6de60e1-9be1-4bb6-9d26-0b496117ed20-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:34 crc kubenswrapper[4779]: I0320 15:46:34.342503 4779 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Mar 20 15:46:34 crc kubenswrapper[4779]: I0320 15:46:34.342983 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="8b98145a-1310-4f32-9b9e-5824c7b793ef" containerName="watcher-decision-engine" containerID="cri-o://8420c552117767a297c8d519ea58d5d33c8eb963473ce4d1642e1f34a8abd493" gracePeriod=30 Mar 20 15:46:34 crc kubenswrapper[4779]: I0320 15:46:34.446185 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7f86b78896-7tfsm" Mar 20 15:46:34 crc kubenswrapper[4779]: I0320 15:46:34.446170 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f86b78896-7tfsm" event={"ID":"f6de60e1-9be1-4bb6-9d26-0b496117ed20","Type":"ContainerDied","Data":"8fc84f413e5c90fa5207c13168c14ef390b461f2680891e04427e0d0de9c6b04"} Mar 20 15:46:34 crc kubenswrapper[4779]: I0320 15:46:34.446254 4779 scope.go:117] "RemoveContainer" containerID="7ff747c74330698ccefe1a731244c171b039473c2090a2067e21822cd9b07277" Mar 20 15:46:34 crc kubenswrapper[4779]: I0320 15:46:34.452418 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0b1db76-6b60-4826-a1df-324e9a87d0d8","Type":"ContainerStarted","Data":"71b78c74bfc7b9c74f537f9251d094d629483165e9c9b5b547eb018ea24281e0"} Mar 20 15:46:34 crc kubenswrapper[4779]: I0320 15:46:34.452458 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0b1db76-6b60-4826-a1df-324e9a87d0d8","Type":"ContainerStarted","Data":"10c3714a4c83446ae1d42e80234b957d5b918020e5cf81ec826b50d274e187e0"} Mar 20 15:46:34 crc kubenswrapper[4779]: I0320 15:46:34.485961 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7f86b78896-7tfsm"] Mar 20 15:46:34 crc kubenswrapper[4779]: I0320 15:46:34.498211 4779 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/horizon-7f86b78896-7tfsm"] Mar 20 15:46:34 crc kubenswrapper[4779]: I0320 15:46:34.604570 4779 scope.go:117] "RemoveContainer" containerID="d7fe244e02765ceb3be959118ca1a70a4ac93f217d62f041833169123707116a" Mar 20 15:46:34 crc kubenswrapper[4779]: I0320 15:46:34.967356 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-rx87f"] Mar 20 15:46:34 crc kubenswrapper[4779]: E0320 15:46:34.968057 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6de60e1-9be1-4bb6-9d26-0b496117ed20" containerName="horizon" Mar 20 15:46:34 crc kubenswrapper[4779]: I0320 15:46:34.968170 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6de60e1-9be1-4bb6-9d26-0b496117ed20" containerName="horizon" Mar 20 15:46:34 crc kubenswrapper[4779]: E0320 15:46:34.968299 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6de60e1-9be1-4bb6-9d26-0b496117ed20" containerName="horizon-log" Mar 20 15:46:34 crc kubenswrapper[4779]: I0320 15:46:34.968383 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6de60e1-9be1-4bb6-9d26-0b496117ed20" containerName="horizon-log" Mar 20 15:46:34 crc kubenswrapper[4779]: I0320 15:46:34.968726 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6de60e1-9be1-4bb6-9d26-0b496117ed20" containerName="horizon-log" Mar 20 15:46:34 crc kubenswrapper[4779]: I0320 15:46:34.968833 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6de60e1-9be1-4bb6-9d26-0b496117ed20" containerName="horizon" Mar 20 15:46:34 crc kubenswrapper[4779]: I0320 15:46:34.969856 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-rx87f" Mar 20 15:46:34 crc kubenswrapper[4779]: I0320 15:46:34.988344 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-rx87f"] Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.041618 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-22qhc"] Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.047188 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-22qhc" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.073973 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7d867d7-f83c-43e1-ac12-4ebd4814b2b7-operator-scripts\") pod \"nova-api-db-create-rx87f\" (UID: \"a7d867d7-f83c-43e1-ac12-4ebd4814b2b7\") " pod="openstack/nova-api-db-create-rx87f" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.074142 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsmkc\" (UniqueName: \"kubernetes.io/projected/a7d867d7-f83c-43e1-ac12-4ebd4814b2b7-kube-api-access-wsmkc\") pod \"nova-api-db-create-rx87f\" (UID: \"a7d867d7-f83c-43e1-ac12-4ebd4814b2b7\") " pod="openstack/nova-api-db-create-rx87f" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.075477 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0bdf-account-create-update-58t5l"] Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.076806 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0bdf-account-create-update-58t5l" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.081764 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.098278 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-22qhc"] Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.112633 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0bdf-account-create-update-58t5l"] Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.175697 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fcb9b6b-ec11-4073-aef1-2ff8cf21545f-operator-scripts\") pod \"nova-api-0bdf-account-create-update-58t5l\" (UID: \"6fcb9b6b-ec11-4073-aef1-2ff8cf21545f\") " pod="openstack/nova-api-0bdf-account-create-update-58t5l" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.175765 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7d867d7-f83c-43e1-ac12-4ebd4814b2b7-operator-scripts\") pod \"nova-api-db-create-rx87f\" (UID: \"a7d867d7-f83c-43e1-ac12-4ebd4814b2b7\") " pod="openstack/nova-api-db-create-rx87f" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.176042 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsmkc\" (UniqueName: \"kubernetes.io/projected/a7d867d7-f83c-43e1-ac12-4ebd4814b2b7-kube-api-access-wsmkc\") pod \"nova-api-db-create-rx87f\" (UID: \"a7d867d7-f83c-43e1-ac12-4ebd4814b2b7\") " pod="openstack/nova-api-db-create-rx87f" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.176516 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/0331a592-114a-4ab4-8c73-f01b650f50f4-operator-scripts\") pod \"nova-cell0-db-create-22qhc\" (UID: \"0331a592-114a-4ab4-8c73-f01b650f50f4\") " pod="openstack/nova-cell0-db-create-22qhc" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.176578 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsb8d\" (UniqueName: \"kubernetes.io/projected/0331a592-114a-4ab4-8c73-f01b650f50f4-kube-api-access-hsb8d\") pod \"nova-cell0-db-create-22qhc\" (UID: \"0331a592-114a-4ab4-8c73-f01b650f50f4\") " pod="openstack/nova-cell0-db-create-22qhc" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.176628 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlqhc\" (UniqueName: \"kubernetes.io/projected/6fcb9b6b-ec11-4073-aef1-2ff8cf21545f-kube-api-access-rlqhc\") pod \"nova-api-0bdf-account-create-update-58t5l\" (UID: \"6fcb9b6b-ec11-4073-aef1-2ff8cf21545f\") " pod="openstack/nova-api-0bdf-account-create-update-58t5l" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.176659 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7d867d7-f83c-43e1-ac12-4ebd4814b2b7-operator-scripts\") pod \"nova-api-db-create-rx87f\" (UID: \"a7d867d7-f83c-43e1-ac12-4ebd4814b2b7\") " pod="openstack/nova-api-db-create-rx87f" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.197827 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsmkc\" (UniqueName: \"kubernetes.io/projected/a7d867d7-f83c-43e1-ac12-4ebd4814b2b7-kube-api-access-wsmkc\") pod \"nova-api-db-create-rx87f\" (UID: \"a7d867d7-f83c-43e1-ac12-4ebd4814b2b7\") " pod="openstack/nova-api-db-create-rx87f" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.265192 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-rlrzh"] 
Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.267705 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-rlrzh" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.279367 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0331a592-114a-4ab4-8c73-f01b650f50f4-operator-scripts\") pod \"nova-cell0-db-create-22qhc\" (UID: \"0331a592-114a-4ab4-8c73-f01b650f50f4\") " pod="openstack/nova-cell0-db-create-22qhc" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.279440 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsb8d\" (UniqueName: \"kubernetes.io/projected/0331a592-114a-4ab4-8c73-f01b650f50f4-kube-api-access-hsb8d\") pod \"nova-cell0-db-create-22qhc\" (UID: \"0331a592-114a-4ab4-8c73-f01b650f50f4\") " pod="openstack/nova-cell0-db-create-22qhc" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.279480 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlqhc\" (UniqueName: \"kubernetes.io/projected/6fcb9b6b-ec11-4073-aef1-2ff8cf21545f-kube-api-access-rlqhc\") pod \"nova-api-0bdf-account-create-update-58t5l\" (UID: \"6fcb9b6b-ec11-4073-aef1-2ff8cf21545f\") " pod="openstack/nova-api-0bdf-account-create-update-58t5l" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.279547 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fcb9b6b-ec11-4073-aef1-2ff8cf21545f-operator-scripts\") pod \"nova-api-0bdf-account-create-update-58t5l\" (UID: \"6fcb9b6b-ec11-4073-aef1-2ff8cf21545f\") " pod="openstack/nova-api-0bdf-account-create-update-58t5l" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.280588 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/0331a592-114a-4ab4-8c73-f01b650f50f4-operator-scripts\") pod \"nova-cell0-db-create-22qhc\" (UID: \"0331a592-114a-4ab4-8c73-f01b650f50f4\") " pod="openstack/nova-cell0-db-create-22qhc" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.281077 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fcb9b6b-ec11-4073-aef1-2ff8cf21545f-operator-scripts\") pod \"nova-api-0bdf-account-create-update-58t5l\" (UID: \"6fcb9b6b-ec11-4073-aef1-2ff8cf21545f\") " pod="openstack/nova-api-0bdf-account-create-update-58t5l" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.284487 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-rlrzh"] Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.299376 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-016f-account-create-update-qpvhj"] Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.300664 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-016f-account-create-update-qpvhj" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.303321 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.303839 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsb8d\" (UniqueName: \"kubernetes.io/projected/0331a592-114a-4ab4-8c73-f01b650f50f4-kube-api-access-hsb8d\") pod \"nova-cell0-db-create-22qhc\" (UID: \"0331a592-114a-4ab4-8c73-f01b650f50f4\") " pod="openstack/nova-cell0-db-create-22qhc" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.305593 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlqhc\" (UniqueName: \"kubernetes.io/projected/6fcb9b6b-ec11-4073-aef1-2ff8cf21545f-kube-api-access-rlqhc\") pod \"nova-api-0bdf-account-create-update-58t5l\" (UID: \"6fcb9b6b-ec11-4073-aef1-2ff8cf21545f\") " pod="openstack/nova-api-0bdf-account-create-update-58t5l" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.322411 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-016f-account-create-update-qpvhj"] Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.381427 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95cc2ea8-9931-4f65-9691-71bf32261ebb-operator-scripts\") pod \"nova-cell0-016f-account-create-update-qpvhj\" (UID: \"95cc2ea8-9931-4f65-9691-71bf32261ebb\") " pod="openstack/nova-cell0-016f-account-create-update-qpvhj" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.381537 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d8ef970-d33e-4918-9b19-a0dc472f63f1-operator-scripts\") pod \"nova-cell1-db-create-rlrzh\" 
(UID: \"8d8ef970-d33e-4918-9b19-a0dc472f63f1\") " pod="openstack/nova-cell1-db-create-rlrzh" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.381596 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxq4p\" (UniqueName: \"kubernetes.io/projected/95cc2ea8-9931-4f65-9691-71bf32261ebb-kube-api-access-vxq4p\") pod \"nova-cell0-016f-account-create-update-qpvhj\" (UID: \"95cc2ea8-9931-4f65-9691-71bf32261ebb\") " pod="openstack/nova-cell0-016f-account-create-update-qpvhj" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.381624 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzkf4\" (UniqueName: \"kubernetes.io/projected/8d8ef970-d33e-4918-9b19-a0dc472f63f1-kube-api-access-fzkf4\") pod \"nova-cell1-db-create-rlrzh\" (UID: \"8d8ef970-d33e-4918-9b19-a0dc472f63f1\") " pod="openstack/nova-cell1-db-create-rlrzh" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.384722 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-rx87f" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.418866 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-22qhc" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.420645 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0bdf-account-create-update-58t5l" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.473135 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-d9cf-account-create-update-4rn7w"] Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.475212 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-d9cf-account-create-update-4rn7w" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.475096 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0b1db76-6b60-4826-a1df-324e9a87d0d8","Type":"ContainerStarted","Data":"320aa125cc142fa54183f268da15eab35d6c582a4a96d63815ae03c6351ce1ac"} Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.485021 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.494025 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d8ef970-d33e-4918-9b19-a0dc472f63f1-operator-scripts\") pod \"nova-cell1-db-create-rlrzh\" (UID: \"8d8ef970-d33e-4918-9b19-a0dc472f63f1\") " pod="openstack/nova-cell1-db-create-rlrzh" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.494295 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxq4p\" (UniqueName: \"kubernetes.io/projected/95cc2ea8-9931-4f65-9691-71bf32261ebb-kube-api-access-vxq4p\") pod \"nova-cell0-016f-account-create-update-qpvhj\" (UID: \"95cc2ea8-9931-4f65-9691-71bf32261ebb\") " pod="openstack/nova-cell0-016f-account-create-update-qpvhj" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.494381 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzkf4\" (UniqueName: \"kubernetes.io/projected/8d8ef970-d33e-4918-9b19-a0dc472f63f1-kube-api-access-fzkf4\") pod \"nova-cell1-db-create-rlrzh\" (UID: \"8d8ef970-d33e-4918-9b19-a0dc472f63f1\") " pod="openstack/nova-cell1-db-create-rlrzh" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.494587 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/95cc2ea8-9931-4f65-9691-71bf32261ebb-operator-scripts\") pod \"nova-cell0-016f-account-create-update-qpvhj\" (UID: \"95cc2ea8-9931-4f65-9691-71bf32261ebb\") " pod="openstack/nova-cell0-016f-account-create-update-qpvhj" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.495492 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95cc2ea8-9931-4f65-9691-71bf32261ebb-operator-scripts\") pod \"nova-cell0-016f-account-create-update-qpvhj\" (UID: \"95cc2ea8-9931-4f65-9691-71bf32261ebb\") " pod="openstack/nova-cell0-016f-account-create-update-qpvhj" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.496096 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d8ef970-d33e-4918-9b19-a0dc472f63f1-operator-scripts\") pod \"nova-cell1-db-create-rlrzh\" (UID: \"8d8ef970-d33e-4918-9b19-a0dc472f63f1\") " pod="openstack/nova-cell1-db-create-rlrzh" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.531936 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxq4p\" (UniqueName: \"kubernetes.io/projected/95cc2ea8-9931-4f65-9691-71bf32261ebb-kube-api-access-vxq4p\") pod \"nova-cell0-016f-account-create-update-qpvhj\" (UID: \"95cc2ea8-9931-4f65-9691-71bf32261ebb\") " pod="openstack/nova-cell0-016f-account-create-update-qpvhj" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.542204 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-d9cf-account-create-update-4rn7w"] Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.556710 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzkf4\" (UniqueName: \"kubernetes.io/projected/8d8ef970-d33e-4918-9b19-a0dc472f63f1-kube-api-access-fzkf4\") pod \"nova-cell1-db-create-rlrzh\" (UID: \"8d8ef970-d33e-4918-9b19-a0dc472f63f1\") " 
pod="openstack/nova-cell1-db-create-rlrzh" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.593040 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-rlrzh" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.596785 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gfdh\" (UniqueName: \"kubernetes.io/projected/84fff0c4-e9cf-4459-a31e-c6dbb2f58096-kube-api-access-6gfdh\") pod \"nova-cell1-d9cf-account-create-update-4rn7w\" (UID: \"84fff0c4-e9cf-4459-a31e-c6dbb2f58096\") " pod="openstack/nova-cell1-d9cf-account-create-update-4rn7w" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.597087 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84fff0c4-e9cf-4459-a31e-c6dbb2f58096-operator-scripts\") pod \"nova-cell1-d9cf-account-create-update-4rn7w\" (UID: \"84fff0c4-e9cf-4459-a31e-c6dbb2f58096\") " pod="openstack/nova-cell1-d9cf-account-create-update-4rn7w" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.698671 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84fff0c4-e9cf-4459-a31e-c6dbb2f58096-operator-scripts\") pod \"nova-cell1-d9cf-account-create-update-4rn7w\" (UID: \"84fff0c4-e9cf-4459-a31e-c6dbb2f58096\") " pod="openstack/nova-cell1-d9cf-account-create-update-4rn7w" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.698819 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gfdh\" (UniqueName: \"kubernetes.io/projected/84fff0c4-e9cf-4459-a31e-c6dbb2f58096-kube-api-access-6gfdh\") pod \"nova-cell1-d9cf-account-create-update-4rn7w\" (UID: \"84fff0c4-e9cf-4459-a31e-c6dbb2f58096\") " pod="openstack/nova-cell1-d9cf-account-create-update-4rn7w" Mar 20 15:46:35 crc 
kubenswrapper[4779]: I0320 15:46:35.700363 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84fff0c4-e9cf-4459-a31e-c6dbb2f58096-operator-scripts\") pod \"nova-cell1-d9cf-account-create-update-4rn7w\" (UID: \"84fff0c4-e9cf-4459-a31e-c6dbb2f58096\") " pod="openstack/nova-cell1-d9cf-account-create-update-4rn7w" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.717410 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-016f-account-create-update-qpvhj" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.720999 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gfdh\" (UniqueName: \"kubernetes.io/projected/84fff0c4-e9cf-4459-a31e-c6dbb2f58096-kube-api-access-6gfdh\") pod \"nova-cell1-d9cf-account-create-update-4rn7w\" (UID: \"84fff0c4-e9cf-4459-a31e-c6dbb2f58096\") " pod="openstack/nova-cell1-d9cf-account-create-update-4rn7w" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.860751 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6de60e1-9be1-4bb6-9d26-0b496117ed20" path="/var/lib/kubelet/pods/f6de60e1-9be1-4bb6-9d26-0b496117ed20/volumes" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.883767 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-d9cf-account-create-update-4rn7w" Mar 20 15:46:35 crc kubenswrapper[4779]: I0320 15:46:35.904416 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0bdf-account-create-update-58t5l"] Mar 20 15:46:36 crc kubenswrapper[4779]: I0320 15:46:36.170183 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-rx87f"] Mar 20 15:46:36 crc kubenswrapper[4779]: I0320 15:46:36.236543 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-22qhc"] Mar 20 15:46:36 crc kubenswrapper[4779]: I0320 15:46:36.284337 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-rlrzh"] Mar 20 15:46:36 crc kubenswrapper[4779]: W0320 15:46:36.302672 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d8ef970_d33e_4918_9b19_a0dc472f63f1.slice/crio-57d949007754235256ef59df0db570f75411f1c0f0673b1b689515bb1cf397b0 WatchSource:0}: Error finding container 57d949007754235256ef59df0db570f75411f1c0f0673b1b689515bb1cf397b0: Status 404 returned error can't find the container with id 57d949007754235256ef59df0db570f75411f1c0f0673b1b689515bb1cf397b0 Mar 20 15:46:36 crc kubenswrapper[4779]: I0320 15:46:36.432011 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-d9cf-account-create-update-4rn7w"] Mar 20 15:46:36 crc kubenswrapper[4779]: I0320 15:46:36.462069 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-016f-account-create-update-qpvhj"] Mar 20 15:46:36 crc kubenswrapper[4779]: I0320 15:46:36.573569 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rx87f" event={"ID":"a7d867d7-f83c-43e1-ac12-4ebd4814b2b7","Type":"ContainerStarted","Data":"8ce29c19230d6dac3cc488f049a1c64f479fedd2d7e6dfd4923d33e75b62dd6b"} Mar 20 15:46:36 crc kubenswrapper[4779]: 
I0320 15:46:36.587799 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0b1db76-6b60-4826-a1df-324e9a87d0d8","Type":"ContainerStarted","Data":"788cc135aa82f33fa9d818bf0421957d957d7dbc3c75bce0b00afb569f97d98b"} Mar 20 15:46:36 crc kubenswrapper[4779]: I0320 15:46:36.589775 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-rlrzh" event={"ID":"8d8ef970-d33e-4918-9b19-a0dc472f63f1","Type":"ContainerStarted","Data":"57d949007754235256ef59df0db570f75411f1c0f0673b1b689515bb1cf397b0"} Mar 20 15:46:36 crc kubenswrapper[4779]: I0320 15:46:36.608749 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-016f-account-create-update-qpvhj" event={"ID":"95cc2ea8-9931-4f65-9691-71bf32261ebb","Type":"ContainerStarted","Data":"c0f28335f080d028ba088f9acb3b3e7d04364aa7e1c11fa5e98ffb85d5fabcec"} Mar 20 15:46:36 crc kubenswrapper[4779]: I0320 15:46:36.635387 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0bdf-account-create-update-58t5l" event={"ID":"6fcb9b6b-ec11-4073-aef1-2ff8cf21545f","Type":"ContainerStarted","Data":"013d676dfc533185f0a66460e192ae1fb4c5d234181ba36c0fd2bae06aeb6c59"} Mar 20 15:46:36 crc kubenswrapper[4779]: I0320 15:46:36.635431 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0bdf-account-create-update-58t5l" event={"ID":"6fcb9b6b-ec11-4073-aef1-2ff8cf21545f","Type":"ContainerStarted","Data":"dadb92657234528f38fe11ad6b146d1b36f66ff24ecd8891f2e619bdbcb870e1"} Mar 20 15:46:36 crc kubenswrapper[4779]: I0320 15:46:36.652874 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-22qhc" event={"ID":"0331a592-114a-4ab4-8c73-f01b650f50f4","Type":"ContainerStarted","Data":"366f8119034288e6b3018885b658d78e9437e8fb17854ea0d2334f110bf1c16f"} Mar 20 15:46:36 crc kubenswrapper[4779]: I0320 15:46:36.663121 4779 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/nova-api-0bdf-account-create-update-58t5l" podStartSLOduration=1.663083066 podStartE2EDuration="1.663083066s" podCreationTimestamp="2026-03-20 15:46:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:46:36.654941834 +0000 UTC m=+1413.617457634" watchObservedRunningTime="2026-03-20 15:46:36.663083066 +0000 UTC m=+1413.625598866" Mar 20 15:46:37 crc kubenswrapper[4779]: I0320 15:46:37.665459 4779 generic.go:334] "Generic (PLEG): container finished" podID="84fff0c4-e9cf-4459-a31e-c6dbb2f58096" containerID="c5930c8216183733a9a5dcfbda76408f7a45077793a538dd9b4f5acbedaf830b" exitCode=0 Mar 20 15:46:37 crc kubenswrapper[4779]: I0320 15:46:37.665721 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d9cf-account-create-update-4rn7w" event={"ID":"84fff0c4-e9cf-4459-a31e-c6dbb2f58096","Type":"ContainerDied","Data":"c5930c8216183733a9a5dcfbda76408f7a45077793a538dd9b4f5acbedaf830b"} Mar 20 15:46:37 crc kubenswrapper[4779]: I0320 15:46:37.665747 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d9cf-account-create-update-4rn7w" event={"ID":"84fff0c4-e9cf-4459-a31e-c6dbb2f58096","Type":"ContainerStarted","Data":"318ebc8d6344075924beaf55bdeebd15fe2fdd93eecc18816292912f136fc368"} Mar 20 15:46:37 crc kubenswrapper[4779]: I0320 15:46:37.667882 4779 generic.go:334] "Generic (PLEG): container finished" podID="0331a592-114a-4ab4-8c73-f01b650f50f4" containerID="601bd00bd22831031142ba0f3e1204a8158fd9291e561783b6b5e1dda5b875ba" exitCode=0 Mar 20 15:46:37 crc kubenswrapper[4779]: I0320 15:46:37.667930 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-22qhc" event={"ID":"0331a592-114a-4ab4-8c73-f01b650f50f4","Type":"ContainerDied","Data":"601bd00bd22831031142ba0f3e1204a8158fd9291e561783b6b5e1dda5b875ba"} Mar 20 15:46:37 crc kubenswrapper[4779]: I0320 15:46:37.670131 
4779 generic.go:334] "Generic (PLEG): container finished" podID="a7d867d7-f83c-43e1-ac12-4ebd4814b2b7" containerID="63e890f90cfcb8a844f2bbb13f038b655127b921c80b0676eb0cf362946ce15c" exitCode=0 Mar 20 15:46:37 crc kubenswrapper[4779]: I0320 15:46:37.670171 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rx87f" event={"ID":"a7d867d7-f83c-43e1-ac12-4ebd4814b2b7","Type":"ContainerDied","Data":"63e890f90cfcb8a844f2bbb13f038b655127b921c80b0676eb0cf362946ce15c"} Mar 20 15:46:37 crc kubenswrapper[4779]: I0320 15:46:37.672178 4779 generic.go:334] "Generic (PLEG): container finished" podID="8d8ef970-d33e-4918-9b19-a0dc472f63f1" containerID="6eb9c7558338a2a8f9403b1cadb9a0568c465b9179578393cd5b53ac4c53cf15" exitCode=0 Mar 20 15:46:37 crc kubenswrapper[4779]: I0320 15:46:37.672223 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-rlrzh" event={"ID":"8d8ef970-d33e-4918-9b19-a0dc472f63f1","Type":"ContainerDied","Data":"6eb9c7558338a2a8f9403b1cadb9a0568c465b9179578393cd5b53ac4c53cf15"} Mar 20 15:46:37 crc kubenswrapper[4779]: I0320 15:46:37.674839 4779 generic.go:334] "Generic (PLEG): container finished" podID="95cc2ea8-9931-4f65-9691-71bf32261ebb" containerID="cf4939e2333b84deca27326e90ca65eb7d4187f312440e079e535607e21cdb19" exitCode=0 Mar 20 15:46:37 crc kubenswrapper[4779]: I0320 15:46:37.674894 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-016f-account-create-update-qpvhj" event={"ID":"95cc2ea8-9931-4f65-9691-71bf32261ebb","Type":"ContainerDied","Data":"cf4939e2333b84deca27326e90ca65eb7d4187f312440e079e535607e21cdb19"} Mar 20 15:46:37 crc kubenswrapper[4779]: I0320 15:46:37.676439 4779 generic.go:334] "Generic (PLEG): container finished" podID="6fcb9b6b-ec11-4073-aef1-2ff8cf21545f" containerID="013d676dfc533185f0a66460e192ae1fb4c5d234181ba36c0fd2bae06aeb6c59" exitCode=0 Mar 20 15:46:37 crc kubenswrapper[4779]: I0320 15:46:37.676472 4779 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0bdf-account-create-update-58t5l" event={"ID":"6fcb9b6b-ec11-4073-aef1-2ff8cf21545f","Type":"ContainerDied","Data":"013d676dfc533185f0a66460e192ae1fb4c5d234181ba36c0fd2bae06aeb6c59"} Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.203511 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0bdf-account-create-update-58t5l" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.274312 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlqhc\" (UniqueName: \"kubernetes.io/projected/6fcb9b6b-ec11-4073-aef1-2ff8cf21545f-kube-api-access-rlqhc\") pod \"6fcb9b6b-ec11-4073-aef1-2ff8cf21545f\" (UID: \"6fcb9b6b-ec11-4073-aef1-2ff8cf21545f\") " Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.274578 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fcb9b6b-ec11-4073-aef1-2ff8cf21545f-operator-scripts\") pod \"6fcb9b6b-ec11-4073-aef1-2ff8cf21545f\" (UID: \"6fcb9b6b-ec11-4073-aef1-2ff8cf21545f\") " Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.275857 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fcb9b6b-ec11-4073-aef1-2ff8cf21545f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6fcb9b6b-ec11-4073-aef1-2ff8cf21545f" (UID: "6fcb9b6b-ec11-4073-aef1-2ff8cf21545f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.288493 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fcb9b6b-ec11-4073-aef1-2ff8cf21545f-kube-api-access-rlqhc" (OuterVolumeSpecName: "kube-api-access-rlqhc") pod "6fcb9b6b-ec11-4073-aef1-2ff8cf21545f" (UID: "6fcb9b6b-ec11-4073-aef1-2ff8cf21545f"). 
InnerVolumeSpecName "kube-api-access-rlqhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.293621 4779 scope.go:117] "RemoveContainer" containerID="429949b1be87c41cc93dc959fbb63b342cddf89af82c9cb86a7a4dd0323c8cc6" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.376898 4779 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fcb9b6b-ec11-4073-aef1-2ff8cf21545f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.376947 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlqhc\" (UniqueName: \"kubernetes.io/projected/6fcb9b6b-ec11-4073-aef1-2ff8cf21545f-kube-api-access-rlqhc\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.406984 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-22qhc" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.407353 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-016f-account-create-update-qpvhj" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.410601 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-d9cf-account-create-update-4rn7w" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.416625 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-rx87f" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.428901 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-rlrzh" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.478055 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxq4p\" (UniqueName: \"kubernetes.io/projected/95cc2ea8-9931-4f65-9691-71bf32261ebb-kube-api-access-vxq4p\") pod \"95cc2ea8-9931-4f65-9691-71bf32261ebb\" (UID: \"95cc2ea8-9931-4f65-9691-71bf32261ebb\") " Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.478100 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7d867d7-f83c-43e1-ac12-4ebd4814b2b7-operator-scripts\") pod \"a7d867d7-f83c-43e1-ac12-4ebd4814b2b7\" (UID: \"a7d867d7-f83c-43e1-ac12-4ebd4814b2b7\") " Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.478198 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d8ef970-d33e-4918-9b19-a0dc472f63f1-operator-scripts\") pod \"8d8ef970-d33e-4918-9b19-a0dc472f63f1\" (UID: \"8d8ef970-d33e-4918-9b19-a0dc472f63f1\") " Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.478284 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzkf4\" (UniqueName: \"kubernetes.io/projected/8d8ef970-d33e-4918-9b19-a0dc472f63f1-kube-api-access-fzkf4\") pod \"8d8ef970-d33e-4918-9b19-a0dc472f63f1\" (UID: \"8d8ef970-d33e-4918-9b19-a0dc472f63f1\") " Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.478343 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84fff0c4-e9cf-4459-a31e-c6dbb2f58096-operator-scripts\") pod \"84fff0c4-e9cf-4459-a31e-c6dbb2f58096\" (UID: \"84fff0c4-e9cf-4459-a31e-c6dbb2f58096\") " Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.478376 4779 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-wsmkc\" (UniqueName: \"kubernetes.io/projected/a7d867d7-f83c-43e1-ac12-4ebd4814b2b7-kube-api-access-wsmkc\") pod \"a7d867d7-f83c-43e1-ac12-4ebd4814b2b7\" (UID: \"a7d867d7-f83c-43e1-ac12-4ebd4814b2b7\") " Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.478399 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95cc2ea8-9931-4f65-9691-71bf32261ebb-operator-scripts\") pod \"95cc2ea8-9931-4f65-9691-71bf32261ebb\" (UID: \"95cc2ea8-9931-4f65-9691-71bf32261ebb\") " Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.478445 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0331a592-114a-4ab4-8c73-f01b650f50f4-operator-scripts\") pod \"0331a592-114a-4ab4-8c73-f01b650f50f4\" (UID: \"0331a592-114a-4ab4-8c73-f01b650f50f4\") " Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.478490 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gfdh\" (UniqueName: \"kubernetes.io/projected/84fff0c4-e9cf-4459-a31e-c6dbb2f58096-kube-api-access-6gfdh\") pod \"84fff0c4-e9cf-4459-a31e-c6dbb2f58096\" (UID: \"84fff0c4-e9cf-4459-a31e-c6dbb2f58096\") " Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.478533 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsb8d\" (UniqueName: \"kubernetes.io/projected/0331a592-114a-4ab4-8c73-f01b650f50f4-kube-api-access-hsb8d\") pod \"0331a592-114a-4ab4-8c73-f01b650f50f4\" (UID: \"0331a592-114a-4ab4-8c73-f01b650f50f4\") " Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.480669 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d8ef970-d33e-4918-9b19-a0dc472f63f1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"8d8ef970-d33e-4918-9b19-a0dc472f63f1" (UID: "8d8ef970-d33e-4918-9b19-a0dc472f63f1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.480902 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84fff0c4-e9cf-4459-a31e-c6dbb2f58096-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "84fff0c4-e9cf-4459-a31e-c6dbb2f58096" (UID: "84fff0c4-e9cf-4459-a31e-c6dbb2f58096"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.481824 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7d867d7-f83c-43e1-ac12-4ebd4814b2b7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a7d867d7-f83c-43e1-ac12-4ebd4814b2b7" (UID: "a7d867d7-f83c-43e1-ac12-4ebd4814b2b7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.482088 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95cc2ea8-9931-4f65-9691-71bf32261ebb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "95cc2ea8-9931-4f65-9691-71bf32261ebb" (UID: "95cc2ea8-9931-4f65-9691-71bf32261ebb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.482416 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0331a592-114a-4ab4-8c73-f01b650f50f4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0331a592-114a-4ab4-8c73-f01b650f50f4" (UID: "0331a592-114a-4ab4-8c73-f01b650f50f4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.483958 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d8ef970-d33e-4918-9b19-a0dc472f63f1-kube-api-access-fzkf4" (OuterVolumeSpecName: "kube-api-access-fzkf4") pod "8d8ef970-d33e-4918-9b19-a0dc472f63f1" (UID: "8d8ef970-d33e-4918-9b19-a0dc472f63f1"). InnerVolumeSpecName "kube-api-access-fzkf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.486383 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95cc2ea8-9931-4f65-9691-71bf32261ebb-kube-api-access-vxq4p" (OuterVolumeSpecName: "kube-api-access-vxq4p") pod "95cc2ea8-9931-4f65-9691-71bf32261ebb" (UID: "95cc2ea8-9931-4f65-9691-71bf32261ebb"). InnerVolumeSpecName "kube-api-access-vxq4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.488224 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0331a592-114a-4ab4-8c73-f01b650f50f4-kube-api-access-hsb8d" (OuterVolumeSpecName: "kube-api-access-hsb8d") pod "0331a592-114a-4ab4-8c73-f01b650f50f4" (UID: "0331a592-114a-4ab4-8c73-f01b650f50f4"). InnerVolumeSpecName "kube-api-access-hsb8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.490677 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84fff0c4-e9cf-4459-a31e-c6dbb2f58096-kube-api-access-6gfdh" (OuterVolumeSpecName: "kube-api-access-6gfdh") pod "84fff0c4-e9cf-4459-a31e-c6dbb2f58096" (UID: "84fff0c4-e9cf-4459-a31e-c6dbb2f58096"). InnerVolumeSpecName "kube-api-access-6gfdh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.495973 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7d867d7-f83c-43e1-ac12-4ebd4814b2b7-kube-api-access-wsmkc" (OuterVolumeSpecName: "kube-api-access-wsmkc") pod "a7d867d7-f83c-43e1-ac12-4ebd4814b2b7" (UID: "a7d867d7-f83c-43e1-ac12-4ebd4814b2b7"). InnerVolumeSpecName "kube-api-access-wsmkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.580301 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxq4p\" (UniqueName: \"kubernetes.io/projected/95cc2ea8-9931-4f65-9691-71bf32261ebb-kube-api-access-vxq4p\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.580565 4779 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7d867d7-f83c-43e1-ac12-4ebd4814b2b7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.580576 4779 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d8ef970-d33e-4918-9b19-a0dc472f63f1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.580587 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzkf4\" (UniqueName: \"kubernetes.io/projected/8d8ef970-d33e-4918-9b19-a0dc472f63f1-kube-api-access-fzkf4\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.580599 4779 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84fff0c4-e9cf-4459-a31e-c6dbb2f58096-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.580607 4779 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-wsmkc\" (UniqueName: \"kubernetes.io/projected/a7d867d7-f83c-43e1-ac12-4ebd4814b2b7-kube-api-access-wsmkc\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.580616 4779 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95cc2ea8-9931-4f65-9691-71bf32261ebb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.580624 4779 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0331a592-114a-4ab4-8c73-f01b650f50f4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.580632 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gfdh\" (UniqueName: \"kubernetes.io/projected/84fff0c4-e9cf-4459-a31e-c6dbb2f58096-kube-api-access-6gfdh\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.580640 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsb8d\" (UniqueName: \"kubernetes.io/projected/0331a592-114a-4ab4-8c73-f01b650f50f4-kube-api-access-hsb8d\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.635835 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.725646 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0b1db76-6b60-4826-a1df-324e9a87d0d8","Type":"ContainerStarted","Data":"43216bc5238d818bc7642b320f24546f85a8131892b47f773166f2e6b5a7304f"} Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.725734 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e0b1db76-6b60-4826-a1df-324e9a87d0d8" containerName="ceilometer-central-agent" containerID="cri-o://71b78c74bfc7b9c74f537f9251d094d629483165e9c9b5b547eb018ea24281e0" gracePeriod=30 Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.725747 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e0b1db76-6b60-4826-a1df-324e9a87d0d8" containerName="proxy-httpd" containerID="cri-o://43216bc5238d818bc7642b320f24546f85a8131892b47f773166f2e6b5a7304f" gracePeriod=30 Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.725764 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.725814 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e0b1db76-6b60-4826-a1df-324e9a87d0d8" containerName="sg-core" containerID="cri-o://788cc135aa82f33fa9d818bf0421957d957d7dbc3c75bce0b00afb569f97d98b" gracePeriod=30 Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.725850 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e0b1db76-6b60-4826-a1df-324e9a87d0d8" containerName="ceilometer-notification-agent" containerID="cri-o://320aa125cc142fa54183f268da15eab35d6c582a4a96d63815ae03c6351ce1ac" gracePeriod=30 Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.727498 4779 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d9cf-account-create-update-4rn7w" event={"ID":"84fff0c4-e9cf-4459-a31e-c6dbb2f58096","Type":"ContainerDied","Data":"318ebc8d6344075924beaf55bdeebd15fe2fdd93eecc18816292912f136fc368"} Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.727535 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="318ebc8d6344075924beaf55bdeebd15fe2fdd93eecc18816292912f136fc368" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.727583 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-d9cf-account-create-update-4rn7w" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.744358 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rx87f" event={"ID":"a7d867d7-f83c-43e1-ac12-4ebd4814b2b7","Type":"ContainerDied","Data":"8ce29c19230d6dac3cc488f049a1c64f479fedd2d7e6dfd4923d33e75b62dd6b"} Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.744407 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ce29c19230d6dac3cc488f049a1c64f479fedd2d7e6dfd4923d33e75b62dd6b" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.744481 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-rx87f" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.752538 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.374214949 podStartE2EDuration="7.752519405s" podCreationTimestamp="2026-03-20 15:46:32 +0000 UTC" firstStartedPulling="2026-03-20 15:46:33.622829652 +0000 UTC m=+1410.585345452" lastFinishedPulling="2026-03-20 15:46:39.001134108 +0000 UTC m=+1415.963649908" observedRunningTime="2026-03-20 15:46:39.747485431 +0000 UTC m=+1416.710001231" watchObservedRunningTime="2026-03-20 15:46:39.752519405 +0000 UTC m=+1416.715035205" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.763607 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-rlrzh" event={"ID":"8d8ef970-d33e-4918-9b19-a0dc472f63f1","Type":"ContainerDied","Data":"57d949007754235256ef59df0db570f75411f1c0f0673b1b689515bb1cf397b0"} Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.763646 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57d949007754235256ef59df0db570f75411f1c0f0673b1b689515bb1cf397b0" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.763708 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-rlrzh" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.774211 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-22qhc" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.779295 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-22qhc" event={"ID":"0331a592-114a-4ab4-8c73-f01b650f50f4","Type":"ContainerDied","Data":"366f8119034288e6b3018885b658d78e9437e8fb17854ea0d2334f110bf1c16f"} Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.779340 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="366f8119034288e6b3018885b658d78e9437e8fb17854ea0d2334f110bf1c16f" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.784972 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b98145a-1310-4f32-9b9e-5824c7b793ef-combined-ca-bundle\") pod \"8b98145a-1310-4f32-9b9e-5824c7b793ef\" (UID: \"8b98145a-1310-4f32-9b9e-5824c7b793ef\") " Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.785101 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b98145a-1310-4f32-9b9e-5824c7b793ef-logs\") pod \"8b98145a-1310-4f32-9b9e-5824c7b793ef\" (UID: \"8b98145a-1310-4f32-9b9e-5824c7b793ef\") " Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.785258 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dll6k\" (UniqueName: \"kubernetes.io/projected/8b98145a-1310-4f32-9b9e-5824c7b793ef-kube-api-access-dll6k\") pod \"8b98145a-1310-4f32-9b9e-5824c7b793ef\" (UID: \"8b98145a-1310-4f32-9b9e-5824c7b793ef\") " Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.785284 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8b98145a-1310-4f32-9b9e-5824c7b793ef-custom-prometheus-ca\") pod \"8b98145a-1310-4f32-9b9e-5824c7b793ef\" (UID: 
\"8b98145a-1310-4f32-9b9e-5824c7b793ef\") " Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.785328 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b98145a-1310-4f32-9b9e-5824c7b793ef-config-data\") pod \"8b98145a-1310-4f32-9b9e-5824c7b793ef\" (UID: \"8b98145a-1310-4f32-9b9e-5824c7b793ef\") " Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.789606 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b98145a-1310-4f32-9b9e-5824c7b793ef-kube-api-access-dll6k" (OuterVolumeSpecName: "kube-api-access-dll6k") pod "8b98145a-1310-4f32-9b9e-5824c7b793ef" (UID: "8b98145a-1310-4f32-9b9e-5824c7b793ef"). InnerVolumeSpecName "kube-api-access-dll6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.790855 4779 generic.go:334] "Generic (PLEG): container finished" podID="8b98145a-1310-4f32-9b9e-5824c7b793ef" containerID="8420c552117767a297c8d519ea58d5d33c8eb963473ce4d1642e1f34a8abd493" exitCode=0 Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.790955 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8b98145a-1310-4f32-9b9e-5824c7b793ef","Type":"ContainerDied","Data":"8420c552117767a297c8d519ea58d5d33c8eb963473ce4d1642e1f34a8abd493"} Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.790984 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8b98145a-1310-4f32-9b9e-5824c7b793ef","Type":"ContainerDied","Data":"cd1a221d90db07970fa8743c8d44fe3a502da5828454bbc33dc75b91d1ae5370"} Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.791029 4779 scope.go:117] "RemoveContainer" containerID="8420c552117767a297c8d519ea58d5d33c8eb963473ce4d1642e1f34a8abd493" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.791431 4779 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.791441 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b98145a-1310-4f32-9b9e-5824c7b793ef-logs" (OuterVolumeSpecName: "logs") pod "8b98145a-1310-4f32-9b9e-5824c7b793ef" (UID: "8b98145a-1310-4f32-9b9e-5824c7b793ef"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:46:39 crc kubenswrapper[4779]: E0320 15:46:39.802695 4779 kuberuntime_gc.go:389] "Failed to remove container log dead symlink" err="remove /var/log/containers/watcher-decision-engine-0_openstack_watcher-decision-engine-8420c552117767a297c8d519ea58d5d33c8eb963473ce4d1642e1f34a8abd493.log: no such file or directory" path="/var/log/containers/watcher-decision-engine-0_openstack_watcher-decision-engine-8420c552117767a297c8d519ea58d5d33c8eb963473ce4d1642e1f34a8abd493.log" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.802835 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-016f-account-create-update-qpvhj" event={"ID":"95cc2ea8-9931-4f65-9691-71bf32261ebb","Type":"ContainerDied","Data":"c0f28335f080d028ba088f9acb3b3e7d04364aa7e1c11fa5e98ffb85d5fabcec"} Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.802860 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0f28335f080d028ba088f9acb3b3e7d04364aa7e1c11fa5e98ffb85d5fabcec" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.802913 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-016f-account-create-update-qpvhj" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.822579 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0bdf-account-create-update-58t5l" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.822698 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b98145a-1310-4f32-9b9e-5824c7b793ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b98145a-1310-4f32-9b9e-5824c7b793ef" (UID: "8b98145a-1310-4f32-9b9e-5824c7b793ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.824465 4779 scope.go:117] "RemoveContainer" containerID="8420c552117767a297c8d519ea58d5d33c8eb963473ce4d1642e1f34a8abd493" Mar 20 15:46:39 crc kubenswrapper[4779]: E0320 15:46:39.826480 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8420c552117767a297c8d519ea58d5d33c8eb963473ce4d1642e1f34a8abd493\": container with ID starting with 8420c552117767a297c8d519ea58d5d33c8eb963473ce4d1642e1f34a8abd493 not found: ID does not exist" containerID="8420c552117767a297c8d519ea58d5d33c8eb963473ce4d1642e1f34a8abd493" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.826520 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8420c552117767a297c8d519ea58d5d33c8eb963473ce4d1642e1f34a8abd493"} err="failed to get container status \"8420c552117767a297c8d519ea58d5d33c8eb963473ce4d1642e1f34a8abd493\": rpc error: code = NotFound desc = could not find container \"8420c552117767a297c8d519ea58d5d33c8eb963473ce4d1642e1f34a8abd493\": container with ID starting with 8420c552117767a297c8d519ea58d5d33c8eb963473ce4d1642e1f34a8abd493 not found: ID does not exist" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.845997 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b98145a-1310-4f32-9b9e-5824c7b793ef-custom-prometheus-ca" 
(OuterVolumeSpecName: "custom-prometheus-ca") pod "8b98145a-1310-4f32-9b9e-5824c7b793ef" (UID: "8b98145a-1310-4f32-9b9e-5824c7b793ef"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.848592 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0bdf-account-create-update-58t5l" event={"ID":"6fcb9b6b-ec11-4073-aef1-2ff8cf21545f","Type":"ContainerDied","Data":"dadb92657234528f38fe11ad6b146d1b36f66ff24ecd8891f2e619bdbcb870e1"} Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.848708 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dadb92657234528f38fe11ad6b146d1b36f66ff24ecd8891f2e619bdbcb870e1" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.865639 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b98145a-1310-4f32-9b9e-5824c7b793ef-config-data" (OuterVolumeSpecName: "config-data") pod "8b98145a-1310-4f32-9b9e-5824c7b793ef" (UID: "8b98145a-1310-4f32-9b9e-5824c7b793ef"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.889047 4779 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8b98145a-1310-4f32-9b9e-5824c7b793ef-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.889087 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b98145a-1310-4f32-9b9e-5824c7b793ef-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.889102 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b98145a-1310-4f32-9b9e-5824c7b793ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.889132 4779 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b98145a-1310-4f32-9b9e-5824c7b793ef-logs\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:39 crc kubenswrapper[4779]: I0320 15:46:39.889152 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dll6k\" (UniqueName: \"kubernetes.io/projected/8b98145a-1310-4f32-9b9e-5824c7b793ef-kube-api-access-dll6k\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.168021 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.176616 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.190705 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Mar 20 15:46:40 crc kubenswrapper[4779]: E0320 15:46:40.191154 4779 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8d8ef970-d33e-4918-9b19-a0dc472f63f1" containerName="mariadb-database-create" Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.191167 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d8ef970-d33e-4918-9b19-a0dc472f63f1" containerName="mariadb-database-create" Mar 20 15:46:40 crc kubenswrapper[4779]: E0320 15:46:40.191188 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0331a592-114a-4ab4-8c73-f01b650f50f4" containerName="mariadb-database-create" Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.191195 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="0331a592-114a-4ab4-8c73-f01b650f50f4" containerName="mariadb-database-create" Mar 20 15:46:40 crc kubenswrapper[4779]: E0320 15:46:40.191210 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84fff0c4-e9cf-4459-a31e-c6dbb2f58096" containerName="mariadb-account-create-update" Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.191215 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="84fff0c4-e9cf-4459-a31e-c6dbb2f58096" containerName="mariadb-account-create-update" Mar 20 15:46:40 crc kubenswrapper[4779]: E0320 15:46:40.191226 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95cc2ea8-9931-4f65-9691-71bf32261ebb" containerName="mariadb-account-create-update" Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.191232 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="95cc2ea8-9931-4f65-9691-71bf32261ebb" containerName="mariadb-account-create-update" Mar 20 15:46:40 crc kubenswrapper[4779]: E0320 15:46:40.191241 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7d867d7-f83c-43e1-ac12-4ebd4814b2b7" containerName="mariadb-database-create" Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.191247 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7d867d7-f83c-43e1-ac12-4ebd4814b2b7" containerName="mariadb-database-create" Mar 20 15:46:40 crc 
kubenswrapper[4779]: E0320 15:46:40.191264 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fcb9b6b-ec11-4073-aef1-2ff8cf21545f" containerName="mariadb-account-create-update" Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.191270 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fcb9b6b-ec11-4073-aef1-2ff8cf21545f" containerName="mariadb-account-create-update" Mar 20 15:46:40 crc kubenswrapper[4779]: E0320 15:46:40.191286 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b98145a-1310-4f32-9b9e-5824c7b793ef" containerName="watcher-decision-engine" Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.191292 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b98145a-1310-4f32-9b9e-5824c7b793ef" containerName="watcher-decision-engine" Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.191442 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="84fff0c4-e9cf-4459-a31e-c6dbb2f58096" containerName="mariadb-account-create-update" Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.191455 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="95cc2ea8-9931-4f65-9691-71bf32261ebb" containerName="mariadb-account-create-update" Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.191466 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d8ef970-d33e-4918-9b19-a0dc472f63f1" containerName="mariadb-database-create" Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.191476 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b98145a-1310-4f32-9b9e-5824c7b793ef" containerName="watcher-decision-engine" Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.191484 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7d867d7-f83c-43e1-ac12-4ebd4814b2b7" containerName="mariadb-database-create" Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.191498 4779 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="6fcb9b6b-ec11-4073-aef1-2ff8cf21545f" containerName="mariadb-account-create-update" Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.191510 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="0331a592-114a-4ab4-8c73-f01b650f50f4" containerName="mariadb-database-create" Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.192083 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.195006 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.211914 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.296947 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ca673b5-8809-4a16-9534-12b58c0f3cb9-config-data\") pod \"watcher-decision-engine-0\" (UID: \"1ca673b5-8809-4a16-9534-12b58c0f3cb9\") " pod="openstack/watcher-decision-engine-0" Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.296990 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1ca673b5-8809-4a16-9534-12b58c0f3cb9-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"1ca673b5-8809-4a16-9534-12b58c0f3cb9\") " pod="openstack/watcher-decision-engine-0" Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.297205 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-257f7\" (UniqueName: \"kubernetes.io/projected/1ca673b5-8809-4a16-9534-12b58c0f3cb9-kube-api-access-257f7\") pod \"watcher-decision-engine-0\" (UID: 
\"1ca673b5-8809-4a16-9534-12b58c0f3cb9\") " pod="openstack/watcher-decision-engine-0" Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.297427 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ca673b5-8809-4a16-9534-12b58c0f3cb9-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"1ca673b5-8809-4a16-9534-12b58c0f3cb9\") " pod="openstack/watcher-decision-engine-0" Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.297711 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ca673b5-8809-4a16-9534-12b58c0f3cb9-logs\") pod \"watcher-decision-engine-0\" (UID: \"1ca673b5-8809-4a16-9534-12b58c0f3cb9\") " pod="openstack/watcher-decision-engine-0" Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.399432 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-257f7\" (UniqueName: \"kubernetes.io/projected/1ca673b5-8809-4a16-9534-12b58c0f3cb9-kube-api-access-257f7\") pod \"watcher-decision-engine-0\" (UID: \"1ca673b5-8809-4a16-9534-12b58c0f3cb9\") " pod="openstack/watcher-decision-engine-0" Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.399498 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ca673b5-8809-4a16-9534-12b58c0f3cb9-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"1ca673b5-8809-4a16-9534-12b58c0f3cb9\") " pod="openstack/watcher-decision-engine-0" Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.399589 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ca673b5-8809-4a16-9534-12b58c0f3cb9-logs\") pod \"watcher-decision-engine-0\" (UID: \"1ca673b5-8809-4a16-9534-12b58c0f3cb9\") " 
pod="openstack/watcher-decision-engine-0" Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.399665 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ca673b5-8809-4a16-9534-12b58c0f3cb9-config-data\") pod \"watcher-decision-engine-0\" (UID: \"1ca673b5-8809-4a16-9534-12b58c0f3cb9\") " pod="openstack/watcher-decision-engine-0" Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.399686 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1ca673b5-8809-4a16-9534-12b58c0f3cb9-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"1ca673b5-8809-4a16-9534-12b58c0f3cb9\") " pod="openstack/watcher-decision-engine-0" Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.400337 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ca673b5-8809-4a16-9534-12b58c0f3cb9-logs\") pod \"watcher-decision-engine-0\" (UID: \"1ca673b5-8809-4a16-9534-12b58c0f3cb9\") " pod="openstack/watcher-decision-engine-0" Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.403671 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ca673b5-8809-4a16-9534-12b58c0f3cb9-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"1ca673b5-8809-4a16-9534-12b58c0f3cb9\") " pod="openstack/watcher-decision-engine-0" Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.403718 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ca673b5-8809-4a16-9534-12b58c0f3cb9-config-data\") pod \"watcher-decision-engine-0\" (UID: \"1ca673b5-8809-4a16-9534-12b58c0f3cb9\") " pod="openstack/watcher-decision-engine-0" Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.406689 4779 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1ca673b5-8809-4a16-9534-12b58c0f3cb9-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"1ca673b5-8809-4a16-9534-12b58c0f3cb9\") " pod="openstack/watcher-decision-engine-0" Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.423615 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-257f7\" (UniqueName: \"kubernetes.io/projected/1ca673b5-8809-4a16-9534-12b58c0f3cb9-kube-api-access-257f7\") pod \"watcher-decision-engine-0\" (UID: \"1ca673b5-8809-4a16-9534-12b58c0f3cb9\") " pod="openstack/watcher-decision-engine-0" Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.541331 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.837649 4779 generic.go:334] "Generic (PLEG): container finished" podID="e0b1db76-6b60-4826-a1df-324e9a87d0d8" containerID="43216bc5238d818bc7642b320f24546f85a8131892b47f773166f2e6b5a7304f" exitCode=0 Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.837955 4779 generic.go:334] "Generic (PLEG): container finished" podID="e0b1db76-6b60-4826-a1df-324e9a87d0d8" containerID="788cc135aa82f33fa9d818bf0421957d957d7dbc3c75bce0b00afb569f97d98b" exitCode=2 Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.837964 4779 generic.go:334] "Generic (PLEG): container finished" podID="e0b1db76-6b60-4826-a1df-324e9a87d0d8" containerID="320aa125cc142fa54183f268da15eab35d6c582a4a96d63815ae03c6351ce1ac" exitCode=0 Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.837982 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0b1db76-6b60-4826-a1df-324e9a87d0d8","Type":"ContainerDied","Data":"43216bc5238d818bc7642b320f24546f85a8131892b47f773166f2e6b5a7304f"} Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 
15:46:40.838023 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0b1db76-6b60-4826-a1df-324e9a87d0d8","Type":"ContainerDied","Data":"788cc135aa82f33fa9d818bf0421957d957d7dbc3c75bce0b00afb569f97d98b"} Mar 20 15:46:40 crc kubenswrapper[4779]: I0320 15:46:40.838033 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0b1db76-6b60-4826-a1df-324e9a87d0d8","Type":"ContainerDied","Data":"320aa125cc142fa54183f268da15eab35d6c582a4a96d63815ae03c6351ce1ac"} Mar 20 15:46:41 crc kubenswrapper[4779]: I0320 15:46:41.012569 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Mar 20 15:46:41 crc kubenswrapper[4779]: I0320 15:46:41.825913 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b98145a-1310-4f32-9b9e-5824c7b793ef" path="/var/lib/kubelet/pods/8b98145a-1310-4f32-9b9e-5824c7b793ef/volumes" Mar 20 15:46:41 crc kubenswrapper[4779]: I0320 15:46:41.848152 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"1ca673b5-8809-4a16-9534-12b58c0f3cb9","Type":"ContainerStarted","Data":"eff04b3c4a3725e866468e9ed08b33b1d7193c427ce00966906e4f0d8c04f68a"} Mar 20 15:46:41 crc kubenswrapper[4779]: I0320 15:46:41.848202 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"1ca673b5-8809-4a16-9534-12b58c0f3cb9","Type":"ContainerStarted","Data":"451b94c6ca2904ddd64b56c0df2dcb6f77a92e552f4a65b32e901f6c4e409965"} Mar 20 15:46:41 crc kubenswrapper[4779]: I0320 15:46:41.875391 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=1.87536915 podStartE2EDuration="1.87536915s" podCreationTimestamp="2026-03-20 15:46:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 15:46:41.870785427 +0000 UTC m=+1418.833301227" watchObservedRunningTime="2026-03-20 15:46:41.87536915 +0000 UTC m=+1418.837884950" Mar 20 15:46:42 crc kubenswrapper[4779]: I0320 15:46:42.098987 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 15:46:42 crc kubenswrapper[4779]: I0320 15:46:42.099322 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="148e12da-09c4-454f-ac34-76059964f559" containerName="glance-log" containerID="cri-o://5276609715e8f5a0ee9a9cc97ad2905cfbe1495a360b9d4275b70f13104e41ae" gracePeriod=30 Mar 20 15:46:42 crc kubenswrapper[4779]: I0320 15:46:42.099410 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="148e12da-09c4-454f-ac34-76059964f559" containerName="glance-httpd" containerID="cri-o://0e53b2635bb3a8b9a6f6494d47b73c178cc104db6eaabdaab0ebb6fec6fe8b7d" gracePeriod=30 Mar 20 15:46:42 crc kubenswrapper[4779]: I0320 15:46:42.857061 4779 generic.go:334] "Generic (PLEG): container finished" podID="148e12da-09c4-454f-ac34-76059964f559" containerID="5276609715e8f5a0ee9a9cc97ad2905cfbe1495a360b9d4275b70f13104e41ae" exitCode=143 Mar 20 15:46:42 crc kubenswrapper[4779]: I0320 15:46:42.857147 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"148e12da-09c4-454f-ac34-76059964f559","Type":"ContainerDied","Data":"5276609715e8f5a0ee9a9cc97ad2905cfbe1495a360b9d4275b70f13104e41ae"} Mar 20 15:46:42 crc kubenswrapper[4779]: I0320 15:46:42.945078 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 15:46:42 crc kubenswrapper[4779]: I0320 15:46:42.945411 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" 
podUID="205b1474-e402-4bd4-8ef7-4907c61f11bb" containerName="glance-log" containerID="cri-o://bb13bf5662540c3961b2fa319989459c2db8658e2ce6842ec39a55c18acc62e7" gracePeriod=30 Mar 20 15:46:42 crc kubenswrapper[4779]: I0320 15:46:42.945493 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="205b1474-e402-4bd4-8ef7-4907c61f11bb" containerName="glance-httpd" containerID="cri-o://4763085ccae1e2c32b3675d86358bd241fdc6c967c646f3f98f455a3e2baf089" gracePeriod=30 Mar 20 15:46:43 crc kubenswrapper[4779]: I0320 15:46:43.867175 4779 generic.go:334] "Generic (PLEG): container finished" podID="205b1474-e402-4bd4-8ef7-4907c61f11bb" containerID="bb13bf5662540c3961b2fa319989459c2db8658e2ce6842ec39a55c18acc62e7" exitCode=143 Mar 20 15:46:43 crc kubenswrapper[4779]: I0320 15:46:43.867419 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"205b1474-e402-4bd4-8ef7-4907c61f11bb","Type":"ContainerDied","Data":"bb13bf5662540c3961b2fa319989459c2db8658e2ce6842ec39a55c18acc62e7"} Mar 20 15:46:44 crc kubenswrapper[4779]: E0320 15:46:44.124344 4779 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0b1db76_6b60_4826_a1df_324e9a87d0d8.slice/crio-71b78c74bfc7b9c74f537f9251d094d629483165e9c9b5b547eb018ea24281e0.scope\": RecentStats: unable to find data in memory cache]" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.459601 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.587731 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0b1db76-6b60-4826-a1df-324e9a87d0d8-run-httpd\") pod \"e0b1db76-6b60-4826-a1df-324e9a87d0d8\" (UID: \"e0b1db76-6b60-4826-a1df-324e9a87d0d8\") " Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.587822 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0b1db76-6b60-4826-a1df-324e9a87d0d8-config-data\") pod \"e0b1db76-6b60-4826-a1df-324e9a87d0d8\" (UID: \"e0b1db76-6b60-4826-a1df-324e9a87d0d8\") " Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.587990 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8phnt\" (UniqueName: \"kubernetes.io/projected/e0b1db76-6b60-4826-a1df-324e9a87d0d8-kube-api-access-8phnt\") pod \"e0b1db76-6b60-4826-a1df-324e9a87d0d8\" (UID: \"e0b1db76-6b60-4826-a1df-324e9a87d0d8\") " Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.588024 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0b1db76-6b60-4826-a1df-324e9a87d0d8-combined-ca-bundle\") pod \"e0b1db76-6b60-4826-a1df-324e9a87d0d8\" (UID: \"e0b1db76-6b60-4826-a1df-324e9a87d0d8\") " Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.588073 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0b1db76-6b60-4826-a1df-324e9a87d0d8-log-httpd\") pod \"e0b1db76-6b60-4826-a1df-324e9a87d0d8\" (UID: \"e0b1db76-6b60-4826-a1df-324e9a87d0d8\") " Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.588097 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e0b1db76-6b60-4826-a1df-324e9a87d0d8-scripts\") pod \"e0b1db76-6b60-4826-a1df-324e9a87d0d8\" (UID: \"e0b1db76-6b60-4826-a1df-324e9a87d0d8\") " Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.588131 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e0b1db76-6b60-4826-a1df-324e9a87d0d8-sg-core-conf-yaml\") pod \"e0b1db76-6b60-4826-a1df-324e9a87d0d8\" (UID: \"e0b1db76-6b60-4826-a1df-324e9a87d0d8\") " Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.589229 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0b1db76-6b60-4826-a1df-324e9a87d0d8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e0b1db76-6b60-4826-a1df-324e9a87d0d8" (UID: "e0b1db76-6b60-4826-a1df-324e9a87d0d8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.593292 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0b1db76-6b60-4826-a1df-324e9a87d0d8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e0b1db76-6b60-4826-a1df-324e9a87d0d8" (UID: "e0b1db76-6b60-4826-a1df-324e9a87d0d8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.606323 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0b1db76-6b60-4826-a1df-324e9a87d0d8-scripts" (OuterVolumeSpecName: "scripts") pod "e0b1db76-6b60-4826-a1df-324e9a87d0d8" (UID: "e0b1db76-6b60-4826-a1df-324e9a87d0d8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.606439 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0b1db76-6b60-4826-a1df-324e9a87d0d8-kube-api-access-8phnt" (OuterVolumeSpecName: "kube-api-access-8phnt") pod "e0b1db76-6b60-4826-a1df-324e9a87d0d8" (UID: "e0b1db76-6b60-4826-a1df-324e9a87d0d8"). InnerVolumeSpecName "kube-api-access-8phnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.690494 4779 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0b1db76-6b60-4826-a1df-324e9a87d0d8-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.690728 4779 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0b1db76-6b60-4826-a1df-324e9a87d0d8-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.690788 4779 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0b1db76-6b60-4826-a1df-324e9a87d0d8-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.690878 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8phnt\" (UniqueName: \"kubernetes.io/projected/e0b1db76-6b60-4826-a1df-324e9a87d0d8-kube-api-access-8phnt\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.696273 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0b1db76-6b60-4826-a1df-324e9a87d0d8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e0b1db76-6b60-4826-a1df-324e9a87d0d8" (UID: "e0b1db76-6b60-4826-a1df-324e9a87d0d8"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.786176 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0b1db76-6b60-4826-a1df-324e9a87d0d8-config-data" (OuterVolumeSpecName: "config-data") pod "e0b1db76-6b60-4826-a1df-324e9a87d0d8" (UID: "e0b1db76-6b60-4826-a1df-324e9a87d0d8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.792684 4779 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e0b1db76-6b60-4826-a1df-324e9a87d0d8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.792719 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0b1db76-6b60-4826-a1df-324e9a87d0d8-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.805150 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0b1db76-6b60-4826-a1df-324e9a87d0d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0b1db76-6b60-4826-a1df-324e9a87d0d8" (UID: "e0b1db76-6b60-4826-a1df-324e9a87d0d8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.881662 4779 generic.go:334] "Generic (PLEG): container finished" podID="e0b1db76-6b60-4826-a1df-324e9a87d0d8" containerID="71b78c74bfc7b9c74f537f9251d094d629483165e9c9b5b547eb018ea24281e0" exitCode=0 Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.881715 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0b1db76-6b60-4826-a1df-324e9a87d0d8","Type":"ContainerDied","Data":"71b78c74bfc7b9c74f537f9251d094d629483165e9c9b5b547eb018ea24281e0"} Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.881759 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0b1db76-6b60-4826-a1df-324e9a87d0d8","Type":"ContainerDied","Data":"10c3714a4c83446ae1d42e80234b957d5b918020e5cf81ec826b50d274e187e0"} Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.881782 4779 scope.go:117] "RemoveContainer" containerID="43216bc5238d818bc7642b320f24546f85a8131892b47f773166f2e6b5a7304f" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.881940 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.894200 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0b1db76-6b60-4826-a1df-324e9a87d0d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.919139 4779 scope.go:117] "RemoveContainer" containerID="788cc135aa82f33fa9d818bf0421957d957d7dbc3c75bce0b00afb569f97d98b" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.931012 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.940599 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.950374 4779 scope.go:117] "RemoveContainer" containerID="320aa125cc142fa54183f268da15eab35d6c582a4a96d63815ae03c6351ce1ac" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.950633 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:46:44 crc kubenswrapper[4779]: E0320 15:46:44.951008 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0b1db76-6b60-4826-a1df-324e9a87d0d8" containerName="proxy-httpd" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.951020 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0b1db76-6b60-4826-a1df-324e9a87d0d8" containerName="proxy-httpd" Mar 20 15:46:44 crc kubenswrapper[4779]: E0320 15:46:44.951039 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0b1db76-6b60-4826-a1df-324e9a87d0d8" containerName="ceilometer-notification-agent" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.951047 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0b1db76-6b60-4826-a1df-324e9a87d0d8" containerName="ceilometer-notification-agent" Mar 20 15:46:44 crc kubenswrapper[4779]: E0320 
15:46:44.951059 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0b1db76-6b60-4826-a1df-324e9a87d0d8" containerName="sg-core" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.951066 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0b1db76-6b60-4826-a1df-324e9a87d0d8" containerName="sg-core" Mar 20 15:46:44 crc kubenswrapper[4779]: E0320 15:46:44.951084 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0b1db76-6b60-4826-a1df-324e9a87d0d8" containerName="ceilometer-central-agent" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.951090 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0b1db76-6b60-4826-a1df-324e9a87d0d8" containerName="ceilometer-central-agent" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.951265 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0b1db76-6b60-4826-a1df-324e9a87d0d8" containerName="ceilometer-notification-agent" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.951276 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0b1db76-6b60-4826-a1df-324e9a87d0d8" containerName="sg-core" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.951296 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0b1db76-6b60-4826-a1df-324e9a87d0d8" containerName="proxy-httpd" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.951312 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0b1db76-6b60-4826-a1df-324e9a87d0d8" containerName="ceilometer-central-agent" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.952831 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.955467 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.955666 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.974809 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.988798 4779 scope.go:117] "RemoveContainer" containerID="71b78c74bfc7b9c74f537f9251d094d629483165e9c9b5b547eb018ea24281e0" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.995404 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-log-httpd\") pod \"ceilometer-0\" (UID: \"d45d95dc-5b6b-4724-ae68-094d9e6b54a4\") " pod="openstack/ceilometer-0" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.995499 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d45d95dc-5b6b-4724-ae68-094d9e6b54a4\") " pod="openstack/ceilometer-0" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.995549 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-run-httpd\") pod \"ceilometer-0\" (UID: \"d45d95dc-5b6b-4724-ae68-094d9e6b54a4\") " pod="openstack/ceilometer-0" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.995571 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-7kks9\" (UniqueName: \"kubernetes.io/projected/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-kube-api-access-7kks9\") pod \"ceilometer-0\" (UID: \"d45d95dc-5b6b-4724-ae68-094d9e6b54a4\") " pod="openstack/ceilometer-0" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.995664 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d45d95dc-5b6b-4724-ae68-094d9e6b54a4\") " pod="openstack/ceilometer-0" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.995805 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-scripts\") pod \"ceilometer-0\" (UID: \"d45d95dc-5b6b-4724-ae68-094d9e6b54a4\") " pod="openstack/ceilometer-0" Mar 20 15:46:44 crc kubenswrapper[4779]: I0320 15:46:44.995941 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-config-data\") pod \"ceilometer-0\" (UID: \"d45d95dc-5b6b-4724-ae68-094d9e6b54a4\") " pod="openstack/ceilometer-0" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.012266 4779 scope.go:117] "RemoveContainer" containerID="43216bc5238d818bc7642b320f24546f85a8131892b47f773166f2e6b5a7304f" Mar 20 15:46:45 crc kubenswrapper[4779]: E0320 15:46:45.012753 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43216bc5238d818bc7642b320f24546f85a8131892b47f773166f2e6b5a7304f\": container with ID starting with 43216bc5238d818bc7642b320f24546f85a8131892b47f773166f2e6b5a7304f not found: ID does not exist" containerID="43216bc5238d818bc7642b320f24546f85a8131892b47f773166f2e6b5a7304f" Mar 20 15:46:45 crc 
kubenswrapper[4779]: I0320 15:46:45.012788 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43216bc5238d818bc7642b320f24546f85a8131892b47f773166f2e6b5a7304f"} err="failed to get container status \"43216bc5238d818bc7642b320f24546f85a8131892b47f773166f2e6b5a7304f\": rpc error: code = NotFound desc = could not find container \"43216bc5238d818bc7642b320f24546f85a8131892b47f773166f2e6b5a7304f\": container with ID starting with 43216bc5238d818bc7642b320f24546f85a8131892b47f773166f2e6b5a7304f not found: ID does not exist" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.012810 4779 scope.go:117] "RemoveContainer" containerID="788cc135aa82f33fa9d818bf0421957d957d7dbc3c75bce0b00afb569f97d98b" Mar 20 15:46:45 crc kubenswrapper[4779]: E0320 15:46:45.013136 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"788cc135aa82f33fa9d818bf0421957d957d7dbc3c75bce0b00afb569f97d98b\": container with ID starting with 788cc135aa82f33fa9d818bf0421957d957d7dbc3c75bce0b00afb569f97d98b not found: ID does not exist" containerID="788cc135aa82f33fa9d818bf0421957d957d7dbc3c75bce0b00afb569f97d98b" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.013154 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"788cc135aa82f33fa9d818bf0421957d957d7dbc3c75bce0b00afb569f97d98b"} err="failed to get container status \"788cc135aa82f33fa9d818bf0421957d957d7dbc3c75bce0b00afb569f97d98b\": rpc error: code = NotFound desc = could not find container \"788cc135aa82f33fa9d818bf0421957d957d7dbc3c75bce0b00afb569f97d98b\": container with ID starting with 788cc135aa82f33fa9d818bf0421957d957d7dbc3c75bce0b00afb569f97d98b not found: ID does not exist" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.013167 4779 scope.go:117] "RemoveContainer" containerID="320aa125cc142fa54183f268da15eab35d6c582a4a96d63815ae03c6351ce1ac" Mar 20 
15:46:45 crc kubenswrapper[4779]: E0320 15:46:45.013462 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"320aa125cc142fa54183f268da15eab35d6c582a4a96d63815ae03c6351ce1ac\": container with ID starting with 320aa125cc142fa54183f268da15eab35d6c582a4a96d63815ae03c6351ce1ac not found: ID does not exist" containerID="320aa125cc142fa54183f268da15eab35d6c582a4a96d63815ae03c6351ce1ac" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.013481 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"320aa125cc142fa54183f268da15eab35d6c582a4a96d63815ae03c6351ce1ac"} err="failed to get container status \"320aa125cc142fa54183f268da15eab35d6c582a4a96d63815ae03c6351ce1ac\": rpc error: code = NotFound desc = could not find container \"320aa125cc142fa54183f268da15eab35d6c582a4a96d63815ae03c6351ce1ac\": container with ID starting with 320aa125cc142fa54183f268da15eab35d6c582a4a96d63815ae03c6351ce1ac not found: ID does not exist" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.013492 4779 scope.go:117] "RemoveContainer" containerID="71b78c74bfc7b9c74f537f9251d094d629483165e9c9b5b547eb018ea24281e0" Mar 20 15:46:45 crc kubenswrapper[4779]: E0320 15:46:45.013653 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71b78c74bfc7b9c74f537f9251d094d629483165e9c9b5b547eb018ea24281e0\": container with ID starting with 71b78c74bfc7b9c74f537f9251d094d629483165e9c9b5b547eb018ea24281e0 not found: ID does not exist" containerID="71b78c74bfc7b9c74f537f9251d094d629483165e9c9b5b547eb018ea24281e0" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.013666 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71b78c74bfc7b9c74f537f9251d094d629483165e9c9b5b547eb018ea24281e0"} err="failed to get container status 
\"71b78c74bfc7b9c74f537f9251d094d629483165e9c9b5b547eb018ea24281e0\": rpc error: code = NotFound desc = could not find container \"71b78c74bfc7b9c74f537f9251d094d629483165e9c9b5b547eb018ea24281e0\": container with ID starting with 71b78c74bfc7b9c74f537f9251d094d629483165e9c9b5b547eb018ea24281e0 not found: ID does not exist" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.097356 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kks9\" (UniqueName: \"kubernetes.io/projected/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-kube-api-access-7kks9\") pod \"ceilometer-0\" (UID: \"d45d95dc-5b6b-4724-ae68-094d9e6b54a4\") " pod="openstack/ceilometer-0" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.097718 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d45d95dc-5b6b-4724-ae68-094d9e6b54a4\") " pod="openstack/ceilometer-0" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.097802 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-scripts\") pod \"ceilometer-0\" (UID: \"d45d95dc-5b6b-4724-ae68-094d9e6b54a4\") " pod="openstack/ceilometer-0" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.097880 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-config-data\") pod \"ceilometer-0\" (UID: \"d45d95dc-5b6b-4724-ae68-094d9e6b54a4\") " pod="openstack/ceilometer-0" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.097925 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-log-httpd\") pod 
\"ceilometer-0\" (UID: \"d45d95dc-5b6b-4724-ae68-094d9e6b54a4\") " pod="openstack/ceilometer-0" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.097988 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d45d95dc-5b6b-4724-ae68-094d9e6b54a4\") " pod="openstack/ceilometer-0" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.098029 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-run-httpd\") pod \"ceilometer-0\" (UID: \"d45d95dc-5b6b-4724-ae68-094d9e6b54a4\") " pod="openstack/ceilometer-0" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.098435 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-run-httpd\") pod \"ceilometer-0\" (UID: \"d45d95dc-5b6b-4724-ae68-094d9e6b54a4\") " pod="openstack/ceilometer-0" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.099050 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-log-httpd\") pod \"ceilometer-0\" (UID: \"d45d95dc-5b6b-4724-ae68-094d9e6b54a4\") " pod="openstack/ceilometer-0" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.103167 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-config-data\") pod \"ceilometer-0\" (UID: \"d45d95dc-5b6b-4724-ae68-094d9e6b54a4\") " pod="openstack/ceilometer-0" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.103560 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-scripts\") pod \"ceilometer-0\" (UID: \"d45d95dc-5b6b-4724-ae68-094d9e6b54a4\") " pod="openstack/ceilometer-0" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.104526 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d45d95dc-5b6b-4724-ae68-094d9e6b54a4\") " pod="openstack/ceilometer-0" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.104817 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d45d95dc-5b6b-4724-ae68-094d9e6b54a4\") " pod="openstack/ceilometer-0" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.117811 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kks9\" (UniqueName: \"kubernetes.io/projected/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-kube-api-access-7kks9\") pod \"ceilometer-0\" (UID: \"d45d95dc-5b6b-4724-ae68-094d9e6b54a4\") " pod="openstack/ceilometer-0" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.289302 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.693486 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5gc7l"] Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.703677 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5gc7l" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.711353 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.712056 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.712150 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-qfdgq" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.722426 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5gc7l"] Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.825989 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/003b9556-88db-4e20-8371-3863a2dd44b8-scripts\") pod \"nova-cell0-conductor-db-sync-5gc7l\" (UID: \"003b9556-88db-4e20-8371-3863a2dd44b8\") " pod="openstack/nova-cell0-conductor-db-sync-5gc7l" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.826035 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/003b9556-88db-4e20-8371-3863a2dd44b8-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5gc7l\" (UID: \"003b9556-88db-4e20-8371-3863a2dd44b8\") " pod="openstack/nova-cell0-conductor-db-sync-5gc7l" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.826091 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxwtt\" (UniqueName: \"kubernetes.io/projected/003b9556-88db-4e20-8371-3863a2dd44b8-kube-api-access-nxwtt\") pod \"nova-cell0-conductor-db-sync-5gc7l\" (UID: \"003b9556-88db-4e20-8371-3863a2dd44b8\") " 
pod="openstack/nova-cell0-conductor-db-sync-5gc7l" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.826335 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/003b9556-88db-4e20-8371-3863a2dd44b8-config-data\") pod \"nova-cell0-conductor-db-sync-5gc7l\" (UID: \"003b9556-88db-4e20-8371-3863a2dd44b8\") " pod="openstack/nova-cell0-conductor-db-sync-5gc7l" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.827616 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0b1db76-6b60-4826-a1df-324e9a87d0d8" path="/var/lib/kubelet/pods/e0b1db76-6b60-4826-a1df-324e9a87d0d8/volumes" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.828660 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.899567 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.899649 4779 generic.go:334] "Generic (PLEG): container finished" podID="148e12da-09c4-454f-ac34-76059964f559" containerID="0e53b2635bb3a8b9a6f6494d47b73c178cc104db6eaabdaab0ebb6fec6fe8b7d" exitCode=0 Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.899724 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"148e12da-09c4-454f-ac34-76059964f559","Type":"ContainerDied","Data":"0e53b2635bb3a8b9a6f6494d47b73c178cc104db6eaabdaab0ebb6fec6fe8b7d"} Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.899749 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"148e12da-09c4-454f-ac34-76059964f559","Type":"ContainerDied","Data":"d207f62e11fbc0829c0495eb9879b3f555145ffd6da2c77252099e1548a362ec"} Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.899782 
4779 scope.go:117] "RemoveContainer" containerID="0e53b2635bb3a8b9a6f6494d47b73c178cc104db6eaabdaab0ebb6fec6fe8b7d" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.901816 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d45d95dc-5b6b-4724-ae68-094d9e6b54a4","Type":"ContainerStarted","Data":"eaec91f0d38aaa69a3762165b8676a2ed84c7bedca5845f024864eb2bcb062f1"} Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.927757 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/148e12da-09c4-454f-ac34-76059964f559-combined-ca-bundle\") pod \"148e12da-09c4-454f-ac34-76059964f559\" (UID: \"148e12da-09c4-454f-ac34-76059964f559\") " Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.928379 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/148e12da-09c4-454f-ac34-76059964f559-public-tls-certs\") pod \"148e12da-09c4-454f-ac34-76059964f559\" (UID: \"148e12da-09c4-454f-ac34-76059964f559\") " Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.928495 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/148e12da-09c4-454f-ac34-76059964f559-logs\") pod \"148e12da-09c4-454f-ac34-76059964f559\" (UID: \"148e12da-09c4-454f-ac34-76059964f559\") " Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.928693 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/148e12da-09c4-454f-ac34-76059964f559-httpd-run\") pod \"148e12da-09c4-454f-ac34-76059964f559\" (UID: \"148e12da-09c4-454f-ac34-76059964f559\") " Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.928841 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/148e12da-09c4-454f-ac34-76059964f559-scripts\") pod \"148e12da-09c4-454f-ac34-76059964f559\" (UID: \"148e12da-09c4-454f-ac34-76059964f559\") " Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.928965 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9t264\" (UniqueName: \"kubernetes.io/projected/148e12da-09c4-454f-ac34-76059964f559-kube-api-access-9t264\") pod \"148e12da-09c4-454f-ac34-76059964f559\" (UID: \"148e12da-09c4-454f-ac34-76059964f559\") " Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.929148 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"148e12da-09c4-454f-ac34-76059964f559\" (UID: \"148e12da-09c4-454f-ac34-76059964f559\") " Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.929287 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/148e12da-09c4-454f-ac34-76059964f559-config-data\") pod \"148e12da-09c4-454f-ac34-76059964f559\" (UID: \"148e12da-09c4-454f-ac34-76059964f559\") " Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.929751 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/003b9556-88db-4e20-8371-3863a2dd44b8-config-data\") pod \"nova-cell0-conductor-db-sync-5gc7l\" (UID: \"003b9556-88db-4e20-8371-3863a2dd44b8\") " pod="openstack/nova-cell0-conductor-db-sync-5gc7l" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.929996 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/003b9556-88db-4e20-8371-3863a2dd44b8-scripts\") pod \"nova-cell0-conductor-db-sync-5gc7l\" (UID: \"003b9556-88db-4e20-8371-3863a2dd44b8\") " pod="openstack/nova-cell0-conductor-db-sync-5gc7l" Mar 20 15:46:45 crc 
kubenswrapper[4779]: I0320 15:46:45.930121 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/003b9556-88db-4e20-8371-3863a2dd44b8-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5gc7l\" (UID: \"003b9556-88db-4e20-8371-3863a2dd44b8\") " pod="openstack/nova-cell0-conductor-db-sync-5gc7l" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.930288 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxwtt\" (UniqueName: \"kubernetes.io/projected/003b9556-88db-4e20-8371-3863a2dd44b8-kube-api-access-nxwtt\") pod \"nova-cell0-conductor-db-sync-5gc7l\" (UID: \"003b9556-88db-4e20-8371-3863a2dd44b8\") " pod="openstack/nova-cell0-conductor-db-sync-5gc7l" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.929769 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/148e12da-09c4-454f-ac34-76059964f559-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "148e12da-09c4-454f-ac34-76059964f559" (UID: "148e12da-09c4-454f-ac34-76059964f559"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.935756 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/148e12da-09c4-454f-ac34-76059964f559-logs" (OuterVolumeSpecName: "logs") pod "148e12da-09c4-454f-ac34-76059964f559" (UID: "148e12da-09c4-454f-ac34-76059964f559"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.951516 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxwtt\" (UniqueName: \"kubernetes.io/projected/003b9556-88db-4e20-8371-3863a2dd44b8-kube-api-access-nxwtt\") pod \"nova-cell0-conductor-db-sync-5gc7l\" (UID: \"003b9556-88db-4e20-8371-3863a2dd44b8\") " pod="openstack/nova-cell0-conductor-db-sync-5gc7l" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.955044 4779 scope.go:117] "RemoveContainer" containerID="5276609715e8f5a0ee9a9cc97ad2905cfbe1495a360b9d4275b70f13104e41ae" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.988539 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/003b9556-88db-4e20-8371-3863a2dd44b8-config-data\") pod \"nova-cell0-conductor-db-sync-5gc7l\" (UID: \"003b9556-88db-4e20-8371-3863a2dd44b8\") " pod="openstack/nova-cell0-conductor-db-sync-5gc7l" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.994450 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/003b9556-88db-4e20-8371-3863a2dd44b8-scripts\") pod \"nova-cell0-conductor-db-sync-5gc7l\" (UID: \"003b9556-88db-4e20-8371-3863a2dd44b8\") " pod="openstack/nova-cell0-conductor-db-sync-5gc7l" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.995258 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/148e12da-09c4-454f-ac34-76059964f559-scripts" (OuterVolumeSpecName: "scripts") pod "148e12da-09c4-454f-ac34-76059964f559" (UID: "148e12da-09c4-454f-ac34-76059964f559"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.998874 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/148e12da-09c4-454f-ac34-76059964f559-kube-api-access-9t264" (OuterVolumeSpecName: "kube-api-access-9t264") pod "148e12da-09c4-454f-ac34-76059964f559" (UID: "148e12da-09c4-454f-ac34-76059964f559"). InnerVolumeSpecName "kube-api-access-9t264". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:46:45 crc kubenswrapper[4779]: I0320 15:46:45.997519 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/003b9556-88db-4e20-8371-3863a2dd44b8-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5gc7l\" (UID: \"003b9556-88db-4e20-8371-3863a2dd44b8\") " pod="openstack/nova-cell0-conductor-db-sync-5gc7l" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.004612 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "148e12da-09c4-454f-ac34-76059964f559" (UID: "148e12da-09c4-454f-ac34-76059964f559"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.018318 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/148e12da-09c4-454f-ac34-76059964f559-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "148e12da-09c4-454f-ac34-76059964f559" (UID: "148e12da-09c4-454f-ac34-76059964f559"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.022553 4779 scope.go:117] "RemoveContainer" containerID="0e53b2635bb3a8b9a6f6494d47b73c178cc104db6eaabdaab0ebb6fec6fe8b7d" Mar 20 15:46:46 crc kubenswrapper[4779]: E0320 15:46:46.023201 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e53b2635bb3a8b9a6f6494d47b73c178cc104db6eaabdaab0ebb6fec6fe8b7d\": container with ID starting with 0e53b2635bb3a8b9a6f6494d47b73c178cc104db6eaabdaab0ebb6fec6fe8b7d not found: ID does not exist" containerID="0e53b2635bb3a8b9a6f6494d47b73c178cc104db6eaabdaab0ebb6fec6fe8b7d" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.023275 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e53b2635bb3a8b9a6f6494d47b73c178cc104db6eaabdaab0ebb6fec6fe8b7d"} err="failed to get container status \"0e53b2635bb3a8b9a6f6494d47b73c178cc104db6eaabdaab0ebb6fec6fe8b7d\": rpc error: code = NotFound desc = could not find container \"0e53b2635bb3a8b9a6f6494d47b73c178cc104db6eaabdaab0ebb6fec6fe8b7d\": container with ID starting with 0e53b2635bb3a8b9a6f6494d47b73c178cc104db6eaabdaab0ebb6fec6fe8b7d not found: ID does not exist" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.023311 4779 scope.go:117] "RemoveContainer" containerID="5276609715e8f5a0ee9a9cc97ad2905cfbe1495a360b9d4275b70f13104e41ae" Mar 20 15:46:46 crc kubenswrapper[4779]: E0320 15:46:46.024041 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5276609715e8f5a0ee9a9cc97ad2905cfbe1495a360b9d4275b70f13104e41ae\": container with ID starting with 5276609715e8f5a0ee9a9cc97ad2905cfbe1495a360b9d4275b70f13104e41ae not found: ID does not exist" containerID="5276609715e8f5a0ee9a9cc97ad2905cfbe1495a360b9d4275b70f13104e41ae" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.024102 
4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5276609715e8f5a0ee9a9cc97ad2905cfbe1495a360b9d4275b70f13104e41ae"} err="failed to get container status \"5276609715e8f5a0ee9a9cc97ad2905cfbe1495a360b9d4275b70f13104e41ae\": rpc error: code = NotFound desc = could not find container \"5276609715e8f5a0ee9a9cc97ad2905cfbe1495a360b9d4275b70f13104e41ae\": container with ID starting with 5276609715e8f5a0ee9a9cc97ad2905cfbe1495a360b9d4275b70f13104e41ae not found: ID does not exist" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.032408 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/148e12da-09c4-454f-ac34-76059964f559-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.032449 4779 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/148e12da-09c4-454f-ac34-76059964f559-logs\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.032461 4779 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/148e12da-09c4-454f-ac34-76059964f559-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.032471 4779 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/148e12da-09c4-454f-ac34-76059964f559-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.032486 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9t264\" (UniqueName: \"kubernetes.io/projected/148e12da-09c4-454f-ac34-76059964f559-kube-api-access-9t264\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.032525 4779 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume 
\"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.033377 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/148e12da-09c4-454f-ac34-76059964f559-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "148e12da-09c4-454f-ac34-76059964f559" (UID: "148e12da-09c4-454f-ac34-76059964f559"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.043901 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5gc7l" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.048067 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/148e12da-09c4-454f-ac34-76059964f559-config-data" (OuterVolumeSpecName: "config-data") pod "148e12da-09c4-454f-ac34-76059964f559" (UID: "148e12da-09c4-454f-ac34-76059964f559"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.082562 4779 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.134257 4779 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.134293 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/148e12da-09c4-454f-ac34-76059964f559-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.134303 4779 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/148e12da-09c4-454f-ac34-76059964f559-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.611248 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5gc7l"] Mar 20 15:46:46 crc kubenswrapper[4779]: W0320 15:46:46.628266 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod003b9556_88db_4e20_8371_3863a2dd44b8.slice/crio-80a33132f4749a8a0bf89498b273a0f5e58bc160d770e60f9c793b13fa705f5d WatchSource:0}: Error finding container 80a33132f4749a8a0bf89498b273a0f5e58bc160d770e60f9c793b13fa705f5d: Status 404 returned error can't find the container with id 80a33132f4749a8a0bf89498b273a0f5e58bc160d770e60f9c793b13fa705f5d Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.804265 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.848813 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/205b1474-e402-4bd4-8ef7-4907c61f11bb-scripts\") pod \"205b1474-e402-4bd4-8ef7-4907c61f11bb\" (UID: \"205b1474-e402-4bd4-8ef7-4907c61f11bb\") " Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.848866 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/205b1474-e402-4bd4-8ef7-4907c61f11bb-httpd-run\") pod \"205b1474-e402-4bd4-8ef7-4907c61f11bb\" (UID: \"205b1474-e402-4bd4-8ef7-4907c61f11bb\") " Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.848900 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/205b1474-e402-4bd4-8ef7-4907c61f11bb-combined-ca-bundle\") pod \"205b1474-e402-4bd4-8ef7-4907c61f11bb\" (UID: \"205b1474-e402-4bd4-8ef7-4907c61f11bb\") " Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.848983 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/205b1474-e402-4bd4-8ef7-4907c61f11bb-internal-tls-certs\") pod \"205b1474-e402-4bd4-8ef7-4907c61f11bb\" (UID: \"205b1474-e402-4bd4-8ef7-4907c61f11bb\") " Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.849004 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/205b1474-e402-4bd4-8ef7-4907c61f11bb-logs\") pod \"205b1474-e402-4bd4-8ef7-4907c61f11bb\" (UID: \"205b1474-e402-4bd4-8ef7-4907c61f11bb\") " Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.849022 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/205b1474-e402-4bd4-8ef7-4907c61f11bb-config-data\") pod \"205b1474-e402-4bd4-8ef7-4907c61f11bb\" (UID: \"205b1474-e402-4bd4-8ef7-4907c61f11bb\") " Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.849087 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"205b1474-e402-4bd4-8ef7-4907c61f11bb\" (UID: \"205b1474-e402-4bd4-8ef7-4907c61f11bb\") " Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.849207 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsfcc\" (UniqueName: \"kubernetes.io/projected/205b1474-e402-4bd4-8ef7-4907c61f11bb-kube-api-access-xsfcc\") pod \"205b1474-e402-4bd4-8ef7-4907c61f11bb\" (UID: \"205b1474-e402-4bd4-8ef7-4907c61f11bb\") " Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.849545 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/205b1474-e402-4bd4-8ef7-4907c61f11bb-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "205b1474-e402-4bd4-8ef7-4907c61f11bb" (UID: "205b1474-e402-4bd4-8ef7-4907c61f11bb"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.849667 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/205b1474-e402-4bd4-8ef7-4907c61f11bb-logs" (OuterVolumeSpecName: "logs") pod "205b1474-e402-4bd4-8ef7-4907c61f11bb" (UID: "205b1474-e402-4bd4-8ef7-4907c61f11bb"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.853419 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "205b1474-e402-4bd4-8ef7-4907c61f11bb" (UID: "205b1474-e402-4bd4-8ef7-4907c61f11bb"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.855335 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/205b1474-e402-4bd4-8ef7-4907c61f11bb-scripts" (OuterVolumeSpecName: "scripts") pod "205b1474-e402-4bd4-8ef7-4907c61f11bb" (UID: "205b1474-e402-4bd4-8ef7-4907c61f11bb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.860103 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/205b1474-e402-4bd4-8ef7-4907c61f11bb-kube-api-access-xsfcc" (OuterVolumeSpecName: "kube-api-access-xsfcc") pod "205b1474-e402-4bd4-8ef7-4907c61f11bb" (UID: "205b1474-e402-4bd4-8ef7-4907c61f11bb"). InnerVolumeSpecName "kube-api-access-xsfcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.895097 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/205b1474-e402-4bd4-8ef7-4907c61f11bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "205b1474-e402-4bd4-8ef7-4907c61f11bb" (UID: "205b1474-e402-4bd4-8ef7-4907c61f11bb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.928978 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/205b1474-e402-4bd4-8ef7-4907c61f11bb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "205b1474-e402-4bd4-8ef7-4907c61f11bb" (UID: "205b1474-e402-4bd4-8ef7-4907c61f11bb"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.930258 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/205b1474-e402-4bd4-8ef7-4907c61f11bb-config-data" (OuterVolumeSpecName: "config-data") pod "205b1474-e402-4bd4-8ef7-4907c61f11bb" (UID: "205b1474-e402-4bd4-8ef7-4907c61f11bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.930367 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.933896 4779 generic.go:334] "Generic (PLEG): container finished" podID="205b1474-e402-4bd4-8ef7-4907c61f11bb" containerID="4763085ccae1e2c32b3675d86358bd241fdc6c967c646f3f98f455a3e2baf089" exitCode=0 Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.933961 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"205b1474-e402-4bd4-8ef7-4907c61f11bb","Type":"ContainerDied","Data":"4763085ccae1e2c32b3675d86358bd241fdc6c967c646f3f98f455a3e2baf089"} Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.933992 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"205b1474-e402-4bd4-8ef7-4907c61f11bb","Type":"ContainerDied","Data":"466971ea2fbd03d11f71d5cf66a01fec57a922cb325194c3f2f0cef1307a690f"} Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.934009 4779 scope.go:117] "RemoveContainer" containerID="4763085ccae1e2c32b3675d86358bd241fdc6c967c646f3f98f455a3e2baf089" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.935466 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.945148 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5gc7l" event={"ID":"003b9556-88db-4e20-8371-3863a2dd44b8","Type":"ContainerStarted","Data":"80a33132f4749a8a0bf89498b273a0f5e58bc160d770e60f9c793b13fa705f5d"} Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.950855 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d45d95dc-5b6b-4724-ae68-094d9e6b54a4","Type":"ContainerStarted","Data":"bab0353f2770d4c7469bf2f4cbbedf41fc29e56119697d544cd980eee322e924"} Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.972802 4779 scope.go:117] "RemoveContainer" containerID="bb13bf5662540c3961b2fa319989459c2db8658e2ce6842ec39a55c18acc62e7" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.973780 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsfcc\" (UniqueName: \"kubernetes.io/projected/205b1474-e402-4bd4-8ef7-4907c61f11bb-kube-api-access-xsfcc\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.973915 4779 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/205b1474-e402-4bd4-8ef7-4907c61f11bb-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.974633 4779 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/205b1474-e402-4bd4-8ef7-4907c61f11bb-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.974741 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/205b1474-e402-4bd4-8ef7-4907c61f11bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.975337 
4779 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/205b1474-e402-4bd4-8ef7-4907c61f11bb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.975451 4779 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/205b1474-e402-4bd4-8ef7-4907c61f11bb-logs\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.975530 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/205b1474-e402-4bd4-8ef7-4907c61f11bb-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:46 crc kubenswrapper[4779]: I0320 15:46:46.975641 4779 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.011953 4779 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.042234 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.042280 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.061687 4779 scope.go:117] "RemoveContainer" containerID="4763085ccae1e2c32b3675d86358bd241fdc6c967c646f3f98f455a3e2baf089" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.061901 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.062001 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-default-external-api-0"] Mar 20 15:46:47 crc kubenswrapper[4779]: E0320 15:46:47.064977 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4763085ccae1e2c32b3675d86358bd241fdc6c967c646f3f98f455a3e2baf089\": container with ID starting with 4763085ccae1e2c32b3675d86358bd241fdc6c967c646f3f98f455a3e2baf089 not found: ID does not exist" containerID="4763085ccae1e2c32b3675d86358bd241fdc6c967c646f3f98f455a3e2baf089" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.065042 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4763085ccae1e2c32b3675d86358bd241fdc6c967c646f3f98f455a3e2baf089"} err="failed to get container status \"4763085ccae1e2c32b3675d86358bd241fdc6c967c646f3f98f455a3e2baf089\": rpc error: code = NotFound desc = could not find container \"4763085ccae1e2c32b3675d86358bd241fdc6c967c646f3f98f455a3e2baf089\": container with ID starting with 4763085ccae1e2c32b3675d86358bd241fdc6c967c646f3f98f455a3e2baf089 not found: ID does not exist" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.065071 4779 scope.go:117] "RemoveContainer" containerID="bb13bf5662540c3961b2fa319989459c2db8658e2ce6842ec39a55c18acc62e7" Mar 20 15:46:47 crc kubenswrapper[4779]: E0320 15:46:47.066247 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb13bf5662540c3961b2fa319989459c2db8658e2ce6842ec39a55c18acc62e7\": container with ID starting with bb13bf5662540c3961b2fa319989459c2db8658e2ce6842ec39a55c18acc62e7 not found: ID does not exist" containerID="bb13bf5662540c3961b2fa319989459c2db8658e2ce6842ec39a55c18acc62e7" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.066342 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb13bf5662540c3961b2fa319989459c2db8658e2ce6842ec39a55c18acc62e7"} err="failed to 
get container status \"bb13bf5662540c3961b2fa319989459c2db8658e2ce6842ec39a55c18acc62e7\": rpc error: code = NotFound desc = could not find container \"bb13bf5662540c3961b2fa319989459c2db8658e2ce6842ec39a55c18acc62e7\": container with ID starting with bb13bf5662540c3961b2fa319989459c2db8658e2ce6842ec39a55c18acc62e7 not found: ID does not exist" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.075173 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 15:46:47 crc kubenswrapper[4779]: E0320 15:46:47.075664 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="148e12da-09c4-454f-ac34-76059964f559" containerName="glance-log" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.075688 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="148e12da-09c4-454f-ac34-76059964f559" containerName="glance-log" Mar 20 15:46:47 crc kubenswrapper[4779]: E0320 15:46:47.075732 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="205b1474-e402-4bd4-8ef7-4907c61f11bb" containerName="glance-log" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.075741 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="205b1474-e402-4bd4-8ef7-4907c61f11bb" containerName="glance-log" Mar 20 15:46:47 crc kubenswrapper[4779]: E0320 15:46:47.075759 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="148e12da-09c4-454f-ac34-76059964f559" containerName="glance-httpd" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.075768 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="148e12da-09c4-454f-ac34-76059964f559" containerName="glance-httpd" Mar 20 15:46:47 crc kubenswrapper[4779]: E0320 15:46:47.075783 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="205b1474-e402-4bd4-8ef7-4907c61f11bb" containerName="glance-httpd" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.075791 4779 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="205b1474-e402-4bd4-8ef7-4907c61f11bb" containerName="glance-httpd" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.076012 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="205b1474-e402-4bd4-8ef7-4907c61f11bb" containerName="glance-log" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.076046 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="148e12da-09c4-454f-ac34-76059964f559" containerName="glance-httpd" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.076067 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="148e12da-09c4-454f-ac34-76059964f559" containerName="glance-log" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.076079 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="205b1474-e402-4bd4-8ef7-4907c61f11bb" containerName="glance-httpd" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.077126 4779 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.077471 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.080012 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-pb8jv" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.080301 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.080415 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.080449 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.095174 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.115183 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.117288 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.119026 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.119521 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.126454 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.180071 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b801f31f-e923-4529-9d41-299903b3d167-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b801f31f-e923-4529-9d41-299903b3d167\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.180149 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b801f31f-e923-4529-9d41-299903b3d167-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b801f31f-e923-4529-9d41-299903b3d167\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.180172 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b801f31f-e923-4529-9d41-299903b3d167-logs\") pod \"glance-default-internal-api-0\" (UID: \"b801f31f-e923-4529-9d41-299903b3d167\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.180187 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fllvn\" (UniqueName: 
\"kubernetes.io/projected/b801f31f-e923-4529-9d41-299903b3d167-kube-api-access-fllvn\") pod \"glance-default-internal-api-0\" (UID: \"b801f31f-e923-4529-9d41-299903b3d167\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.180236 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b801f31f-e923-4529-9d41-299903b3d167-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b801f31f-e923-4529-9d41-299903b3d167\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.180336 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b801f31f-e923-4529-9d41-299903b3d167-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b801f31f-e923-4529-9d41-299903b3d167\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.180373 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"b801f31f-e923-4529-9d41-299903b3d167\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.180414 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b801f31f-e923-4529-9d41-299903b3d167-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b801f31f-e923-4529-9d41-299903b3d167\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.282345 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba2590d6-15fb-4232-9bed-dc6a44e56241-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ba2590d6-15fb-4232-9bed-dc6a44e56241\") " pod="openstack/glance-default-external-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.282404 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b801f31f-e923-4529-9d41-299903b3d167-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b801f31f-e923-4529-9d41-299903b3d167\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.282431 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b801f31f-e923-4529-9d41-299903b3d167-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b801f31f-e923-4529-9d41-299903b3d167\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.282612 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba2590d6-15fb-4232-9bed-dc6a44e56241-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ba2590d6-15fb-4232-9bed-dc6a44e56241\") " pod="openstack/glance-default-external-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.282663 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b801f31f-e923-4529-9d41-299903b3d167-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b801f31f-e923-4529-9d41-299903b3d167\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.282712 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b801f31f-e923-4529-9d41-299903b3d167-logs\") pod \"glance-default-internal-api-0\" (UID: \"b801f31f-e923-4529-9d41-299903b3d167\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.282779 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fllvn\" (UniqueName: \"kubernetes.io/projected/b801f31f-e923-4529-9d41-299903b3d167-kube-api-access-fllvn\") pod \"glance-default-internal-api-0\" (UID: \"b801f31f-e923-4529-9d41-299903b3d167\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.283207 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba2590d6-15fb-4232-9bed-dc6a44e56241-config-data\") pod \"glance-default-external-api-0\" (UID: \"ba2590d6-15fb-4232-9bed-dc6a44e56241\") " pod="openstack/glance-default-external-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.283268 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"ba2590d6-15fb-4232-9bed-dc6a44e56241\") " pod="openstack/glance-default-external-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.283213 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b801f31f-e923-4529-9d41-299903b3d167-logs\") pod \"glance-default-internal-api-0\" (UID: \"b801f31f-e923-4529-9d41-299903b3d167\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.283411 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/b801f31f-e923-4529-9d41-299903b3d167-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b801f31f-e923-4529-9d41-299903b3d167\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.283746 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b801f31f-e923-4529-9d41-299903b3d167-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b801f31f-e923-4529-9d41-299903b3d167\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.283957 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba2590d6-15fb-4232-9bed-dc6a44e56241-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ba2590d6-15fb-4232-9bed-dc6a44e56241\") " pod="openstack/glance-default-external-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.284032 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sdsx\" (UniqueName: \"kubernetes.io/projected/ba2590d6-15fb-4232-9bed-dc6a44e56241-kube-api-access-8sdsx\") pod \"glance-default-external-api-0\" (UID: \"ba2590d6-15fb-4232-9bed-dc6a44e56241\") " pod="openstack/glance-default-external-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.284154 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b801f31f-e923-4529-9d41-299903b3d167-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b801f31f-e923-4529-9d41-299903b3d167\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.284229 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ba2590d6-15fb-4232-9bed-dc6a44e56241-scripts\") pod \"glance-default-external-api-0\" (UID: \"ba2590d6-15fb-4232-9bed-dc6a44e56241\") " pod="openstack/glance-default-external-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.284289 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"b801f31f-e923-4529-9d41-299903b3d167\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.284327 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba2590d6-15fb-4232-9bed-dc6a44e56241-logs\") pod \"glance-default-external-api-0\" (UID: \"ba2590d6-15fb-4232-9bed-dc6a44e56241\") " pod="openstack/glance-default-external-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.284619 4779 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"b801f31f-e923-4529-9d41-299903b3d167\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.288224 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b801f31f-e923-4529-9d41-299903b3d167-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b801f31f-e923-4529-9d41-299903b3d167\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.290610 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b801f31f-e923-4529-9d41-299903b3d167-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"b801f31f-e923-4529-9d41-299903b3d167\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.298311 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b801f31f-e923-4529-9d41-299903b3d167-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b801f31f-e923-4529-9d41-299903b3d167\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.299400 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b801f31f-e923-4529-9d41-299903b3d167-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b801f31f-e923-4529-9d41-299903b3d167\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.313640 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fllvn\" (UniqueName: \"kubernetes.io/projected/b801f31f-e923-4529-9d41-299903b3d167-kube-api-access-fllvn\") pod \"glance-default-internal-api-0\" (UID: \"b801f31f-e923-4529-9d41-299903b3d167\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.321377 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"b801f31f-e923-4529-9d41-299903b3d167\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.386467 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba2590d6-15fb-4232-9bed-dc6a44e56241-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ba2590d6-15fb-4232-9bed-dc6a44e56241\") " 
pod="openstack/glance-default-external-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.386569 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba2590d6-15fb-4232-9bed-dc6a44e56241-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ba2590d6-15fb-4232-9bed-dc6a44e56241\") " pod="openstack/glance-default-external-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.387209 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba2590d6-15fb-4232-9bed-dc6a44e56241-config-data\") pod \"glance-default-external-api-0\" (UID: \"ba2590d6-15fb-4232-9bed-dc6a44e56241\") " pod="openstack/glance-default-external-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.387241 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"ba2590d6-15fb-4232-9bed-dc6a44e56241\") " pod="openstack/glance-default-external-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.387334 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba2590d6-15fb-4232-9bed-dc6a44e56241-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ba2590d6-15fb-4232-9bed-dc6a44e56241\") " pod="openstack/glance-default-external-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.387365 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sdsx\" (UniqueName: \"kubernetes.io/projected/ba2590d6-15fb-4232-9bed-dc6a44e56241-kube-api-access-8sdsx\") pod \"glance-default-external-api-0\" (UID: \"ba2590d6-15fb-4232-9bed-dc6a44e56241\") " pod="openstack/glance-default-external-api-0" Mar 20 15:46:47 crc 
kubenswrapper[4779]: I0320 15:46:47.387405 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba2590d6-15fb-4232-9bed-dc6a44e56241-scripts\") pod \"glance-default-external-api-0\" (UID: \"ba2590d6-15fb-4232-9bed-dc6a44e56241\") " pod="openstack/glance-default-external-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.387435 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba2590d6-15fb-4232-9bed-dc6a44e56241-logs\") pod \"glance-default-external-api-0\" (UID: \"ba2590d6-15fb-4232-9bed-dc6a44e56241\") " pod="openstack/glance-default-external-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.387782 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba2590d6-15fb-4232-9bed-dc6a44e56241-logs\") pod \"glance-default-external-api-0\" (UID: \"ba2590d6-15fb-4232-9bed-dc6a44e56241\") " pod="openstack/glance-default-external-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.388147 4779 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"ba2590d6-15fb-4232-9bed-dc6a44e56241\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.388597 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba2590d6-15fb-4232-9bed-dc6a44e56241-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ba2590d6-15fb-4232-9bed-dc6a44e56241\") " pod="openstack/glance-default-external-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.396215 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba2590d6-15fb-4232-9bed-dc6a44e56241-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ba2590d6-15fb-4232-9bed-dc6a44e56241\") " pod="openstack/glance-default-external-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.400210 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba2590d6-15fb-4232-9bed-dc6a44e56241-config-data\") pod \"glance-default-external-api-0\" (UID: \"ba2590d6-15fb-4232-9bed-dc6a44e56241\") " pod="openstack/glance-default-external-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.405499 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sdsx\" (UniqueName: \"kubernetes.io/projected/ba2590d6-15fb-4232-9bed-dc6a44e56241-kube-api-access-8sdsx\") pod \"glance-default-external-api-0\" (UID: \"ba2590d6-15fb-4232-9bed-dc6a44e56241\") " pod="openstack/glance-default-external-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.415137 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba2590d6-15fb-4232-9bed-dc6a44e56241-scripts\") pod \"glance-default-external-api-0\" (UID: \"ba2590d6-15fb-4232-9bed-dc6a44e56241\") " pod="openstack/glance-default-external-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.435002 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba2590d6-15fb-4232-9bed-dc6a44e56241-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ba2590d6-15fb-4232-9bed-dc6a44e56241\") " pod="openstack/glance-default-external-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.474027 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.477685 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"ba2590d6-15fb-4232-9bed-dc6a44e56241\") " pod="openstack/glance-default-external-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.485567 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.828836 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="148e12da-09c4-454f-ac34-76059964f559" path="/var/lib/kubelet/pods/148e12da-09c4-454f-ac34-76059964f559/volumes" Mar 20 15:46:47 crc kubenswrapper[4779]: I0320 15:46:47.840063 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="205b1474-e402-4bd4-8ef7-4907c61f11bb" path="/var/lib/kubelet/pods/205b1474-e402-4bd4-8ef7-4907c61f11bb/volumes" Mar 20 15:46:48 crc kubenswrapper[4779]: I0320 15:46:48.010623 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d45d95dc-5b6b-4724-ae68-094d9e6b54a4","Type":"ContainerStarted","Data":"d5e6891dcc4fab93df5f8c11b11c0cdacc111b3efa6f0e85eacab93413919bef"} Mar 20 15:46:48 crc kubenswrapper[4779]: I0320 15:46:48.295573 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 15:46:48 crc kubenswrapper[4779]: I0320 15:46:48.526769 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:46:49 crc kubenswrapper[4779]: I0320 15:46:49.029015 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"b801f31f-e923-4529-9d41-299903b3d167","Type":"ContainerStarted","Data":"5f61e7db09ccfa23b8d0dd7ff905994bd95b73eff41572bccc3213d0949fcf47"} Mar 20 15:46:49 crc kubenswrapper[4779]: I0320 15:46:49.037993 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d45d95dc-5b6b-4724-ae68-094d9e6b54a4","Type":"ContainerStarted","Data":"4f8ea9c917fcf714b98aefc6521dd8daccb2d5608e0e5ae32093730a7bfe7a0d"} Mar 20 15:46:49 crc kubenswrapper[4779]: I0320 15:46:49.040689 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 15:46:49 crc kubenswrapper[4779]: W0320 15:46:49.044269 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba2590d6_15fb_4232_9bed_dc6a44e56241.slice/crio-f0151d212269491829f0c089804e8b1dfd4e82a67502e43c9ff3c060d0486c11 WatchSource:0}: Error finding container f0151d212269491829f0c089804e8b1dfd4e82a67502e43c9ff3c060d0486c11: Status 404 returned error can't find the container with id f0151d212269491829f0c089804e8b1dfd4e82a67502e43c9ff3c060d0486c11 Mar 20 15:46:50 crc kubenswrapper[4779]: I0320 15:46:50.054512 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b801f31f-e923-4529-9d41-299903b3d167","Type":"ContainerStarted","Data":"9677dd02519f7a65ac292553187569b694600d404fd28a556cb02f905f33efbc"} Mar 20 15:46:50 crc kubenswrapper[4779]: I0320 15:46:50.054849 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b801f31f-e923-4529-9d41-299903b3d167","Type":"ContainerStarted","Data":"b94e7086ab66dcad7f0c55a6d812f73e27ff59fadb010cca981a6fee78aa700c"} Mar 20 15:46:50 crc kubenswrapper[4779]: I0320 15:46:50.059937 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"ba2590d6-15fb-4232-9bed-dc6a44e56241","Type":"ContainerStarted","Data":"0b7721c57f10cf974b4670c25ac5812f709fa99090441304e65e7ec7819a502a"} Mar 20 15:46:50 crc kubenswrapper[4779]: I0320 15:46:50.059975 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ba2590d6-15fb-4232-9bed-dc6a44e56241","Type":"ContainerStarted","Data":"f0151d212269491829f0c089804e8b1dfd4e82a67502e43c9ff3c060d0486c11"} Mar 20 15:46:50 crc kubenswrapper[4779]: I0320 15:46:50.541829 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Mar 20 15:46:50 crc kubenswrapper[4779]: I0320 15:46:50.575865 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Mar 20 15:46:50 crc kubenswrapper[4779]: I0320 15:46:50.602994 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.602973279 podStartE2EDuration="4.602973279s" podCreationTimestamp="2026-03-20 15:46:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:46:50.084546482 +0000 UTC m=+1427.047062292" watchObservedRunningTime="2026-03-20 15:46:50.602973279 +0000 UTC m=+1427.565489079" Mar 20 15:46:51 crc kubenswrapper[4779]: I0320 15:46:51.105817 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ba2590d6-15fb-4232-9bed-dc6a44e56241","Type":"ContainerStarted","Data":"b5ef8e002121c2369cace37ebfaa915f67c9088ae24e6196357d17c3ad021415"} Mar 20 15:46:51 crc kubenswrapper[4779]: I0320 15:46:51.106552 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Mar 20 15:46:51 crc kubenswrapper[4779]: I0320 15:46:51.139120 4779 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.139091843 podStartE2EDuration="4.139091843s" podCreationTimestamp="2026-03-20 15:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:46:51.129399959 +0000 UTC m=+1428.091915759" watchObservedRunningTime="2026-03-20 15:46:51.139091843 +0000 UTC m=+1428.101607643" Mar 20 15:46:51 crc kubenswrapper[4779]: I0320 15:46:51.163553 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Mar 20 15:46:52 crc kubenswrapper[4779]: I0320 15:46:52.118078 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d45d95dc-5b6b-4724-ae68-094d9e6b54a4","Type":"ContainerStarted","Data":"84ece330cbaa48123693e979c1545064e124cc78d8f28d79690d35cc64f1c195"} Mar 20 15:46:52 crc kubenswrapper[4779]: I0320 15:46:52.118790 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 15:46:52 crc kubenswrapper[4779]: I0320 15:46:52.118294 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d45d95dc-5b6b-4724-ae68-094d9e6b54a4" containerName="ceilometer-central-agent" containerID="cri-o://bab0353f2770d4c7469bf2f4cbbedf41fc29e56119697d544cd980eee322e924" gracePeriod=30 Mar 20 15:46:52 crc kubenswrapper[4779]: I0320 15:46:52.119011 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d45d95dc-5b6b-4724-ae68-094d9e6b54a4" containerName="ceilometer-notification-agent" containerID="cri-o://d5e6891dcc4fab93df5f8c11b11c0cdacc111b3efa6f0e85eacab93413919bef" gracePeriod=30 Mar 20 15:46:52 crc kubenswrapper[4779]: I0320 15:46:52.119074 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="d45d95dc-5b6b-4724-ae68-094d9e6b54a4" containerName="sg-core" containerID="cri-o://4f8ea9c917fcf714b98aefc6521dd8daccb2d5608e0e5ae32093730a7bfe7a0d" gracePeriod=30 Mar 20 15:46:52 crc kubenswrapper[4779]: I0320 15:46:52.119733 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d45d95dc-5b6b-4724-ae68-094d9e6b54a4" containerName="proxy-httpd" containerID="cri-o://84ece330cbaa48123693e979c1545064e124cc78d8f28d79690d35cc64f1c195" gracePeriod=30 Mar 20 15:46:52 crc kubenswrapper[4779]: I0320 15:46:52.154229 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.979455684 podStartE2EDuration="8.154212122s" podCreationTimestamp="2026-03-20 15:46:44 +0000 UTC" firstStartedPulling="2026-03-20 15:46:45.831612268 +0000 UTC m=+1422.794128068" lastFinishedPulling="2026-03-20 15:46:51.006368706 +0000 UTC m=+1427.968884506" observedRunningTime="2026-03-20 15:46:52.147156841 +0000 UTC m=+1429.109672651" watchObservedRunningTime="2026-03-20 15:46:52.154212122 +0000 UTC m=+1429.116727922" Mar 20 15:46:53 crc kubenswrapper[4779]: I0320 15:46:53.150512 4779 generic.go:334] "Generic (PLEG): container finished" podID="d45d95dc-5b6b-4724-ae68-094d9e6b54a4" containerID="84ece330cbaa48123693e979c1545064e124cc78d8f28d79690d35cc64f1c195" exitCode=0 Mar 20 15:46:53 crc kubenswrapper[4779]: I0320 15:46:53.150843 4779 generic.go:334] "Generic (PLEG): container finished" podID="d45d95dc-5b6b-4724-ae68-094d9e6b54a4" containerID="4f8ea9c917fcf714b98aefc6521dd8daccb2d5608e0e5ae32093730a7bfe7a0d" exitCode=2 Mar 20 15:46:53 crc kubenswrapper[4779]: I0320 15:46:53.150854 4779 generic.go:334] "Generic (PLEG): container finished" podID="d45d95dc-5b6b-4724-ae68-094d9e6b54a4" containerID="d5e6891dcc4fab93df5f8c11b11c0cdacc111b3efa6f0e85eacab93413919bef" exitCode=0 Mar 20 15:46:53 crc kubenswrapper[4779]: I0320 15:46:53.150747 4779 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"d45d95dc-5b6b-4724-ae68-094d9e6b54a4","Type":"ContainerDied","Data":"84ece330cbaa48123693e979c1545064e124cc78d8f28d79690d35cc64f1c195"} Mar 20 15:46:53 crc kubenswrapper[4779]: I0320 15:46:53.150923 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d45d95dc-5b6b-4724-ae68-094d9e6b54a4","Type":"ContainerDied","Data":"4f8ea9c917fcf714b98aefc6521dd8daccb2d5608e0e5ae32093730a7bfe7a0d"} Mar 20 15:46:53 crc kubenswrapper[4779]: I0320 15:46:53.150940 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d45d95dc-5b6b-4724-ae68-094d9e6b54a4","Type":"ContainerDied","Data":"d5e6891dcc4fab93df5f8c11b11c0cdacc111b3efa6f0e85eacab93413919bef"} Mar 20 15:46:55 crc kubenswrapper[4779]: I0320 15:46:55.150228 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:46:55 crc kubenswrapper[4779]: I0320 15:46:55.150289 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:46:56 crc kubenswrapper[4779]: I0320 15:46:56.182846 4779 generic.go:334] "Generic (PLEG): container finished" podID="d45d95dc-5b6b-4724-ae68-094d9e6b54a4" containerID="bab0353f2770d4c7469bf2f4cbbedf41fc29e56119697d544cd980eee322e924" exitCode=0 Mar 20 15:46:56 crc kubenswrapper[4779]: I0320 15:46:56.182899 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d45d95dc-5b6b-4724-ae68-094d9e6b54a4","Type":"ContainerDied","Data":"bab0353f2770d4c7469bf2f4cbbedf41fc29e56119697d544cd980eee322e924"} Mar 20 15:46:56 crc kubenswrapper[4779]: I0320 15:46:56.661975 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:46:56 crc kubenswrapper[4779]: I0320 15:46:56.682849 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-config-data\") pod \"d45d95dc-5b6b-4724-ae68-094d9e6b54a4\" (UID: \"d45d95dc-5b6b-4724-ae68-094d9e6b54a4\") " Mar 20 15:46:56 crc kubenswrapper[4779]: I0320 15:46:56.682915 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kks9\" (UniqueName: \"kubernetes.io/projected/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-kube-api-access-7kks9\") pod \"d45d95dc-5b6b-4724-ae68-094d9e6b54a4\" (UID: \"d45d95dc-5b6b-4724-ae68-094d9e6b54a4\") " Mar 20 15:46:56 crc kubenswrapper[4779]: I0320 15:46:56.682958 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-combined-ca-bundle\") pod \"d45d95dc-5b6b-4724-ae68-094d9e6b54a4\" (UID: \"d45d95dc-5b6b-4724-ae68-094d9e6b54a4\") " Mar 20 15:46:56 crc kubenswrapper[4779]: I0320 15:46:56.683009 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-scripts\") pod \"d45d95dc-5b6b-4724-ae68-094d9e6b54a4\" (UID: \"d45d95dc-5b6b-4724-ae68-094d9e6b54a4\") " Mar 20 15:46:56 crc kubenswrapper[4779]: I0320 15:46:56.683081 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-sg-core-conf-yaml\") pod 
\"d45d95dc-5b6b-4724-ae68-094d9e6b54a4\" (UID: \"d45d95dc-5b6b-4724-ae68-094d9e6b54a4\") " Mar 20 15:46:56 crc kubenswrapper[4779]: I0320 15:46:56.683177 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-run-httpd\") pod \"d45d95dc-5b6b-4724-ae68-094d9e6b54a4\" (UID: \"d45d95dc-5b6b-4724-ae68-094d9e6b54a4\") " Mar 20 15:46:56 crc kubenswrapper[4779]: I0320 15:46:56.683228 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-log-httpd\") pod \"d45d95dc-5b6b-4724-ae68-094d9e6b54a4\" (UID: \"d45d95dc-5b6b-4724-ae68-094d9e6b54a4\") " Mar 20 15:46:56 crc kubenswrapper[4779]: I0320 15:46:56.684449 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d45d95dc-5b6b-4724-ae68-094d9e6b54a4" (UID: "d45d95dc-5b6b-4724-ae68-094d9e6b54a4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:46:56 crc kubenswrapper[4779]: I0320 15:46:56.685574 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d45d95dc-5b6b-4724-ae68-094d9e6b54a4" (UID: "d45d95dc-5b6b-4724-ae68-094d9e6b54a4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:46:56 crc kubenswrapper[4779]: I0320 15:46:56.693190 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-scripts" (OuterVolumeSpecName: "scripts") pod "d45d95dc-5b6b-4724-ae68-094d9e6b54a4" (UID: "d45d95dc-5b6b-4724-ae68-094d9e6b54a4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:56 crc kubenswrapper[4779]: I0320 15:46:56.695053 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-kube-api-access-7kks9" (OuterVolumeSpecName: "kube-api-access-7kks9") pod "d45d95dc-5b6b-4724-ae68-094d9e6b54a4" (UID: "d45d95dc-5b6b-4724-ae68-094d9e6b54a4"). InnerVolumeSpecName "kube-api-access-7kks9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:46:56 crc kubenswrapper[4779]: I0320 15:46:56.722981 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d45d95dc-5b6b-4724-ae68-094d9e6b54a4" (UID: "d45d95dc-5b6b-4724-ae68-094d9e6b54a4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:56 crc kubenswrapper[4779]: I0320 15:46:56.773279 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d45d95dc-5b6b-4724-ae68-094d9e6b54a4" (UID: "d45d95dc-5b6b-4724-ae68-094d9e6b54a4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:56 crc kubenswrapper[4779]: I0320 15:46:56.784691 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kks9\" (UniqueName: \"kubernetes.io/projected/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-kube-api-access-7kks9\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:56 crc kubenswrapper[4779]: I0320 15:46:56.784719 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:56 crc kubenswrapper[4779]: I0320 15:46:56.784728 4779 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:56 crc kubenswrapper[4779]: I0320 15:46:56.784736 4779 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:56 crc kubenswrapper[4779]: I0320 15:46:56.784745 4779 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:56 crc kubenswrapper[4779]: I0320 15:46:56.784754 4779 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:56 crc kubenswrapper[4779]: I0320 15:46:56.801289 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-config-data" (OuterVolumeSpecName: "config-data") pod "d45d95dc-5b6b-4724-ae68-094d9e6b54a4" (UID: "d45d95dc-5b6b-4724-ae68-094d9e6b54a4"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:46:56 crc kubenswrapper[4779]: I0320 15:46:56.886081 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d45d95dc-5b6b-4724-ae68-094d9e6b54a4-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.195318 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5gc7l" event={"ID":"003b9556-88db-4e20-8371-3863a2dd44b8","Type":"ContainerStarted","Data":"6d5eb0a2ade93dca914e2a3209548039f10656ad01e8f3b622f6f1d36865e42c"} Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.198620 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d45d95dc-5b6b-4724-ae68-094d9e6b54a4","Type":"ContainerDied","Data":"eaec91f0d38aaa69a3762165b8676a2ed84c7bedca5845f024864eb2bcb062f1"} Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.198665 4779 scope.go:117] "RemoveContainer" containerID="84ece330cbaa48123693e979c1545064e124cc78d8f28d79690d35cc64f1c195" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.198689 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.224781 4779 scope.go:117] "RemoveContainer" containerID="4f8ea9c917fcf714b98aefc6521dd8daccb2d5608e0e5ae32093730a7bfe7a0d" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.229321 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-5gc7l" podStartSLOduration=2.411579873 podStartE2EDuration="12.229299813s" podCreationTimestamp="2026-03-20 15:46:45 +0000 UTC" firstStartedPulling="2026-03-20 15:46:46.634925327 +0000 UTC m=+1423.597441127" lastFinishedPulling="2026-03-20 15:46:56.452645277 +0000 UTC m=+1433.415161067" observedRunningTime="2026-03-20 15:46:57.207501026 +0000 UTC m=+1434.170016826" watchObservedRunningTime="2026-03-20 15:46:57.229299813 +0000 UTC m=+1434.191815613" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.240337 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.251197 4779 scope.go:117] "RemoveContainer" containerID="d5e6891dcc4fab93df5f8c11b11c0cdacc111b3efa6f0e85eacab93413919bef" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.265874 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.276691 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:46:57 crc kubenswrapper[4779]: E0320 15:46:57.277031 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d45d95dc-5b6b-4724-ae68-094d9e6b54a4" containerName="ceilometer-central-agent" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.277047 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="d45d95dc-5b6b-4724-ae68-094d9e6b54a4" containerName="ceilometer-central-agent" Mar 20 15:46:57 crc kubenswrapper[4779]: E0320 15:46:57.277081 4779 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="d45d95dc-5b6b-4724-ae68-094d9e6b54a4" containerName="ceilometer-notification-agent" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.277088 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="d45d95dc-5b6b-4724-ae68-094d9e6b54a4" containerName="ceilometer-notification-agent" Mar 20 15:46:57 crc kubenswrapper[4779]: E0320 15:46:57.277097 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d45d95dc-5b6b-4724-ae68-094d9e6b54a4" containerName="proxy-httpd" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.277103 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="d45d95dc-5b6b-4724-ae68-094d9e6b54a4" containerName="proxy-httpd" Mar 20 15:46:57 crc kubenswrapper[4779]: E0320 15:46:57.277130 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d45d95dc-5b6b-4724-ae68-094d9e6b54a4" containerName="sg-core" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.277136 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="d45d95dc-5b6b-4724-ae68-094d9e6b54a4" containerName="sg-core" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.277313 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="d45d95dc-5b6b-4724-ae68-094d9e6b54a4" containerName="sg-core" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.277324 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="d45d95dc-5b6b-4724-ae68-094d9e6b54a4" containerName="proxy-httpd" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.277335 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="d45d95dc-5b6b-4724-ae68-094d9e6b54a4" containerName="ceilometer-central-agent" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.277358 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="d45d95dc-5b6b-4724-ae68-094d9e6b54a4" containerName="ceilometer-notification-agent" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.278947 4779 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.281494 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.281691 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.292211 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.312193 4779 scope.go:117] "RemoveContainer" containerID="bab0353f2770d4c7469bf2f4cbbedf41fc29e56119697d544cd980eee322e924" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.400782 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64980a81-ec28-4991-924c-b47e95d3fd2e-run-httpd\") pod \"ceilometer-0\" (UID: \"64980a81-ec28-4991-924c-b47e95d3fd2e\") " pod="openstack/ceilometer-0" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.400831 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64980a81-ec28-4991-924c-b47e95d3fd2e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"64980a81-ec28-4991-924c-b47e95d3fd2e\") " pod="openstack/ceilometer-0" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.400925 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64980a81-ec28-4991-924c-b47e95d3fd2e-log-httpd\") pod \"ceilometer-0\" (UID: \"64980a81-ec28-4991-924c-b47e95d3fd2e\") " pod="openstack/ceilometer-0" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.400942 4779 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm2ph\" (UniqueName: \"kubernetes.io/projected/64980a81-ec28-4991-924c-b47e95d3fd2e-kube-api-access-sm2ph\") pod \"ceilometer-0\" (UID: \"64980a81-ec28-4991-924c-b47e95d3fd2e\") " pod="openstack/ceilometer-0" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.401370 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64980a81-ec28-4991-924c-b47e95d3fd2e-config-data\") pod \"ceilometer-0\" (UID: \"64980a81-ec28-4991-924c-b47e95d3fd2e\") " pod="openstack/ceilometer-0" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.401418 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64980a81-ec28-4991-924c-b47e95d3fd2e-scripts\") pod \"ceilometer-0\" (UID: \"64980a81-ec28-4991-924c-b47e95d3fd2e\") " pod="openstack/ceilometer-0" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.401576 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/64980a81-ec28-4991-924c-b47e95d3fd2e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"64980a81-ec28-4991-924c-b47e95d3fd2e\") " pod="openstack/ceilometer-0" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.475478 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.475529 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.487035 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 15:46:57 crc kubenswrapper[4779]: 
I0320 15:46:57.487073 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.503465 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64980a81-ec28-4991-924c-b47e95d3fd2e-log-httpd\") pod \"ceilometer-0\" (UID: \"64980a81-ec28-4991-924c-b47e95d3fd2e\") " pod="openstack/ceilometer-0" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.503504 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm2ph\" (UniqueName: \"kubernetes.io/projected/64980a81-ec28-4991-924c-b47e95d3fd2e-kube-api-access-sm2ph\") pod \"ceilometer-0\" (UID: \"64980a81-ec28-4991-924c-b47e95d3fd2e\") " pod="openstack/ceilometer-0" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.503538 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64980a81-ec28-4991-924c-b47e95d3fd2e-config-data\") pod \"ceilometer-0\" (UID: \"64980a81-ec28-4991-924c-b47e95d3fd2e\") " pod="openstack/ceilometer-0" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.503568 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64980a81-ec28-4991-924c-b47e95d3fd2e-scripts\") pod \"ceilometer-0\" (UID: \"64980a81-ec28-4991-924c-b47e95d3fd2e\") " pod="openstack/ceilometer-0" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.503599 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/64980a81-ec28-4991-924c-b47e95d3fd2e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"64980a81-ec28-4991-924c-b47e95d3fd2e\") " pod="openstack/ceilometer-0" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.503691 4779 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64980a81-ec28-4991-924c-b47e95d3fd2e-run-httpd\") pod \"ceilometer-0\" (UID: \"64980a81-ec28-4991-924c-b47e95d3fd2e\") " pod="openstack/ceilometer-0" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.503721 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64980a81-ec28-4991-924c-b47e95d3fd2e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"64980a81-ec28-4991-924c-b47e95d3fd2e\") " pod="openstack/ceilometer-0" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.505856 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64980a81-ec28-4991-924c-b47e95d3fd2e-log-httpd\") pod \"ceilometer-0\" (UID: \"64980a81-ec28-4991-924c-b47e95d3fd2e\") " pod="openstack/ceilometer-0" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.506360 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64980a81-ec28-4991-924c-b47e95d3fd2e-run-httpd\") pod \"ceilometer-0\" (UID: \"64980a81-ec28-4991-924c-b47e95d3fd2e\") " pod="openstack/ceilometer-0" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.510203 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.511839 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64980a81-ec28-4991-924c-b47e95d3fd2e-scripts\") pod \"ceilometer-0\" (UID: \"64980a81-ec28-4991-924c-b47e95d3fd2e\") " pod="openstack/ceilometer-0" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.512407 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/64980a81-ec28-4991-924c-b47e95d3fd2e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"64980a81-ec28-4991-924c-b47e95d3fd2e\") " pod="openstack/ceilometer-0" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.518107 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/64980a81-ec28-4991-924c-b47e95d3fd2e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"64980a81-ec28-4991-924c-b47e95d3fd2e\") " pod="openstack/ceilometer-0" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.523605 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm2ph\" (UniqueName: \"kubernetes.io/projected/64980a81-ec28-4991-924c-b47e95d3fd2e-kube-api-access-sm2ph\") pod \"ceilometer-0\" (UID: \"64980a81-ec28-4991-924c-b47e95d3fd2e\") " pod="openstack/ceilometer-0" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.523882 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64980a81-ec28-4991-924c-b47e95d3fd2e-config-data\") pod \"ceilometer-0\" (UID: \"64980a81-ec28-4991-924c-b47e95d3fd2e\") " pod="openstack/ceilometer-0" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.531684 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.543141 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.543263 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.595508 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:46:57 crc kubenswrapper[4779]: I0320 15:46:57.817998 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d45d95dc-5b6b-4724-ae68-094d9e6b54a4" path="/var/lib/kubelet/pods/d45d95dc-5b6b-4724-ae68-094d9e6b54a4/volumes" Mar 20 15:46:58 crc kubenswrapper[4779]: I0320 15:46:58.114535 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:46:58 crc kubenswrapper[4779]: I0320 15:46:58.208970 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64980a81-ec28-4991-924c-b47e95d3fd2e","Type":"ContainerStarted","Data":"468935705afaa24fbce74ec2e86c1a483312263fb03fe7917706b64b210b9326"} Mar 20 15:46:58 crc kubenswrapper[4779]: I0320 15:46:58.209228 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 15:46:58 crc kubenswrapper[4779]: I0320 15:46:58.209256 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 15:46:58 crc kubenswrapper[4779]: I0320 15:46:58.209270 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 15:46:58 crc kubenswrapper[4779]: I0320 15:46:58.209282 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 15:46:59 crc kubenswrapper[4779]: I0320 15:46:59.218974 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64980a81-ec28-4991-924c-b47e95d3fd2e","Type":"ContainerStarted","Data":"e29cb9ade5d141418ca4cf96c71b910db359421b7e5f16c983fc8a236df516db"} Mar 20 15:47:00 crc kubenswrapper[4779]: I0320 15:47:00.156031 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 15:47:00 crc kubenswrapper[4779]: I0320 
15:47:00.239507 4779 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 15:47:00 crc kubenswrapper[4779]: I0320 15:47:00.239488 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64980a81-ec28-4991-924c-b47e95d3fd2e","Type":"ContainerStarted","Data":"00b74be011bcf522ab460330ede2e52938a7476879db59b96a2d43702c76fbe0"} Mar 20 15:47:00 crc kubenswrapper[4779]: I0320 15:47:00.331401 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 15:47:00 crc kubenswrapper[4779]: I0320 15:47:00.331484 4779 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 15:47:00 crc kubenswrapper[4779]: I0320 15:47:00.334813 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 15:47:00 crc kubenswrapper[4779]: I0320 15:47:00.384332 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 15:47:01 crc kubenswrapper[4779]: I0320 15:47:01.259391 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64980a81-ec28-4991-924c-b47e95d3fd2e","Type":"ContainerStarted","Data":"8692e4e9961696eda85185d923fc97ed86ec7a5f13c5a8c7f13c503b1123ae33"} Mar 20 15:47:01 crc kubenswrapper[4779]: I0320 15:47:01.576411 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:47:02 crc kubenswrapper[4779]: I0320 15:47:02.271158 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64980a81-ec28-4991-924c-b47e95d3fd2e","Type":"ContainerStarted","Data":"51386384fcec7d0366ed3b260c41104c17bf2c7e95563503d509ec6be330d132"} Mar 20 15:47:02 crc kubenswrapper[4779]: I0320 15:47:02.271474 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="64980a81-ec28-4991-924c-b47e95d3fd2e" containerName="ceilometer-central-agent" containerID="cri-o://e29cb9ade5d141418ca4cf96c71b910db359421b7e5f16c983fc8a236df516db" gracePeriod=30 Mar 20 15:47:02 crc kubenswrapper[4779]: I0320 15:47:02.271505 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 15:47:02 crc kubenswrapper[4779]: I0320 15:47:02.271636 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="64980a81-ec28-4991-924c-b47e95d3fd2e" containerName="proxy-httpd" containerID="cri-o://51386384fcec7d0366ed3b260c41104c17bf2c7e95563503d509ec6be330d132" gracePeriod=30 Mar 20 15:47:02 crc kubenswrapper[4779]: I0320 15:47:02.271693 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="64980a81-ec28-4991-924c-b47e95d3fd2e" containerName="sg-core" containerID="cri-o://8692e4e9961696eda85185d923fc97ed86ec7a5f13c5a8c7f13c503b1123ae33" gracePeriod=30 Mar 20 15:47:02 crc kubenswrapper[4779]: I0320 15:47:02.271739 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="64980a81-ec28-4991-924c-b47e95d3fd2e" containerName="ceilometer-notification-agent" containerID="cri-o://00b74be011bcf522ab460330ede2e52938a7476879db59b96a2d43702c76fbe0" gracePeriod=30 Mar 20 15:47:02 crc kubenswrapper[4779]: I0320 15:47:02.305620 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.498744739 podStartE2EDuration="5.305597864s" podCreationTimestamp="2026-03-20 15:46:57 +0000 UTC" firstStartedPulling="2026-03-20 15:46:58.116380219 +0000 UTC m=+1435.078896019" lastFinishedPulling="2026-03-20 15:47:01.923233344 +0000 UTC m=+1438.885749144" observedRunningTime="2026-03-20 15:47:02.29757793 +0000 UTC m=+1439.260093730" watchObservedRunningTime="2026-03-20 15:47:02.305597864 +0000 UTC 
m=+1439.268113664" Mar 20 15:47:03 crc kubenswrapper[4779]: I0320 15:47:03.282917 4779 generic.go:334] "Generic (PLEG): container finished" podID="64980a81-ec28-4991-924c-b47e95d3fd2e" containerID="8692e4e9961696eda85185d923fc97ed86ec7a5f13c5a8c7f13c503b1123ae33" exitCode=2 Mar 20 15:47:03 crc kubenswrapper[4779]: I0320 15:47:03.282947 4779 generic.go:334] "Generic (PLEG): container finished" podID="64980a81-ec28-4991-924c-b47e95d3fd2e" containerID="00b74be011bcf522ab460330ede2e52938a7476879db59b96a2d43702c76fbe0" exitCode=0 Mar 20 15:47:03 crc kubenswrapper[4779]: I0320 15:47:03.283643 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64980a81-ec28-4991-924c-b47e95d3fd2e","Type":"ContainerDied","Data":"8692e4e9961696eda85185d923fc97ed86ec7a5f13c5a8c7f13c503b1123ae33"} Mar 20 15:47:03 crc kubenswrapper[4779]: I0320 15:47:03.283747 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64980a81-ec28-4991-924c-b47e95d3fd2e","Type":"ContainerDied","Data":"00b74be011bcf522ab460330ede2e52938a7476879db59b96a2d43702c76fbe0"} Mar 20 15:47:06 crc kubenswrapper[4779]: I0320 15:47:06.341659 4779 generic.go:334] "Generic (PLEG): container finished" podID="64980a81-ec28-4991-924c-b47e95d3fd2e" containerID="e29cb9ade5d141418ca4cf96c71b910db359421b7e5f16c983fc8a236df516db" exitCode=0 Mar 20 15:47:06 crc kubenswrapper[4779]: I0320 15:47:06.341752 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64980a81-ec28-4991-924c-b47e95d3fd2e","Type":"ContainerDied","Data":"e29cb9ade5d141418ca4cf96c71b910db359421b7e5f16c983fc8a236df516db"} Mar 20 15:47:07 crc kubenswrapper[4779]: I0320 15:47:07.352404 4779 generic.go:334] "Generic (PLEG): container finished" podID="003b9556-88db-4e20-8371-3863a2dd44b8" containerID="6d5eb0a2ade93dca914e2a3209548039f10656ad01e8f3b622f6f1d36865e42c" exitCode=0 Mar 20 15:47:07 crc kubenswrapper[4779]: I0320 15:47:07.352491 4779 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5gc7l" event={"ID":"003b9556-88db-4e20-8371-3863a2dd44b8","Type":"ContainerDied","Data":"6d5eb0a2ade93dca914e2a3209548039f10656ad01e8f3b622f6f1d36865e42c"} Mar 20 15:47:08 crc kubenswrapper[4779]: I0320 15:47:08.745376 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5gc7l" Mar 20 15:47:08 crc kubenswrapper[4779]: I0320 15:47:08.820366 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxwtt\" (UniqueName: \"kubernetes.io/projected/003b9556-88db-4e20-8371-3863a2dd44b8-kube-api-access-nxwtt\") pod \"003b9556-88db-4e20-8371-3863a2dd44b8\" (UID: \"003b9556-88db-4e20-8371-3863a2dd44b8\") " Mar 20 15:47:08 crc kubenswrapper[4779]: I0320 15:47:08.820527 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/003b9556-88db-4e20-8371-3863a2dd44b8-config-data\") pod \"003b9556-88db-4e20-8371-3863a2dd44b8\" (UID: \"003b9556-88db-4e20-8371-3863a2dd44b8\") " Mar 20 15:47:08 crc kubenswrapper[4779]: I0320 15:47:08.820557 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/003b9556-88db-4e20-8371-3863a2dd44b8-combined-ca-bundle\") pod \"003b9556-88db-4e20-8371-3863a2dd44b8\" (UID: \"003b9556-88db-4e20-8371-3863a2dd44b8\") " Mar 20 15:47:08 crc kubenswrapper[4779]: I0320 15:47:08.820629 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/003b9556-88db-4e20-8371-3863a2dd44b8-scripts\") pod \"003b9556-88db-4e20-8371-3863a2dd44b8\" (UID: \"003b9556-88db-4e20-8371-3863a2dd44b8\") " Mar 20 15:47:08 crc kubenswrapper[4779]: I0320 15:47:08.826184 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/003b9556-88db-4e20-8371-3863a2dd44b8-scripts" (OuterVolumeSpecName: "scripts") pod "003b9556-88db-4e20-8371-3863a2dd44b8" (UID: "003b9556-88db-4e20-8371-3863a2dd44b8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:47:08 crc kubenswrapper[4779]: I0320 15:47:08.831171 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/003b9556-88db-4e20-8371-3863a2dd44b8-kube-api-access-nxwtt" (OuterVolumeSpecName: "kube-api-access-nxwtt") pod "003b9556-88db-4e20-8371-3863a2dd44b8" (UID: "003b9556-88db-4e20-8371-3863a2dd44b8"). InnerVolumeSpecName "kube-api-access-nxwtt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:47:08 crc kubenswrapper[4779]: I0320 15:47:08.847977 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/003b9556-88db-4e20-8371-3863a2dd44b8-config-data" (OuterVolumeSpecName: "config-data") pod "003b9556-88db-4e20-8371-3863a2dd44b8" (UID: "003b9556-88db-4e20-8371-3863a2dd44b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:47:08 crc kubenswrapper[4779]: I0320 15:47:08.849085 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/003b9556-88db-4e20-8371-3863a2dd44b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "003b9556-88db-4e20-8371-3863a2dd44b8" (UID: "003b9556-88db-4e20-8371-3863a2dd44b8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:47:08 crc kubenswrapper[4779]: I0320 15:47:08.923330 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxwtt\" (UniqueName: \"kubernetes.io/projected/003b9556-88db-4e20-8371-3863a2dd44b8-kube-api-access-nxwtt\") on node \"crc\" DevicePath \"\"" Mar 20 15:47:08 crc kubenswrapper[4779]: I0320 15:47:08.923368 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/003b9556-88db-4e20-8371-3863a2dd44b8-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:47:08 crc kubenswrapper[4779]: I0320 15:47:08.923380 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/003b9556-88db-4e20-8371-3863a2dd44b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:47:08 crc kubenswrapper[4779]: I0320 15:47:08.923391 4779 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/003b9556-88db-4e20-8371-3863a2dd44b8-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:47:09 crc kubenswrapper[4779]: I0320 15:47:09.396560 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5gc7l" event={"ID":"003b9556-88db-4e20-8371-3863a2dd44b8","Type":"ContainerDied","Data":"80a33132f4749a8a0bf89498b273a0f5e58bc160d770e60f9c793b13fa705f5d"} Mar 20 15:47:09 crc kubenswrapper[4779]: I0320 15:47:09.396611 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80a33132f4749a8a0bf89498b273a0f5e58bc160d770e60f9c793b13fa705f5d" Mar 20 15:47:09 crc kubenswrapper[4779]: I0320 15:47:09.396675 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5gc7l" Mar 20 15:47:09 crc kubenswrapper[4779]: I0320 15:47:09.474825 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 15:47:09 crc kubenswrapper[4779]: E0320 15:47:09.475304 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="003b9556-88db-4e20-8371-3863a2dd44b8" containerName="nova-cell0-conductor-db-sync" Mar 20 15:47:09 crc kubenswrapper[4779]: I0320 15:47:09.475326 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="003b9556-88db-4e20-8371-3863a2dd44b8" containerName="nova-cell0-conductor-db-sync" Mar 20 15:47:09 crc kubenswrapper[4779]: I0320 15:47:09.475590 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="003b9556-88db-4e20-8371-3863a2dd44b8" containerName="nova-cell0-conductor-db-sync" Mar 20 15:47:09 crc kubenswrapper[4779]: I0320 15:47:09.476464 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 15:47:09 crc kubenswrapper[4779]: I0320 15:47:09.479245 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-qfdgq" Mar 20 15:47:09 crc kubenswrapper[4779]: I0320 15:47:09.479571 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 20 15:47:09 crc kubenswrapper[4779]: I0320 15:47:09.501168 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 15:47:09 crc kubenswrapper[4779]: I0320 15:47:09.634833 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/875afdcb-bded-47ea-98e0-2acabc9441ee-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"875afdcb-bded-47ea-98e0-2acabc9441ee\") " pod="openstack/nova-cell0-conductor-0" Mar 20 15:47:09 crc kubenswrapper[4779]: 
I0320 15:47:09.635032 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs5w4\" (UniqueName: \"kubernetes.io/projected/875afdcb-bded-47ea-98e0-2acabc9441ee-kube-api-access-zs5w4\") pod \"nova-cell0-conductor-0\" (UID: \"875afdcb-bded-47ea-98e0-2acabc9441ee\") " pod="openstack/nova-cell0-conductor-0" Mar 20 15:47:09 crc kubenswrapper[4779]: I0320 15:47:09.635402 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/875afdcb-bded-47ea-98e0-2acabc9441ee-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"875afdcb-bded-47ea-98e0-2acabc9441ee\") " pod="openstack/nova-cell0-conductor-0" Mar 20 15:47:09 crc kubenswrapper[4779]: I0320 15:47:09.737224 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/875afdcb-bded-47ea-98e0-2acabc9441ee-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"875afdcb-bded-47ea-98e0-2acabc9441ee\") " pod="openstack/nova-cell0-conductor-0" Mar 20 15:47:09 crc kubenswrapper[4779]: I0320 15:47:09.737404 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/875afdcb-bded-47ea-98e0-2acabc9441ee-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"875afdcb-bded-47ea-98e0-2acabc9441ee\") " pod="openstack/nova-cell0-conductor-0" Mar 20 15:47:09 crc kubenswrapper[4779]: I0320 15:47:09.738259 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs5w4\" (UniqueName: \"kubernetes.io/projected/875afdcb-bded-47ea-98e0-2acabc9441ee-kube-api-access-zs5w4\") pod \"nova-cell0-conductor-0\" (UID: \"875afdcb-bded-47ea-98e0-2acabc9441ee\") " pod="openstack/nova-cell0-conductor-0" Mar 20 15:47:09 crc kubenswrapper[4779]: I0320 15:47:09.749683 4779 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/875afdcb-bded-47ea-98e0-2acabc9441ee-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"875afdcb-bded-47ea-98e0-2acabc9441ee\") " pod="openstack/nova-cell0-conductor-0" Mar 20 15:47:09 crc kubenswrapper[4779]: I0320 15:47:09.751701 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/875afdcb-bded-47ea-98e0-2acabc9441ee-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"875afdcb-bded-47ea-98e0-2acabc9441ee\") " pod="openstack/nova-cell0-conductor-0" Mar 20 15:47:09 crc kubenswrapper[4779]: I0320 15:47:09.755259 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs5w4\" (UniqueName: \"kubernetes.io/projected/875afdcb-bded-47ea-98e0-2acabc9441ee-kube-api-access-zs5w4\") pod \"nova-cell0-conductor-0\" (UID: \"875afdcb-bded-47ea-98e0-2acabc9441ee\") " pod="openstack/nova-cell0-conductor-0" Mar 20 15:47:09 crc kubenswrapper[4779]: I0320 15:47:09.798078 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 15:47:10 crc kubenswrapper[4779]: I0320 15:47:10.207859 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 15:47:10 crc kubenswrapper[4779]: I0320 15:47:10.406473 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"875afdcb-bded-47ea-98e0-2acabc9441ee","Type":"ContainerStarted","Data":"c8082a460b1b93511a0aa5ce85d5b4217d573959bb0b99f379d1642e92a5509a"} Mar 20 15:47:12 crc kubenswrapper[4779]: I0320 15:47:12.425468 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"875afdcb-bded-47ea-98e0-2acabc9441ee","Type":"ContainerStarted","Data":"8d45e4096d6c5c6f21e15ef4cbd50cf7c2bc128c18696e84a1e5b67909f465bf"} Mar 20 15:47:12 crc kubenswrapper[4779]: I0320 15:47:12.425935 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 20 15:47:12 crc kubenswrapper[4779]: I0320 15:47:12.449658 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=3.449640077 podStartE2EDuration="3.449640077s" podCreationTimestamp="2026-03-20 15:47:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:47:12.442939515 +0000 UTC m=+1449.405455315" watchObservedRunningTime="2026-03-20 15:47:12.449640077 +0000 UTC m=+1449.412155877" Mar 20 15:47:18 crc kubenswrapper[4779]: I0320 15:47:18.429390 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-f7dnw"] Mar 20 15:47:18 crc kubenswrapper[4779]: I0320 15:47:18.431691 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f7dnw" Mar 20 15:47:18 crc kubenswrapper[4779]: I0320 15:47:18.454805 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f7dnw"] Mar 20 15:47:18 crc kubenswrapper[4779]: I0320 15:47:18.496567 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978-catalog-content\") pod \"redhat-operators-f7dnw\" (UID: \"cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978\") " pod="openshift-marketplace/redhat-operators-f7dnw" Mar 20 15:47:18 crc kubenswrapper[4779]: I0320 15:47:18.496649 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7zjl\" (UniqueName: \"kubernetes.io/projected/cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978-kube-api-access-w7zjl\") pod \"redhat-operators-f7dnw\" (UID: \"cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978\") " pod="openshift-marketplace/redhat-operators-f7dnw" Mar 20 15:47:18 crc kubenswrapper[4779]: I0320 15:47:18.496742 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978-utilities\") pod \"redhat-operators-f7dnw\" (UID: \"cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978\") " pod="openshift-marketplace/redhat-operators-f7dnw" Mar 20 15:47:18 crc kubenswrapper[4779]: I0320 15:47:18.599035 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978-catalog-content\") pod \"redhat-operators-f7dnw\" (UID: \"cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978\") " pod="openshift-marketplace/redhat-operators-f7dnw" Mar 20 15:47:18 crc kubenswrapper[4779]: I0320 15:47:18.599136 4779 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-w7zjl\" (UniqueName: \"kubernetes.io/projected/cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978-kube-api-access-w7zjl\") pod \"redhat-operators-f7dnw\" (UID: \"cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978\") " pod="openshift-marketplace/redhat-operators-f7dnw" Mar 20 15:47:18 crc kubenswrapper[4779]: I0320 15:47:18.599229 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978-utilities\") pod \"redhat-operators-f7dnw\" (UID: \"cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978\") " pod="openshift-marketplace/redhat-operators-f7dnw" Mar 20 15:47:18 crc kubenswrapper[4779]: I0320 15:47:18.599642 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978-utilities\") pod \"redhat-operators-f7dnw\" (UID: \"cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978\") " pod="openshift-marketplace/redhat-operators-f7dnw" Mar 20 15:47:18 crc kubenswrapper[4779]: I0320 15:47:18.599659 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978-catalog-content\") pod \"redhat-operators-f7dnw\" (UID: \"cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978\") " pod="openshift-marketplace/redhat-operators-f7dnw" Mar 20 15:47:18 crc kubenswrapper[4779]: I0320 15:47:18.618656 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7zjl\" (UniqueName: \"kubernetes.io/projected/cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978-kube-api-access-w7zjl\") pod \"redhat-operators-f7dnw\" (UID: \"cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978\") " pod="openshift-marketplace/redhat-operators-f7dnw" Mar 20 15:47:18 crc kubenswrapper[4779]: I0320 15:47:18.758927 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f7dnw" Mar 20 15:47:19 crc kubenswrapper[4779]: W0320 15:47:19.229840 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb4ad3fa_7f7f_4d05_9a9f_2d95c16e0978.slice/crio-bec3f6dd789000765b1f25263b38cc149a0f42564d4d465cec276afced1b83e2 WatchSource:0}: Error finding container bec3f6dd789000765b1f25263b38cc149a0f42564d4d465cec276afced1b83e2: Status 404 returned error can't find the container with id bec3f6dd789000765b1f25263b38cc149a0f42564d4d465cec276afced1b83e2 Mar 20 15:47:19 crc kubenswrapper[4779]: I0320 15:47:19.229854 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f7dnw"] Mar 20 15:47:19 crc kubenswrapper[4779]: I0320 15:47:19.489390 4779 generic.go:334] "Generic (PLEG): container finished" podID="cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978" containerID="7afdc016e90d8740063a3aa8bc504ec8fd43440f4a2421f6b976e55308d51732" exitCode=0 Mar 20 15:47:19 crc kubenswrapper[4779]: I0320 15:47:19.489469 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7dnw" event={"ID":"cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978","Type":"ContainerDied","Data":"7afdc016e90d8740063a3aa8bc504ec8fd43440f4a2421f6b976e55308d51732"} Mar 20 15:47:19 crc kubenswrapper[4779]: I0320 15:47:19.489727 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7dnw" event={"ID":"cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978","Type":"ContainerStarted","Data":"bec3f6dd789000765b1f25263b38cc149a0f42564d4d465cec276afced1b83e2"} Mar 20 15:47:19 crc kubenswrapper[4779]: I0320 15:47:19.823223 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.279812 4779 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell0-cell-mapping-wrqkc"] Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.282385 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-wrqkc" Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.289719 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.289959 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.312252 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-wrqkc"] Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.335304 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e379766f-391c-4e7f-8a69-84d1b624d88b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-wrqkc\" (UID: \"e379766f-391c-4e7f-8a69-84d1b624d88b\") " pod="openstack/nova-cell0-cell-mapping-wrqkc" Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.335403 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc2bx\" (UniqueName: \"kubernetes.io/projected/e379766f-391c-4e7f-8a69-84d1b624d88b-kube-api-access-wc2bx\") pod \"nova-cell0-cell-mapping-wrqkc\" (UID: \"e379766f-391c-4e7f-8a69-84d1b624d88b\") " pod="openstack/nova-cell0-cell-mapping-wrqkc" Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.335462 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e379766f-391c-4e7f-8a69-84d1b624d88b-scripts\") pod \"nova-cell0-cell-mapping-wrqkc\" (UID: \"e379766f-391c-4e7f-8a69-84d1b624d88b\") " pod="openstack/nova-cell0-cell-mapping-wrqkc" Mar 20 15:47:20 crc 
kubenswrapper[4779]: I0320 15:47:20.335618 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e379766f-391c-4e7f-8a69-84d1b624d88b-config-data\") pod \"nova-cell0-cell-mapping-wrqkc\" (UID: \"e379766f-391c-4e7f-8a69-84d1b624d88b\") " pod="openstack/nova-cell0-cell-mapping-wrqkc" Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.437621 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e379766f-391c-4e7f-8a69-84d1b624d88b-config-data\") pod \"nova-cell0-cell-mapping-wrqkc\" (UID: \"e379766f-391c-4e7f-8a69-84d1b624d88b\") " pod="openstack/nova-cell0-cell-mapping-wrqkc" Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.437977 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e379766f-391c-4e7f-8a69-84d1b624d88b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-wrqkc\" (UID: \"e379766f-391c-4e7f-8a69-84d1b624d88b\") " pod="openstack/nova-cell0-cell-mapping-wrqkc" Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.438009 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc2bx\" (UniqueName: \"kubernetes.io/projected/e379766f-391c-4e7f-8a69-84d1b624d88b-kube-api-access-wc2bx\") pod \"nova-cell0-cell-mapping-wrqkc\" (UID: \"e379766f-391c-4e7f-8a69-84d1b624d88b\") " pod="openstack/nova-cell0-cell-mapping-wrqkc" Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.438053 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e379766f-391c-4e7f-8a69-84d1b624d88b-scripts\") pod \"nova-cell0-cell-mapping-wrqkc\" (UID: \"e379766f-391c-4e7f-8a69-84d1b624d88b\") " pod="openstack/nova-cell0-cell-mapping-wrqkc" Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 
15:47:20.445410 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e379766f-391c-4e7f-8a69-84d1b624d88b-scripts\") pod \"nova-cell0-cell-mapping-wrqkc\" (UID: \"e379766f-391c-4e7f-8a69-84d1b624d88b\") " pod="openstack/nova-cell0-cell-mapping-wrqkc" Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.465842 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e379766f-391c-4e7f-8a69-84d1b624d88b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-wrqkc\" (UID: \"e379766f-391c-4e7f-8a69-84d1b624d88b\") " pod="openstack/nova-cell0-cell-mapping-wrqkc" Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.466671 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e379766f-391c-4e7f-8a69-84d1b624d88b-config-data\") pod \"nova-cell0-cell-mapping-wrqkc\" (UID: \"e379766f-391c-4e7f-8a69-84d1b624d88b\") " pod="openstack/nova-cell0-cell-mapping-wrqkc" Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.473683 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc2bx\" (UniqueName: \"kubernetes.io/projected/e379766f-391c-4e7f-8a69-84d1b624d88b-kube-api-access-wc2bx\") pod \"nova-cell0-cell-mapping-wrqkc\" (UID: \"e379766f-391c-4e7f-8a69-84d1b624d88b\") " pod="openstack/nova-cell0-cell-mapping-wrqkc" Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.515856 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.517064 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.524036 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.558198 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.597359 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.599391 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.608626 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.622556 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-wrqkc" Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.640659 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.644211 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2be8283d-3250-4acc-82f9-9547abbaf7eb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2be8283d-3250-4acc-82f9-9547abbaf7eb\") " pod="openstack/nova-scheduler-0" Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.644256 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2be8283d-3250-4acc-82f9-9547abbaf7eb-config-data\") pod \"nova-scheduler-0\" (UID: \"2be8283d-3250-4acc-82f9-9547abbaf7eb\") " pod="openstack/nova-scheduler-0" Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 
15:47:20.644398 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40030b72-3377-4060-9a6f-c25011e942ee-config-data\") pod \"nova-api-0\" (UID: \"40030b72-3377-4060-9a6f-c25011e942ee\") " pod="openstack/nova-api-0" Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.644414 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40030b72-3377-4060-9a6f-c25011e942ee-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"40030b72-3377-4060-9a6f-c25011e942ee\") " pod="openstack/nova-api-0" Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.644441 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bstsn\" (UniqueName: \"kubernetes.io/projected/40030b72-3377-4060-9a6f-c25011e942ee-kube-api-access-bstsn\") pod \"nova-api-0\" (UID: \"40030b72-3377-4060-9a6f-c25011e942ee\") " pod="openstack/nova-api-0" Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.644479 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvmvg\" (UniqueName: \"kubernetes.io/projected/2be8283d-3250-4acc-82f9-9547abbaf7eb-kube-api-access-fvmvg\") pod \"nova-scheduler-0\" (UID: \"2be8283d-3250-4acc-82f9-9547abbaf7eb\") " pod="openstack/nova-scheduler-0" Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.644511 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40030b72-3377-4060-9a6f-c25011e942ee-logs\") pod \"nova-api-0\" (UID: \"40030b72-3377-4060-9a6f-c25011e942ee\") " pod="openstack/nova-api-0" Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.730976 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 15:47:20 crc 
kubenswrapper[4779]: I0320 15:47:20.742257 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.748967 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40030b72-3377-4060-9a6f-c25011e942ee-config-data\") pod \"nova-api-0\" (UID: \"40030b72-3377-4060-9a6f-c25011e942ee\") " pod="openstack/nova-api-0"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.749011 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40030b72-3377-4060-9a6f-c25011e942ee-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"40030b72-3377-4060-9a6f-c25011e942ee\") " pod="openstack/nova-api-0"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.749052 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bstsn\" (UniqueName: \"kubernetes.io/projected/40030b72-3377-4060-9a6f-c25011e942ee-kube-api-access-bstsn\") pod \"nova-api-0\" (UID: \"40030b72-3377-4060-9a6f-c25011e942ee\") " pod="openstack/nova-api-0"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.749101 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvmvg\" (UniqueName: \"kubernetes.io/projected/2be8283d-3250-4acc-82f9-9547abbaf7eb-kube-api-access-fvmvg\") pod \"nova-scheduler-0\" (UID: \"2be8283d-3250-4acc-82f9-9547abbaf7eb\") " pod="openstack/nova-scheduler-0"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.749168 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40030b72-3377-4060-9a6f-c25011e942ee-logs\") pod \"nova-api-0\" (UID: \"40030b72-3377-4060-9a6f-c25011e942ee\") " pod="openstack/nova-api-0"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.749251 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2be8283d-3250-4acc-82f9-9547abbaf7eb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2be8283d-3250-4acc-82f9-9547abbaf7eb\") " pod="openstack/nova-scheduler-0"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.749406 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2be8283d-3250-4acc-82f9-9547abbaf7eb-config-data\") pod \"nova-scheduler-0\" (UID: \"2be8283d-3250-4acc-82f9-9547abbaf7eb\") " pod="openstack/nova-scheduler-0"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.751624 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.757302 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40030b72-3377-4060-9a6f-c25011e942ee-logs\") pod \"nova-api-0\" (UID: \"40030b72-3377-4060-9a6f-c25011e942ee\") " pod="openstack/nova-api-0"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.767330 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2be8283d-3250-4acc-82f9-9547abbaf7eb-config-data\") pod \"nova-scheduler-0\" (UID: \"2be8283d-3250-4acc-82f9-9547abbaf7eb\") " pod="openstack/nova-scheduler-0"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.769614 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40030b72-3377-4060-9a6f-c25011e942ee-config-data\") pod \"nova-api-0\" (UID: \"40030b72-3377-4060-9a6f-c25011e942ee\") " pod="openstack/nova-api-0"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.783430 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40030b72-3377-4060-9a6f-c25011e942ee-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"40030b72-3377-4060-9a6f-c25011e942ee\") " pod="openstack/nova-api-0"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.803166 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.809760 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2be8283d-3250-4acc-82f9-9547abbaf7eb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2be8283d-3250-4acc-82f9-9547abbaf7eb\") " pod="openstack/nova-scheduler-0"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.832721 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvmvg\" (UniqueName: \"kubernetes.io/projected/2be8283d-3250-4acc-82f9-9547abbaf7eb-kube-api-access-fvmvg\") pod \"nova-scheduler-0\" (UID: \"2be8283d-3250-4acc-82f9-9547abbaf7eb\") " pod="openstack/nova-scheduler-0"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.833603 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bstsn\" (UniqueName: \"kubernetes.io/projected/40030b72-3377-4060-9a6f-c25011e942ee-kube-api-access-bstsn\") pod \"nova-api-0\" (UID: \"40030b72-3377-4060-9a6f-c25011e942ee\") " pod="openstack/nova-api-0"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.840500 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.841817 4779 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.845437 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.852601 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.864342 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcfdf846-244e-4590-8e01-1f1c6a2ea87e-config-data\") pod \"nova-metadata-0\" (UID: \"dcfdf846-244e-4590-8e01-1f1c6a2ea87e\") " pod="openstack/nova-metadata-0"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.864423 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcfdf846-244e-4590-8e01-1f1c6a2ea87e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dcfdf846-244e-4590-8e01-1f1c6a2ea87e\") " pod="openstack/nova-metadata-0"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.864592 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2pxm\" (UniqueName: \"kubernetes.io/projected/dcfdf846-244e-4590-8e01-1f1c6a2ea87e-kube-api-access-w2pxm\") pod \"nova-metadata-0\" (UID: \"dcfdf846-244e-4590-8e01-1f1c6a2ea87e\") " pod="openstack/nova-metadata-0"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.869422 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcfdf846-244e-4590-8e01-1f1c6a2ea87e-logs\") pod \"nova-metadata-0\" (UID: \"dcfdf846-244e-4590-8e01-1f1c6a2ea87e\") " pod="openstack/nova-metadata-0"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.898771 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-qrvwb"]
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.902315 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-qrvwb"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.912800 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-qrvwb"]
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.931808 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.971661 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcfdf846-244e-4590-8e01-1f1c6a2ea87e-logs\") pod \"nova-metadata-0\" (UID: \"dcfdf846-244e-4590-8e01-1f1c6a2ea87e\") " pod="openstack/nova-metadata-0"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.971707 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52109ecb-141a-4d8d-95db-5fbc7275f7f1-config\") pod \"dnsmasq-dns-845d6d6f59-qrvwb\" (UID: \"52109ecb-141a-4d8d-95db-5fbc7275f7f1\") " pod="openstack/dnsmasq-dns-845d6d6f59-qrvwb"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.971735 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcfdf846-244e-4590-8e01-1f1c6a2ea87e-config-data\") pod \"nova-metadata-0\" (UID: \"dcfdf846-244e-4590-8e01-1f1c6a2ea87e\") " pod="openstack/nova-metadata-0"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.971755 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/52109ecb-141a-4d8d-95db-5fbc7275f7f1-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-qrvwb\" (UID: \"52109ecb-141a-4d8d-95db-5fbc7275f7f1\") " pod="openstack/dnsmasq-dns-845d6d6f59-qrvwb"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.971774 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/52109ecb-141a-4d8d-95db-5fbc7275f7f1-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-qrvwb\" (UID: \"52109ecb-141a-4d8d-95db-5fbc7275f7f1\") " pod="openstack/dnsmasq-dns-845d6d6f59-qrvwb"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.971794 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcfdf846-244e-4590-8e01-1f1c6a2ea87e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dcfdf846-244e-4590-8e01-1f1c6a2ea87e\") " pod="openstack/nova-metadata-0"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.971849 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2pxm\" (UniqueName: \"kubernetes.io/projected/dcfdf846-244e-4590-8e01-1f1c6a2ea87e-kube-api-access-w2pxm\") pod \"nova-metadata-0\" (UID: \"dcfdf846-244e-4590-8e01-1f1c6a2ea87e\") " pod="openstack/nova-metadata-0"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.971885 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e97b0f9-1415-425f-822b-8a3a9646a534-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9e97b0f9-1415-425f-822b-8a3a9646a534\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.971901 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/52109ecb-141a-4d8d-95db-5fbc7275f7f1-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-qrvwb\" (UID: \"52109ecb-141a-4d8d-95db-5fbc7275f7f1\") " pod="openstack/dnsmasq-dns-845d6d6f59-qrvwb"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.971930 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/52109ecb-141a-4d8d-95db-5fbc7275f7f1-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-qrvwb\" (UID: \"52109ecb-141a-4d8d-95db-5fbc7275f7f1\") " pod="openstack/dnsmasq-dns-845d6d6f59-qrvwb"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.971955 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9qbd\" (UniqueName: \"kubernetes.io/projected/9e97b0f9-1415-425f-822b-8a3a9646a534-kube-api-access-f9qbd\") pod \"nova-cell1-novncproxy-0\" (UID: \"9e97b0f9-1415-425f-822b-8a3a9646a534\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.971973 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjbb8\" (UniqueName: \"kubernetes.io/projected/52109ecb-141a-4d8d-95db-5fbc7275f7f1-kube-api-access-rjbb8\") pod \"dnsmasq-dns-845d6d6f59-qrvwb\" (UID: \"52109ecb-141a-4d8d-95db-5fbc7275f7f1\") " pod="openstack/dnsmasq-dns-845d6d6f59-qrvwb"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.971987 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e97b0f9-1415-425f-822b-8a3a9646a534-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9e97b0f9-1415-425f-822b-8a3a9646a534\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.972548 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcfdf846-244e-4590-8e01-1f1c6a2ea87e-logs\") pod \"nova-metadata-0\" (UID: \"dcfdf846-244e-4590-8e01-1f1c6a2ea87e\") " pod="openstack/nova-metadata-0"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.976398 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcfdf846-244e-4590-8e01-1f1c6a2ea87e-config-data\") pod \"nova-metadata-0\" (UID: \"dcfdf846-244e-4590-8e01-1f1c6a2ea87e\") " pod="openstack/nova-metadata-0"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.979150 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcfdf846-244e-4590-8e01-1f1c6a2ea87e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dcfdf846-244e-4590-8e01-1f1c6a2ea87e\") " pod="openstack/nova-metadata-0"
Mar 20 15:47:20 crc kubenswrapper[4779]: I0320 15:47:20.992011 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2pxm\" (UniqueName: \"kubernetes.io/projected/dcfdf846-244e-4590-8e01-1f1c6a2ea87e-kube-api-access-w2pxm\") pod \"nova-metadata-0\" (UID: \"dcfdf846-244e-4590-8e01-1f1c6a2ea87e\") " pod="openstack/nova-metadata-0"
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.038323 4779 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-api-0"
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.073316 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52109ecb-141a-4d8d-95db-5fbc7275f7f1-config\") pod \"dnsmasq-dns-845d6d6f59-qrvwb\" (UID: \"52109ecb-141a-4d8d-95db-5fbc7275f7f1\") " pod="openstack/dnsmasq-dns-845d6d6f59-qrvwb"
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.073369 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/52109ecb-141a-4d8d-95db-5fbc7275f7f1-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-qrvwb\" (UID: \"52109ecb-141a-4d8d-95db-5fbc7275f7f1\") " pod="openstack/dnsmasq-dns-845d6d6f59-qrvwb"
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.073390 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/52109ecb-141a-4d8d-95db-5fbc7275f7f1-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-qrvwb\" (UID: \"52109ecb-141a-4d8d-95db-5fbc7275f7f1\") " pod="openstack/dnsmasq-dns-845d6d6f59-qrvwb"
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.073465 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e97b0f9-1415-425f-822b-8a3a9646a534-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9e97b0f9-1415-425f-822b-8a3a9646a534\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.073480 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/52109ecb-141a-4d8d-95db-5fbc7275f7f1-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-qrvwb\" (UID: \"52109ecb-141a-4d8d-95db-5fbc7275f7f1\") " pod="openstack/dnsmasq-dns-845d6d6f59-qrvwb"
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.073506 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/52109ecb-141a-4d8d-95db-5fbc7275f7f1-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-qrvwb\" (UID: \"52109ecb-141a-4d8d-95db-5fbc7275f7f1\") " pod="openstack/dnsmasq-dns-845d6d6f59-qrvwb"
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.073530 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9qbd\" (UniqueName: \"kubernetes.io/projected/9e97b0f9-1415-425f-822b-8a3a9646a534-kube-api-access-f9qbd\") pod \"nova-cell1-novncproxy-0\" (UID: \"9e97b0f9-1415-425f-822b-8a3a9646a534\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.073547 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjbb8\" (UniqueName: \"kubernetes.io/projected/52109ecb-141a-4d8d-95db-5fbc7275f7f1-kube-api-access-rjbb8\") pod \"dnsmasq-dns-845d6d6f59-qrvwb\" (UID: \"52109ecb-141a-4d8d-95db-5fbc7275f7f1\") " pod="openstack/dnsmasq-dns-845d6d6f59-qrvwb"
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.073562 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e97b0f9-1415-425f-822b-8a3a9646a534-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9e97b0f9-1415-425f-822b-8a3a9646a534\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.078190 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/52109ecb-141a-4d8d-95db-5fbc7275f7f1-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-qrvwb\" (UID: \"52109ecb-141a-4d8d-95db-5fbc7275f7f1\") " pod="openstack/dnsmasq-dns-845d6d6f59-qrvwb"
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.078238 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52109ecb-141a-4d8d-95db-5fbc7275f7f1-config\") pod \"dnsmasq-dns-845d6d6f59-qrvwb\" (UID: \"52109ecb-141a-4d8d-95db-5fbc7275f7f1\") " pod="openstack/dnsmasq-dns-845d6d6f59-qrvwb"
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.078905 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/52109ecb-141a-4d8d-95db-5fbc7275f7f1-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-qrvwb\" (UID: \"52109ecb-141a-4d8d-95db-5fbc7275f7f1\") " pod="openstack/dnsmasq-dns-845d6d6f59-qrvwb"
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.078932 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/52109ecb-141a-4d8d-95db-5fbc7275f7f1-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-qrvwb\" (UID: \"52109ecb-141a-4d8d-95db-5fbc7275f7f1\") " pod="openstack/dnsmasq-dns-845d6d6f59-qrvwb"
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.079805 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/52109ecb-141a-4d8d-95db-5fbc7275f7f1-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-qrvwb\" (UID: \"52109ecb-141a-4d8d-95db-5fbc7275f7f1\") " pod="openstack/dnsmasq-dns-845d6d6f59-qrvwb"
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.080302 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.080802 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e97b0f9-1415-425f-822b-8a3a9646a534-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9e97b0f9-1415-425f-822b-8a3a9646a534\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.086332 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e97b0f9-1415-425f-822b-8a3a9646a534-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9e97b0f9-1415-425f-822b-8a3a9646a534\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.099996 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjbb8\" (UniqueName: \"kubernetes.io/projected/52109ecb-141a-4d8d-95db-5fbc7275f7f1-kube-api-access-rjbb8\") pod \"dnsmasq-dns-845d6d6f59-qrvwb\" (UID: \"52109ecb-141a-4d8d-95db-5fbc7275f7f1\") " pod="openstack/dnsmasq-dns-845d6d6f59-qrvwb"
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.111685 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9qbd\" (UniqueName: \"kubernetes.io/projected/9e97b0f9-1415-425f-822b-8a3a9646a534-kube-api-access-f9qbd\") pod \"nova-cell1-novncproxy-0\" (UID: \"9e97b0f9-1415-425f-822b-8a3a9646a534\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.178654 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.241386 4779 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-qrvwb"
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.403288 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-wrqkc"]
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.540909 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-wrqkc" event={"ID":"e379766f-391c-4e7f-8a69-84d1b624d88b","Type":"ContainerStarted","Data":"6a272545ac96a73fb2f314dce61eb63c084fcfb5fa644e38ef300df63c03639a"}
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.542201 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7dnw" event={"ID":"cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978","Type":"ContainerStarted","Data":"6f7654598feaad1eaeaf6380851705e15d6bc41c739dc7208137fa4aaeb6edcd"}
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.702200 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.764465 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dzzmf"]
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.771560 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dzzmf"
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.788102 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.788297 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.800057 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dzzmf"]
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.821396 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26c50037-e945-4f6a-8e49-20b4a969a77e-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-dzzmf\" (UID: \"26c50037-e945-4f6a-8e49-20b4a969a77e\") " pod="openstack/nova-cell1-conductor-db-sync-dzzmf"
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.821501 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26c50037-e945-4f6a-8e49-20b4a969a77e-scripts\") pod \"nova-cell1-conductor-db-sync-dzzmf\" (UID: \"26c50037-e945-4f6a-8e49-20b4a969a77e\") " pod="openstack/nova-cell1-conductor-db-sync-dzzmf"
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.821542 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl2pk\" (UniqueName: \"kubernetes.io/projected/26c50037-e945-4f6a-8e49-20b4a969a77e-kube-api-access-hl2pk\") pod \"nova-cell1-conductor-db-sync-dzzmf\" (UID: \"26c50037-e945-4f6a-8e49-20b4a969a77e\") " pod="openstack/nova-cell1-conductor-db-sync-dzzmf"
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.821595 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26c50037-e945-4f6a-8e49-20b4a969a77e-config-data\") pod \"nova-cell1-conductor-db-sync-dzzmf\" (UID: \"26c50037-e945-4f6a-8e49-20b4a969a77e\") " pod="openstack/nova-cell1-conductor-db-sync-dzzmf"
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.855323 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.883129 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.931517 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26c50037-e945-4f6a-8e49-20b4a969a77e-scripts\") pod \"nova-cell1-conductor-db-sync-dzzmf\" (UID: \"26c50037-e945-4f6a-8e49-20b4a969a77e\") " pod="openstack/nova-cell1-conductor-db-sync-dzzmf"
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.931581 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl2pk\" (UniqueName: \"kubernetes.io/projected/26c50037-e945-4f6a-8e49-20b4a969a77e-kube-api-access-hl2pk\") pod \"nova-cell1-conductor-db-sync-dzzmf\" (UID: \"26c50037-e945-4f6a-8e49-20b4a969a77e\") " pod="openstack/nova-cell1-conductor-db-sync-dzzmf"
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.931672 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26c50037-e945-4f6a-8e49-20b4a969a77e-config-data\") pod \"nova-cell1-conductor-db-sync-dzzmf\" (UID: \"26c50037-e945-4f6a-8e49-20b4a969a77e\") " pod="openstack/nova-cell1-conductor-db-sync-dzzmf"
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.931700 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26c50037-e945-4f6a-8e49-20b4a969a77e-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-dzzmf\" (UID: \"26c50037-e945-4f6a-8e49-20b4a969a77e\") " pod="openstack/nova-cell1-conductor-db-sync-dzzmf"
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.938148 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26c50037-e945-4f6a-8e49-20b4a969a77e-scripts\") pod \"nova-cell1-conductor-db-sync-dzzmf\" (UID: \"26c50037-e945-4f6a-8e49-20b4a969a77e\") " pod="openstack/nova-cell1-conductor-db-sync-dzzmf"
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.938259 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26c50037-e945-4f6a-8e49-20b4a969a77e-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-dzzmf\" (UID: \"26c50037-e945-4f6a-8e49-20b4a969a77e\") " pod="openstack/nova-cell1-conductor-db-sync-dzzmf"
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.938965 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26c50037-e945-4f6a-8e49-20b4a969a77e-config-data\") pod \"nova-cell1-conductor-db-sync-dzzmf\" (UID: \"26c50037-e945-4f6a-8e49-20b4a969a77e\") " pod="openstack/nova-cell1-conductor-db-sync-dzzmf"
Mar 20 15:47:21 crc kubenswrapper[4779]: I0320 15:47:21.950573 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl2pk\" (UniqueName: \"kubernetes.io/projected/26c50037-e945-4f6a-8e49-20b4a969a77e-kube-api-access-hl2pk\") pod \"nova-cell1-conductor-db-sync-dzzmf\" (UID: \"26c50037-e945-4f6a-8e49-20b4a969a77e\") " pod="openstack/nova-cell1-conductor-db-sync-dzzmf"
Mar 20 15:47:22 crc kubenswrapper[4779]: I0320 15:47:22.102890 4779 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dzzmf"
Mar 20 15:47:22 crc kubenswrapper[4779]: W0320 15:47:22.267307 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e97b0f9_1415_425f_822b_8a3a9646a534.slice/crio-ac4dc967f99fcd77695349d2cd1e3290765bc35e1efb7d2708355b4d5b23f5fb WatchSource:0}: Error finding container ac4dc967f99fcd77695349d2cd1e3290765bc35e1efb7d2708355b4d5b23f5fb: Status 404 returned error can't find the container with id ac4dc967f99fcd77695349d2cd1e3290765bc35e1efb7d2708355b4d5b23f5fb
Mar 20 15:47:22 crc kubenswrapper[4779]: I0320 15:47:22.271903 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 20 15:47:22 crc kubenswrapper[4779]: I0320 15:47:22.337254 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-qrvwb"]
Mar 20 15:47:22 crc kubenswrapper[4779]: I0320 15:47:22.574417 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-qrvwb" event={"ID":"52109ecb-141a-4d8d-95db-5fbc7275f7f1","Type":"ContainerStarted","Data":"0986f34f67e867ac6efd7ca3c13f728493d6e31c5fe2e4e62568e324b3ad4671"}
Mar 20 15:47:22 crc kubenswrapper[4779]: I0320 15:47:22.593517 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"40030b72-3377-4060-9a6f-c25011e942ee","Type":"ContainerStarted","Data":"e15b1db2e30e5791359c5e6aaf2846d0de1f6426487ea55b1e1195841acb3eb7"}
Mar 20 15:47:22 crc kubenswrapper[4779]: I0320 15:47:22.595523 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dcfdf846-244e-4590-8e01-1f1c6a2ea87e","Type":"ContainerStarted","Data":"3d4147b6a2ed755001bd8b33afdd0b39943135a58a8487569394979d79129d65"}
Mar 20 15:47:22 crc kubenswrapper[4779]: I0320 15:47:22.597586 4779 generic.go:334] "Generic (PLEG): container finished" podID="cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978" containerID="6f7654598feaad1eaeaf6380851705e15d6bc41c739dc7208137fa4aaeb6edcd" exitCode=0
Mar 20 15:47:22 crc kubenswrapper[4779]: I0320 15:47:22.597646 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7dnw" event={"ID":"cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978","Type":"ContainerDied","Data":"6f7654598feaad1eaeaf6380851705e15d6bc41c739dc7208137fa4aaeb6edcd"}
Mar 20 15:47:22 crc kubenswrapper[4779]: I0320 15:47:22.600411 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2be8283d-3250-4acc-82f9-9547abbaf7eb","Type":"ContainerStarted","Data":"569d8ac7e1dc612943a9ffe3eb59ee3180da1eff2b93b91f2a9409c11b63a976"}
Mar 20 15:47:22 crc kubenswrapper[4779]: I0320 15:47:22.605368 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9e97b0f9-1415-425f-822b-8a3a9646a534","Type":"ContainerStarted","Data":"ac4dc967f99fcd77695349d2cd1e3290765bc35e1efb7d2708355b4d5b23f5fb"}
Mar 20 15:47:22 crc kubenswrapper[4779]: I0320 15:47:22.619743 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-wrqkc" event={"ID":"e379766f-391c-4e7f-8a69-84d1b624d88b","Type":"ContainerStarted","Data":"c6f752d508e988eaab31e57b3e593cfcccece950051706320499670d5c9c74d2"}
Mar 20 15:47:22 crc kubenswrapper[4779]: I0320 15:47:22.633911 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dzzmf"]
Mar 20 15:47:22 crc kubenswrapper[4779]: I0320 15:47:22.657431 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-wrqkc" podStartSLOduration=2.657409809 podStartE2EDuration="2.657409809s" podCreationTimestamp="2026-03-20 15:47:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:47:22.638184694 +0000 UTC m=+1459.600700494" watchObservedRunningTime="2026-03-20 15:47:22.657409809 +0000 UTC m=+1459.619925609"
Mar 20 15:47:23 crc kubenswrapper[4779]: I0320 15:47:23.639089 4779 generic.go:334] "Generic (PLEG): container finished" podID="52109ecb-141a-4d8d-95db-5fbc7275f7f1" containerID="f72088e62a7d2351bbdc2964217e31962afef6df9ccee18b76450df24168acca" exitCode=0
Mar 20 15:47:23 crc kubenswrapper[4779]: I0320 15:47:23.639673 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-qrvwb" event={"ID":"52109ecb-141a-4d8d-95db-5fbc7275f7f1","Type":"ContainerDied","Data":"f72088e62a7d2351bbdc2964217e31962afef6df9ccee18b76450df24168acca"}
Mar 20 15:47:23 crc kubenswrapper[4779]: I0320 15:47:23.645291 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dzzmf" event={"ID":"26c50037-e945-4f6a-8e49-20b4a969a77e","Type":"ContainerStarted","Data":"08368ff48bb95bff64cc68c56cc8e79d1b18031a02cc9749742ca2395f65d492"}
Mar 20 15:47:23 crc kubenswrapper[4779]: I0320 15:47:23.645335 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dzzmf" event={"ID":"26c50037-e945-4f6a-8e49-20b4a969a77e","Type":"ContainerStarted","Data":"acd4f903532591bb24ba1ea698ec8ada480d6cb67d4b6d5e2893054b613c723b"}
Mar 20 15:47:23 crc kubenswrapper[4779]: I0320 15:47:23.687313 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-dzzmf" podStartSLOduration=2.687290365 podStartE2EDuration="2.687290365s" podCreationTimestamp="2026-03-20 15:47:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:47:23.680045659 +0000 UTC m=+1460.642561479" watchObservedRunningTime="2026-03-20 15:47:23.687290365 +0000 UTC m=+1460.649806165"
Mar 20 15:47:24 crc kubenswrapper[4779]: I0320 15:47:24.570740 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 15:47:24 crc kubenswrapper[4779]: I0320 15:47:24.581968 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 20 15:47:25 crc kubenswrapper[4779]: I0320 15:47:25.149688 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 15:47:25 crc kubenswrapper[4779]: I0320 15:47:25.150506 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 15:47:25 crc kubenswrapper[4779]: I0320 15:47:25.150792 4779 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg"
Mar 20 15:47:25 crc kubenswrapper[4779]: I0320 15:47:25.151348 4779 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7b4abfdbc7440efebe366f165047da802f1778ef342a7861f64638e51f96112d"} pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 15:47:25 crc kubenswrapper[4779]: I0320 15:47:25.151493 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" containerID="cri-o://7b4abfdbc7440efebe366f165047da802f1778ef342a7861f64638e51f96112d" gracePeriod=600
Mar 20 15:47:25 crc kubenswrapper[4779]: I0320 15:47:25.670192 4779 generic.go:334] "Generic (PLEG): container finished" podID="451fc579-db57-4b36-a775-6d2986de3efc" containerID="7b4abfdbc7440efebe366f165047da802f1778ef342a7861f64638e51f96112d" exitCode=0
Mar 20 15:47:25 crc kubenswrapper[4779]: I0320 15:47:25.670510 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" event={"ID":"451fc579-db57-4b36-a775-6d2986de3efc","Type":"ContainerDied","Data":"7b4abfdbc7440efebe366f165047da802f1778ef342a7861f64638e51f96112d"}
Mar 20 15:47:25 crc kubenswrapper[4779]: I0320 15:47:25.670547 4779 scope.go:117] "RemoveContainer" containerID="fee94c09103d4b2b4bf88d7de3451b803a74dae16bc0136f3fa09b23c09cfe64"
Mar 20 15:47:27 crc kubenswrapper[4779]: I0320 15:47:27.611131 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="64980a81-ec28-4991-924c-b47e95d3fd2e" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 20 15:47:27 crc kubenswrapper[4779]: I0320 15:47:27.700936 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7dnw" event={"ID":"cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978","Type":"ContainerStarted","Data":"e337b82b0019cb1fd626794a9ce15b5be48dd875a9ce4b59a715e2a8d5791922"}
Mar 20 15:47:27 crc kubenswrapper[4779]: I0320 15:47:27.702967 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2be8283d-3250-4acc-82f9-9547abbaf7eb","Type":"ContainerStarted","Data":"ff8788ad8daa49fa500be9bd22ab80e288b72246ced186231e1c4724e8ed7a82"}
Mar 20 15:47:27 crc kubenswrapper[4779]: I0320 15:47:27.704246 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0"
event={"ID":"9e97b0f9-1415-425f-822b-8a3a9646a534","Type":"ContainerStarted","Data":"f41876be69eae02d6113443286a2f1d0fd11f8984c31c834df15aa4040a92390"} Mar 20 15:47:27 crc kubenswrapper[4779]: I0320 15:47:27.704351 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="9e97b0f9-1415-425f-822b-8a3a9646a534" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://f41876be69eae02d6113443286a2f1d0fd11f8984c31c834df15aa4040a92390" gracePeriod=30 Mar 20 15:47:27 crc kubenswrapper[4779]: I0320 15:47:27.707156 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-qrvwb" event={"ID":"52109ecb-141a-4d8d-95db-5fbc7275f7f1","Type":"ContainerStarted","Data":"891d90ac45141853aababc917623362409aa5c4133c041f2fde985e6cb7f332d"} Mar 20 15:47:27 crc kubenswrapper[4779]: I0320 15:47:27.707235 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-845d6d6f59-qrvwb" Mar 20 15:47:27 crc kubenswrapper[4779]: I0320 15:47:27.709667 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" event={"ID":"451fc579-db57-4b36-a775-6d2986de3efc","Type":"ContainerStarted","Data":"3e74c77d1c87375df708c9fdb3dfd7572992fba6df21987d4baed94ab0fddd8f"} Mar 20 15:47:27 crc kubenswrapper[4779]: I0320 15:47:27.712144 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"40030b72-3377-4060-9a6f-c25011e942ee","Type":"ContainerStarted","Data":"3ce7bd33a3c642d108c31e0f5c47a2d0d8e37b456af36a59c6189d92ac81e44e"} Mar 20 15:47:27 crc kubenswrapper[4779]: I0320 15:47:27.712249 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"40030b72-3377-4060-9a6f-c25011e942ee","Type":"ContainerStarted","Data":"4c8dd4a2f5c38849ef9b520a8e33cb3d263b408e3ddf028f5dcbcc99f30f3d73"} Mar 20 15:47:27 crc kubenswrapper[4779]: I0320 
15:47:27.714526 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dcfdf846-244e-4590-8e01-1f1c6a2ea87e","Type":"ContainerStarted","Data":"3745bfe7d89b2d71d080b4610f9cf7ed7619c67014aa2c4ca4c7d41507719467"} Mar 20 15:47:27 crc kubenswrapper[4779]: I0320 15:47:27.714927 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dcfdf846-244e-4590-8e01-1f1c6a2ea87e","Type":"ContainerStarted","Data":"5cd9d231ca57debfb092e31a535145c4acb9e8ec700da2772d55f5c9ea22fdc3"} Mar 20 15:47:27 crc kubenswrapper[4779]: I0320 15:47:27.714806 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dcfdf846-244e-4590-8e01-1f1c6a2ea87e" containerName="nova-metadata-metadata" containerID="cri-o://3745bfe7d89b2d71d080b4610f9cf7ed7619c67014aa2c4ca4c7d41507719467" gracePeriod=30 Mar 20 15:47:27 crc kubenswrapper[4779]: I0320 15:47:27.714713 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dcfdf846-244e-4590-8e01-1f1c6a2ea87e" containerName="nova-metadata-log" containerID="cri-o://5cd9d231ca57debfb092e31a535145c4acb9e8ec700da2772d55f5c9ea22fdc3" gracePeriod=30 Mar 20 15:47:27 crc kubenswrapper[4779]: I0320 15:47:27.722370 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-f7dnw" podStartSLOduration=3.892511758 podStartE2EDuration="9.722354056s" podCreationTimestamp="2026-03-20 15:47:18 +0000 UTC" firstStartedPulling="2026-03-20 15:47:19.491305106 +0000 UTC m=+1456.453820906" lastFinishedPulling="2026-03-20 15:47:25.321147394 +0000 UTC m=+1462.283663204" observedRunningTime="2026-03-20 15:47:27.721778011 +0000 UTC m=+1464.684293811" watchObservedRunningTime="2026-03-20 15:47:27.722354056 +0000 UTC m=+1464.684869856" Mar 20 15:47:27 crc kubenswrapper[4779]: I0320 15:47:27.758266 4779 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.930935918 podStartE2EDuration="7.758246862s" podCreationTimestamp="2026-03-20 15:47:20 +0000 UTC" firstStartedPulling="2026-03-20 15:47:21.764457722 +0000 UTC m=+1458.726973532" lastFinishedPulling="2026-03-20 15:47:26.591768676 +0000 UTC m=+1463.554284476" observedRunningTime="2026-03-20 15:47:27.741408396 +0000 UTC m=+1464.703924196" watchObservedRunningTime="2026-03-20 15:47:27.758246862 +0000 UTC m=+1464.720762662" Mar 20 15:47:27 crc kubenswrapper[4779]: I0320 15:47:27.808727 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.111756037 podStartE2EDuration="7.808707472s" podCreationTimestamp="2026-03-20 15:47:20 +0000 UTC" firstStartedPulling="2026-03-20 15:47:21.912487869 +0000 UTC m=+1458.875003669" lastFinishedPulling="2026-03-20 15:47:26.609439284 +0000 UTC m=+1463.571955104" observedRunningTime="2026-03-20 15:47:27.772921337 +0000 UTC m=+1464.735437147" watchObservedRunningTime="2026-03-20 15:47:27.808707472 +0000 UTC m=+1464.771223272" Mar 20 15:47:27 crc kubenswrapper[4779]: I0320 15:47:27.816415 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-845d6d6f59-qrvwb" podStartSLOduration=7.816397648 podStartE2EDuration="7.816397648s" podCreationTimestamp="2026-03-20 15:47:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:47:27.797310596 +0000 UTC m=+1464.759826396" watchObservedRunningTime="2026-03-20 15:47:27.816397648 +0000 UTC m=+1464.778913458" Mar 20 15:47:27 crc kubenswrapper[4779]: I0320 15:47:27.830229 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.024967702 podStartE2EDuration="7.830210292s" podCreationTimestamp="2026-03-20 
15:47:20 +0000 UTC" firstStartedPulling="2026-03-20 15:47:21.794246343 +0000 UTC m=+1458.756762143" lastFinishedPulling="2026-03-20 15:47:26.599488933 +0000 UTC m=+1463.562004733" observedRunningTime="2026-03-20 15:47:27.815338572 +0000 UTC m=+1464.777854372" watchObservedRunningTime="2026-03-20 15:47:27.830210292 +0000 UTC m=+1464.792726092" Mar 20 15:47:27 crc kubenswrapper[4779]: I0320 15:47:27.860166 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.524650295 podStartE2EDuration="7.860148305s" podCreationTimestamp="2026-03-20 15:47:20 +0000 UTC" firstStartedPulling="2026-03-20 15:47:22.273889392 +0000 UTC m=+1459.236405192" lastFinishedPulling="2026-03-20 15:47:26.609387402 +0000 UTC m=+1463.571903202" observedRunningTime="2026-03-20 15:47:27.848503434 +0000 UTC m=+1464.811019234" watchObservedRunningTime="2026-03-20 15:47:27.860148305 +0000 UTC m=+1464.822664115" Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.322660 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.496944 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcfdf846-244e-4590-8e01-1f1c6a2ea87e-combined-ca-bundle\") pod \"dcfdf846-244e-4590-8e01-1f1c6a2ea87e\" (UID: \"dcfdf846-244e-4590-8e01-1f1c6a2ea87e\") " Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.497175 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2pxm\" (UniqueName: \"kubernetes.io/projected/dcfdf846-244e-4590-8e01-1f1c6a2ea87e-kube-api-access-w2pxm\") pod \"dcfdf846-244e-4590-8e01-1f1c6a2ea87e\" (UID: \"dcfdf846-244e-4590-8e01-1f1c6a2ea87e\") " Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.497211 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcfdf846-244e-4590-8e01-1f1c6a2ea87e-config-data\") pod \"dcfdf846-244e-4590-8e01-1f1c6a2ea87e\" (UID: \"dcfdf846-244e-4590-8e01-1f1c6a2ea87e\") " Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.497236 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcfdf846-244e-4590-8e01-1f1c6a2ea87e-logs\") pod \"dcfdf846-244e-4590-8e01-1f1c6a2ea87e\" (UID: \"dcfdf846-244e-4590-8e01-1f1c6a2ea87e\") " Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.497973 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcfdf846-244e-4590-8e01-1f1c6a2ea87e-logs" (OuterVolumeSpecName: "logs") pod "dcfdf846-244e-4590-8e01-1f1c6a2ea87e" (UID: "dcfdf846-244e-4590-8e01-1f1c6a2ea87e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.502737 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcfdf846-244e-4590-8e01-1f1c6a2ea87e-kube-api-access-w2pxm" (OuterVolumeSpecName: "kube-api-access-w2pxm") pod "dcfdf846-244e-4590-8e01-1f1c6a2ea87e" (UID: "dcfdf846-244e-4590-8e01-1f1c6a2ea87e"). InnerVolumeSpecName "kube-api-access-w2pxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.525520 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcfdf846-244e-4590-8e01-1f1c6a2ea87e-config-data" (OuterVolumeSpecName: "config-data") pod "dcfdf846-244e-4590-8e01-1f1c6a2ea87e" (UID: "dcfdf846-244e-4590-8e01-1f1c6a2ea87e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.534665 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcfdf846-244e-4590-8e01-1f1c6a2ea87e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dcfdf846-244e-4590-8e01-1f1c6a2ea87e" (UID: "dcfdf846-244e-4590-8e01-1f1c6a2ea87e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.599402 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcfdf846-244e-4590-8e01-1f1c6a2ea87e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.599433 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2pxm\" (UniqueName: \"kubernetes.io/projected/dcfdf846-244e-4590-8e01-1f1c6a2ea87e-kube-api-access-w2pxm\") on node \"crc\" DevicePath \"\"" Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.599444 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcfdf846-244e-4590-8e01-1f1c6a2ea87e-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.599455 4779 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcfdf846-244e-4590-8e01-1f1c6a2ea87e-logs\") on node \"crc\" DevicePath \"\"" Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.725077 4779 generic.go:334] "Generic (PLEG): container finished" podID="dcfdf846-244e-4590-8e01-1f1c6a2ea87e" containerID="3745bfe7d89b2d71d080b4610f9cf7ed7619c67014aa2c4ca4c7d41507719467" exitCode=0 Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.725126 4779 generic.go:334] "Generic (PLEG): container finished" podID="dcfdf846-244e-4590-8e01-1f1c6a2ea87e" containerID="5cd9d231ca57debfb092e31a535145c4acb9e8ec700da2772d55f5c9ea22fdc3" exitCode=143 Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.726008 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.734744 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dcfdf846-244e-4590-8e01-1f1c6a2ea87e","Type":"ContainerDied","Data":"3745bfe7d89b2d71d080b4610f9cf7ed7619c67014aa2c4ca4c7d41507719467"} Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.734792 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dcfdf846-244e-4590-8e01-1f1c6a2ea87e","Type":"ContainerDied","Data":"5cd9d231ca57debfb092e31a535145c4acb9e8ec700da2772d55f5c9ea22fdc3"} Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.734816 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dcfdf846-244e-4590-8e01-1f1c6a2ea87e","Type":"ContainerDied","Data":"3d4147b6a2ed755001bd8b33afdd0b39943135a58a8487569394979d79129d65"} Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.734833 4779 scope.go:117] "RemoveContainer" containerID="3745bfe7d89b2d71d080b4610f9cf7ed7619c67014aa2c4ca4c7d41507719467" Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.760719 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-f7dnw" Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.760820 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-f7dnw" Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.766072 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.769904 4779 scope.go:117] "RemoveContainer" containerID="5cd9d231ca57debfb092e31a535145c4acb9e8ec700da2772d55f5c9ea22fdc3" Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.784036 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 
15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.799635 4779 scope.go:117] "RemoveContainer" containerID="3745bfe7d89b2d71d080b4610f9cf7ed7619c67014aa2c4ca4c7d41507719467" Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.803667 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 15:47:28 crc kubenswrapper[4779]: E0320 15:47:28.804024 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3745bfe7d89b2d71d080b4610f9cf7ed7619c67014aa2c4ca4c7d41507719467\": container with ID starting with 3745bfe7d89b2d71d080b4610f9cf7ed7619c67014aa2c4ca4c7d41507719467 not found: ID does not exist" containerID="3745bfe7d89b2d71d080b4610f9cf7ed7619c67014aa2c4ca4c7d41507719467" Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.804057 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3745bfe7d89b2d71d080b4610f9cf7ed7619c67014aa2c4ca4c7d41507719467"} err="failed to get container status \"3745bfe7d89b2d71d080b4610f9cf7ed7619c67014aa2c4ca4c7d41507719467\": rpc error: code = NotFound desc = could not find container \"3745bfe7d89b2d71d080b4610f9cf7ed7619c67014aa2c4ca4c7d41507719467\": container with ID starting with 3745bfe7d89b2d71d080b4610f9cf7ed7619c67014aa2c4ca4c7d41507719467 not found: ID does not exist" Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.804085 4779 scope.go:117] "RemoveContainer" containerID="5cd9d231ca57debfb092e31a535145c4acb9e8ec700da2772d55f5c9ea22fdc3" Mar 20 15:47:28 crc kubenswrapper[4779]: E0320 15:47:28.804271 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcfdf846-244e-4590-8e01-1f1c6a2ea87e" containerName="nova-metadata-metadata" Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.804287 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcfdf846-244e-4590-8e01-1f1c6a2ea87e" containerName="nova-metadata-metadata" Mar 20 15:47:28 crc 
kubenswrapper[4779]: E0320 15:47:28.804314 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcfdf846-244e-4590-8e01-1f1c6a2ea87e" containerName="nova-metadata-log" Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.804320 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcfdf846-244e-4590-8e01-1f1c6a2ea87e" containerName="nova-metadata-log" Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.804486 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcfdf846-244e-4590-8e01-1f1c6a2ea87e" containerName="nova-metadata-log" Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.804512 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcfdf846-244e-4590-8e01-1f1c6a2ea87e" containerName="nova-metadata-metadata" Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.805715 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.810140 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 15:47:28 crc kubenswrapper[4779]: E0320 15:47:28.810201 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cd9d231ca57debfb092e31a535145c4acb9e8ec700da2772d55f5c9ea22fdc3\": container with ID starting with 5cd9d231ca57debfb092e31a535145c4acb9e8ec700da2772d55f5c9ea22fdc3 not found: ID does not exist" containerID="5cd9d231ca57debfb092e31a535145c4acb9e8ec700da2772d55f5c9ea22fdc3" Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.810227 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cd9d231ca57debfb092e31a535145c4acb9e8ec700da2772d55f5c9ea22fdc3"} err="failed to get container status \"5cd9d231ca57debfb092e31a535145c4acb9e8ec700da2772d55f5c9ea22fdc3\": rpc error: code = NotFound desc = could not find container 
\"5cd9d231ca57debfb092e31a535145c4acb9e8ec700da2772d55f5c9ea22fdc3\": container with ID starting with 5cd9d231ca57debfb092e31a535145c4acb9e8ec700da2772d55f5c9ea22fdc3 not found: ID does not exist" Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.810247 4779 scope.go:117] "RemoveContainer" containerID="3745bfe7d89b2d71d080b4610f9cf7ed7619c67014aa2c4ca4c7d41507719467" Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.811752 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.822079 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.830361 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3745bfe7d89b2d71d080b4610f9cf7ed7619c67014aa2c4ca4c7d41507719467"} err="failed to get container status \"3745bfe7d89b2d71d080b4610f9cf7ed7619c67014aa2c4ca4c7d41507719467\": rpc error: code = NotFound desc = could not find container \"3745bfe7d89b2d71d080b4610f9cf7ed7619c67014aa2c4ca4c7d41507719467\": container with ID starting with 3745bfe7d89b2d71d080b4610f9cf7ed7619c67014aa2c4ca4c7d41507719467 not found: ID does not exist" Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.830402 4779 scope.go:117] "RemoveContainer" containerID="5cd9d231ca57debfb092e31a535145c4acb9e8ec700da2772d55f5c9ea22fdc3" Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.833380 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cd9d231ca57debfb092e31a535145c4acb9e8ec700da2772d55f5c9ea22fdc3"} err="failed to get container status \"5cd9d231ca57debfb092e31a535145c4acb9e8ec700da2772d55f5c9ea22fdc3\": rpc error: code = NotFound desc = could not find container \"5cd9d231ca57debfb092e31a535145c4acb9e8ec700da2772d55f5c9ea22fdc3\": container with ID starting with 
5cd9d231ca57debfb092e31a535145c4acb9e8ec700da2772d55f5c9ea22fdc3 not found: ID does not exist" Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.905761 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eafa6891-80b8-4c90-93a5-889c82b0ca03-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"eafa6891-80b8-4c90-93a5-889c82b0ca03\") " pod="openstack/nova-metadata-0" Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.906172 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eafa6891-80b8-4c90-93a5-889c82b0ca03-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"eafa6891-80b8-4c90-93a5-889c82b0ca03\") " pod="openstack/nova-metadata-0" Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.906321 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eafa6891-80b8-4c90-93a5-889c82b0ca03-logs\") pod \"nova-metadata-0\" (UID: \"eafa6891-80b8-4c90-93a5-889c82b0ca03\") " pod="openstack/nova-metadata-0" Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.906403 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwdpm\" (UniqueName: \"kubernetes.io/projected/eafa6891-80b8-4c90-93a5-889c82b0ca03-kube-api-access-jwdpm\") pod \"nova-metadata-0\" (UID: \"eafa6891-80b8-4c90-93a5-889c82b0ca03\") " pod="openstack/nova-metadata-0" Mar 20 15:47:28 crc kubenswrapper[4779]: I0320 15:47:28.907229 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eafa6891-80b8-4c90-93a5-889c82b0ca03-config-data\") pod \"nova-metadata-0\" (UID: \"eafa6891-80b8-4c90-93a5-889c82b0ca03\") " 
pod="openstack/nova-metadata-0" Mar 20 15:47:29 crc kubenswrapper[4779]: I0320 15:47:29.008981 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eafa6891-80b8-4c90-93a5-889c82b0ca03-logs\") pod \"nova-metadata-0\" (UID: \"eafa6891-80b8-4c90-93a5-889c82b0ca03\") " pod="openstack/nova-metadata-0" Mar 20 15:47:29 crc kubenswrapper[4779]: I0320 15:47:29.009073 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwdpm\" (UniqueName: \"kubernetes.io/projected/eafa6891-80b8-4c90-93a5-889c82b0ca03-kube-api-access-jwdpm\") pod \"nova-metadata-0\" (UID: \"eafa6891-80b8-4c90-93a5-889c82b0ca03\") " pod="openstack/nova-metadata-0" Mar 20 15:47:29 crc kubenswrapper[4779]: I0320 15:47:29.009171 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eafa6891-80b8-4c90-93a5-889c82b0ca03-config-data\") pod \"nova-metadata-0\" (UID: \"eafa6891-80b8-4c90-93a5-889c82b0ca03\") " pod="openstack/nova-metadata-0" Mar 20 15:47:29 crc kubenswrapper[4779]: I0320 15:47:29.009249 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eafa6891-80b8-4c90-93a5-889c82b0ca03-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"eafa6891-80b8-4c90-93a5-889c82b0ca03\") " pod="openstack/nova-metadata-0" Mar 20 15:47:29 crc kubenswrapper[4779]: I0320 15:47:29.009281 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eafa6891-80b8-4c90-93a5-889c82b0ca03-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"eafa6891-80b8-4c90-93a5-889c82b0ca03\") " pod="openstack/nova-metadata-0" Mar 20 15:47:29 crc kubenswrapper[4779]: I0320 15:47:29.010542 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/eafa6891-80b8-4c90-93a5-889c82b0ca03-logs\") pod \"nova-metadata-0\" (UID: \"eafa6891-80b8-4c90-93a5-889c82b0ca03\") " pod="openstack/nova-metadata-0" Mar 20 15:47:29 crc kubenswrapper[4779]: I0320 15:47:29.015300 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eafa6891-80b8-4c90-93a5-889c82b0ca03-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"eafa6891-80b8-4c90-93a5-889c82b0ca03\") " pod="openstack/nova-metadata-0" Mar 20 15:47:29 crc kubenswrapper[4779]: I0320 15:47:29.015369 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eafa6891-80b8-4c90-93a5-889c82b0ca03-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"eafa6891-80b8-4c90-93a5-889c82b0ca03\") " pod="openstack/nova-metadata-0" Mar 20 15:47:29 crc kubenswrapper[4779]: I0320 15:47:29.029758 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwdpm\" (UniqueName: \"kubernetes.io/projected/eafa6891-80b8-4c90-93a5-889c82b0ca03-kube-api-access-jwdpm\") pod \"nova-metadata-0\" (UID: \"eafa6891-80b8-4c90-93a5-889c82b0ca03\") " pod="openstack/nova-metadata-0" Mar 20 15:47:29 crc kubenswrapper[4779]: I0320 15:47:29.030916 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eafa6891-80b8-4c90-93a5-889c82b0ca03-config-data\") pod \"nova-metadata-0\" (UID: \"eafa6891-80b8-4c90-93a5-889c82b0ca03\") " pod="openstack/nova-metadata-0" Mar 20 15:47:29 crc kubenswrapper[4779]: I0320 15:47:29.141748 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 15:47:29 crc kubenswrapper[4779]: I0320 15:47:29.607977 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 15:47:29 crc kubenswrapper[4779]: W0320 15:47:29.611592 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeafa6891_80b8_4c90_93a5_889c82b0ca03.slice/crio-4871a148df190ee8aeb16bae2f3a59cb01a5d78a223cf52957643f4764ee90ea WatchSource:0}: Error finding container 4871a148df190ee8aeb16bae2f3a59cb01a5d78a223cf52957643f4764ee90ea: Status 404 returned error can't find the container with id 4871a148df190ee8aeb16bae2f3a59cb01a5d78a223cf52957643f4764ee90ea Mar 20 15:47:29 crc kubenswrapper[4779]: I0320 15:47:29.736034 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eafa6891-80b8-4c90-93a5-889c82b0ca03","Type":"ContainerStarted","Data":"4871a148df190ee8aeb16bae2f3a59cb01a5d78a223cf52957643f4764ee90ea"} Mar 20 15:47:29 crc kubenswrapper[4779]: I0320 15:47:29.824094 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcfdf846-244e-4590-8e01-1f1c6a2ea87e" path="/var/lib/kubelet/pods/dcfdf846-244e-4590-8e01-1f1c6a2ea87e/volumes" Mar 20 15:47:29 crc kubenswrapper[4779]: I0320 15:47:29.830332 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-f7dnw" podUID="cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978" containerName="registry-server" probeResult="failure" output=< Mar 20 15:47:29 crc kubenswrapper[4779]: timeout: failed to connect service ":50051" within 1s Mar 20 15:47:29 crc kubenswrapper[4779]: > Mar 20 15:47:30 crc kubenswrapper[4779]: I0320 15:47:30.750004 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"eafa6891-80b8-4c90-93a5-889c82b0ca03","Type":"ContainerStarted","Data":"111a91579794bf334607b1f80b67ccbdfe253399964cc75b1c32c8aa72a47067"} Mar 20 15:47:30 crc kubenswrapper[4779]: I0320 15:47:30.750392 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eafa6891-80b8-4c90-93a5-889c82b0ca03","Type":"ContainerStarted","Data":"afe8f6516ad048a4992b5f473eec161cd7e7e994a5c76022b8e45012f60efef5"} Mar 20 15:47:30 crc kubenswrapper[4779]: I0320 15:47:30.790810 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.790790609 podStartE2EDuration="2.790790609s" podCreationTimestamp="2026-03-20 15:47:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:47:30.777025986 +0000 UTC m=+1467.739541796" watchObservedRunningTime="2026-03-20 15:47:30.790790609 +0000 UTC m=+1467.753306409" Mar 20 15:47:30 crc kubenswrapper[4779]: I0320 15:47:30.933542 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 15:47:30 crc kubenswrapper[4779]: I0320 15:47:30.933594 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 15:47:30 crc kubenswrapper[4779]: I0320 15:47:30.962202 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 15:47:31 crc kubenswrapper[4779]: I0320 15:47:31.040346 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 15:47:31 crc kubenswrapper[4779]: I0320 15:47:31.048552 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 15:47:31 crc kubenswrapper[4779]: I0320 15:47:31.180038 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:47:31 crc kubenswrapper[4779]: I0320 15:47:31.763325 4779 generic.go:334] "Generic (PLEG): container finished" podID="e379766f-391c-4e7f-8a69-84d1b624d88b" containerID="c6f752d508e988eaab31e57b3e593cfcccece950051706320499670d5c9c74d2" exitCode=0 Mar 20 15:47:31 crc kubenswrapper[4779]: I0320 15:47:31.765497 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-wrqkc" event={"ID":"e379766f-391c-4e7f-8a69-84d1b624d88b","Type":"ContainerDied","Data":"c6f752d508e988eaab31e57b3e593cfcccece950051706320499670d5c9c74d2"} Mar 20 15:47:31 crc kubenswrapper[4779]: I0320 15:47:31.799003 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 15:47:32 crc kubenswrapper[4779]: I0320 15:47:32.122579 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="40030b72-3377-4060-9a6f-c25011e942ee" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.211:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 15:47:32 crc kubenswrapper[4779]: I0320 15:47:32.123266 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="40030b72-3377-4060-9a6f-c25011e942ee" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.211:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 15:47:32 crc kubenswrapper[4779]: I0320 15:47:32.777047 4779 generic.go:334] "Generic (PLEG): container finished" podID="26c50037-e945-4f6a-8e49-20b4a969a77e" containerID="08368ff48bb95bff64cc68c56cc8e79d1b18031a02cc9749742ca2395f65d492" exitCode=0 Mar 20 15:47:32 crc kubenswrapper[4779]: I0320 15:47:32.777090 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dzzmf" 
event={"ID":"26c50037-e945-4f6a-8e49-20b4a969a77e","Type":"ContainerDied","Data":"08368ff48bb95bff64cc68c56cc8e79d1b18031a02cc9749742ca2395f65d492"} Mar 20 15:47:32 crc kubenswrapper[4779]: I0320 15:47:32.785816 4779 generic.go:334] "Generic (PLEG): container finished" podID="64980a81-ec28-4991-924c-b47e95d3fd2e" containerID="51386384fcec7d0366ed3b260c41104c17bf2c7e95563503d509ec6be330d132" exitCode=137 Mar 20 15:47:32 crc kubenswrapper[4779]: I0320 15:47:32.785890 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64980a81-ec28-4991-924c-b47e95d3fd2e","Type":"ContainerDied","Data":"51386384fcec7d0366ed3b260c41104c17bf2c7e95563503d509ec6be330d132"} Mar 20 15:47:32 crc kubenswrapper[4779]: I0320 15:47:32.785944 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64980a81-ec28-4991-924c-b47e95d3fd2e","Type":"ContainerDied","Data":"468935705afaa24fbce74ec2e86c1a483312263fb03fe7917706b64b210b9326"} Mar 20 15:47:32 crc kubenswrapper[4779]: I0320 15:47:32.785961 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="468935705afaa24fbce74ec2e86c1a483312263fb03fe7917706b64b210b9326" Mar 20 15:47:32 crc kubenswrapper[4779]: I0320 15:47:32.819323 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:47:32 crc kubenswrapper[4779]: I0320 15:47:32.987544 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/64980a81-ec28-4991-924c-b47e95d3fd2e-sg-core-conf-yaml\") pod \"64980a81-ec28-4991-924c-b47e95d3fd2e\" (UID: \"64980a81-ec28-4991-924c-b47e95d3fd2e\") " Mar 20 15:47:32 crc kubenswrapper[4779]: I0320 15:47:32.987614 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64980a81-ec28-4991-924c-b47e95d3fd2e-scripts\") pod \"64980a81-ec28-4991-924c-b47e95d3fd2e\" (UID: \"64980a81-ec28-4991-924c-b47e95d3fd2e\") " Mar 20 15:47:32 crc kubenswrapper[4779]: I0320 15:47:32.987652 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64980a81-ec28-4991-924c-b47e95d3fd2e-run-httpd\") pod \"64980a81-ec28-4991-924c-b47e95d3fd2e\" (UID: \"64980a81-ec28-4991-924c-b47e95d3fd2e\") " Mar 20 15:47:32 crc kubenswrapper[4779]: I0320 15:47:32.987735 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64980a81-ec28-4991-924c-b47e95d3fd2e-log-httpd\") pod \"64980a81-ec28-4991-924c-b47e95d3fd2e\" (UID: \"64980a81-ec28-4991-924c-b47e95d3fd2e\") " Mar 20 15:47:32 crc kubenswrapper[4779]: I0320 15:47:32.987805 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sm2ph\" (UniqueName: \"kubernetes.io/projected/64980a81-ec28-4991-924c-b47e95d3fd2e-kube-api-access-sm2ph\") pod \"64980a81-ec28-4991-924c-b47e95d3fd2e\" (UID: \"64980a81-ec28-4991-924c-b47e95d3fd2e\") " Mar 20 15:47:32 crc kubenswrapper[4779]: I0320 15:47:32.987865 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/64980a81-ec28-4991-924c-b47e95d3fd2e-combined-ca-bundle\") pod \"64980a81-ec28-4991-924c-b47e95d3fd2e\" (UID: \"64980a81-ec28-4991-924c-b47e95d3fd2e\") " Mar 20 15:47:32 crc kubenswrapper[4779]: I0320 15:47:32.987996 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64980a81-ec28-4991-924c-b47e95d3fd2e-config-data\") pod \"64980a81-ec28-4991-924c-b47e95d3fd2e\" (UID: \"64980a81-ec28-4991-924c-b47e95d3fd2e\") " Mar 20 15:47:32 crc kubenswrapper[4779]: I0320 15:47:32.990823 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64980a81-ec28-4991-924c-b47e95d3fd2e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "64980a81-ec28-4991-924c-b47e95d3fd2e" (UID: "64980a81-ec28-4991-924c-b47e95d3fd2e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:47:32 crc kubenswrapper[4779]: I0320 15:47:32.990936 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64980a81-ec28-4991-924c-b47e95d3fd2e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "64980a81-ec28-4991-924c-b47e95d3fd2e" (UID: "64980a81-ec28-4991-924c-b47e95d3fd2e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:47:32 crc kubenswrapper[4779]: I0320 15:47:32.996306 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64980a81-ec28-4991-924c-b47e95d3fd2e-scripts" (OuterVolumeSpecName: "scripts") pod "64980a81-ec28-4991-924c-b47e95d3fd2e" (UID: "64980a81-ec28-4991-924c-b47e95d3fd2e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:47:32 crc kubenswrapper[4779]: I0320 15:47:32.996418 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64980a81-ec28-4991-924c-b47e95d3fd2e-kube-api-access-sm2ph" (OuterVolumeSpecName: "kube-api-access-sm2ph") pod "64980a81-ec28-4991-924c-b47e95d3fd2e" (UID: "64980a81-ec28-4991-924c-b47e95d3fd2e"). InnerVolumeSpecName "kube-api-access-sm2ph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.024135 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64980a81-ec28-4991-924c-b47e95d3fd2e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "64980a81-ec28-4991-924c-b47e95d3fd2e" (UID: "64980a81-ec28-4991-924c-b47e95d3fd2e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.080457 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64980a81-ec28-4991-924c-b47e95d3fd2e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64980a81-ec28-4991-924c-b47e95d3fd2e" (UID: "64980a81-ec28-4991-924c-b47e95d3fd2e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.090094 4779 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/64980a81-ec28-4991-924c-b47e95d3fd2e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.090169 4779 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64980a81-ec28-4991-924c-b47e95d3fd2e-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.090178 4779 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64980a81-ec28-4991-924c-b47e95d3fd2e-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.090185 4779 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64980a81-ec28-4991-924c-b47e95d3fd2e-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.090193 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sm2ph\" (UniqueName: \"kubernetes.io/projected/64980a81-ec28-4991-924c-b47e95d3fd2e-kube-api-access-sm2ph\") on node \"crc\" DevicePath \"\"" Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.090203 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64980a81-ec28-4991-924c-b47e95d3fd2e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.095076 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-wrqkc" Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.121198 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64980a81-ec28-4991-924c-b47e95d3fd2e-config-data" (OuterVolumeSpecName: "config-data") pod "64980a81-ec28-4991-924c-b47e95d3fd2e" (UID: "64980a81-ec28-4991-924c-b47e95d3fd2e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.191250 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e379766f-391c-4e7f-8a69-84d1b624d88b-scripts\") pod \"e379766f-391c-4e7f-8a69-84d1b624d88b\" (UID: \"e379766f-391c-4e7f-8a69-84d1b624d88b\") " Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.191335 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wc2bx\" (UniqueName: \"kubernetes.io/projected/e379766f-391c-4e7f-8a69-84d1b624d88b-kube-api-access-wc2bx\") pod \"e379766f-391c-4e7f-8a69-84d1b624d88b\" (UID: \"e379766f-391c-4e7f-8a69-84d1b624d88b\") " Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.191421 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e379766f-391c-4e7f-8a69-84d1b624d88b-config-data\") pod \"e379766f-391c-4e7f-8a69-84d1b624d88b\" (UID: \"e379766f-391c-4e7f-8a69-84d1b624d88b\") " Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.191453 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e379766f-391c-4e7f-8a69-84d1b624d88b-combined-ca-bundle\") pod \"e379766f-391c-4e7f-8a69-84d1b624d88b\" (UID: \"e379766f-391c-4e7f-8a69-84d1b624d88b\") " Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.192042 4779 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64980a81-ec28-4991-924c-b47e95d3fd2e-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.195083 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e379766f-391c-4e7f-8a69-84d1b624d88b-kube-api-access-wc2bx" (OuterVolumeSpecName: "kube-api-access-wc2bx") pod "e379766f-391c-4e7f-8a69-84d1b624d88b" (UID: "e379766f-391c-4e7f-8a69-84d1b624d88b"). InnerVolumeSpecName "kube-api-access-wc2bx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.196023 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e379766f-391c-4e7f-8a69-84d1b624d88b-scripts" (OuterVolumeSpecName: "scripts") pod "e379766f-391c-4e7f-8a69-84d1b624d88b" (UID: "e379766f-391c-4e7f-8a69-84d1b624d88b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:47:33 crc kubenswrapper[4779]: E0320 15:47:33.219973 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e379766f-391c-4e7f-8a69-84d1b624d88b-config-data podName:e379766f-391c-4e7f-8a69-84d1b624d88b nodeName:}" failed. No retries permitted until 2026-03-20 15:47:33.719949826 +0000 UTC m=+1470.682465626 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/e379766f-391c-4e7f-8a69-84d1b624d88b-config-data") pod "e379766f-391c-4e7f-8a69-84d1b624d88b" (UID: "e379766f-391c-4e7f-8a69-84d1b624d88b") : error deleting /var/lib/kubelet/pods/e379766f-391c-4e7f-8a69-84d1b624d88b/volume-subpaths: remove /var/lib/kubelet/pods/e379766f-391c-4e7f-8a69-84d1b624d88b/volume-subpaths: no such file or directory Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.222936 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e379766f-391c-4e7f-8a69-84d1b624d88b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e379766f-391c-4e7f-8a69-84d1b624d88b" (UID: "e379766f-391c-4e7f-8a69-84d1b624d88b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.294048 4779 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e379766f-391c-4e7f-8a69-84d1b624d88b-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.294565 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wc2bx\" (UniqueName: \"kubernetes.io/projected/e379766f-391c-4e7f-8a69-84d1b624d88b-kube-api-access-wc2bx\") on node \"crc\" DevicePath \"\"" Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.294658 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e379766f-391c-4e7f-8a69-84d1b624d88b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.795806 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-wrqkc" 
event={"ID":"e379766f-391c-4e7f-8a69-84d1b624d88b","Type":"ContainerDied","Data":"6a272545ac96a73fb2f314dce61eb63c084fcfb5fa644e38ef300df63c03639a"} Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.796080 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a272545ac96a73fb2f314dce61eb63c084fcfb5fa644e38ef300df63c03639a" Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.795851 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-wrqkc" Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.795833 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.804093 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e379766f-391c-4e7f-8a69-84d1b624d88b-config-data\") pod \"e379766f-391c-4e7f-8a69-84d1b624d88b\" (UID: \"e379766f-391c-4e7f-8a69-84d1b624d88b\") " Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.812947 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e379766f-391c-4e7f-8a69-84d1b624d88b-config-data" (OuterVolumeSpecName: "config-data") pod "e379766f-391c-4e7f-8a69-84d1b624d88b" (UID: "e379766f-391c-4e7f-8a69-84d1b624d88b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.902166 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.919227 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e379766f-391c-4e7f-8a69-84d1b624d88b-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.923514 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.955679 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:47:33 crc kubenswrapper[4779]: E0320 15:47:33.957549 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64980a81-ec28-4991-924c-b47e95d3fd2e" containerName="ceilometer-notification-agent" Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.957590 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="64980a81-ec28-4991-924c-b47e95d3fd2e" containerName="ceilometer-notification-agent" Mar 20 15:47:33 crc kubenswrapper[4779]: E0320 15:47:33.957627 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e379766f-391c-4e7f-8a69-84d1b624d88b" containerName="nova-manage" Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.957636 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="e379766f-391c-4e7f-8a69-84d1b624d88b" containerName="nova-manage" Mar 20 15:47:33 crc kubenswrapper[4779]: E0320 15:47:33.957674 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64980a81-ec28-4991-924c-b47e95d3fd2e" containerName="ceilometer-central-agent" Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.957680 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="64980a81-ec28-4991-924c-b47e95d3fd2e" containerName="ceilometer-central-agent" Mar 20 15:47:33 
crc kubenswrapper[4779]: E0320 15:47:33.957701 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64980a81-ec28-4991-924c-b47e95d3fd2e" containerName="proxy-httpd" Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.957707 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="64980a81-ec28-4991-924c-b47e95d3fd2e" containerName="proxy-httpd" Mar 20 15:47:33 crc kubenswrapper[4779]: E0320 15:47:33.957719 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64980a81-ec28-4991-924c-b47e95d3fd2e" containerName="sg-core" Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.957726 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="64980a81-ec28-4991-924c-b47e95d3fd2e" containerName="sg-core" Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.958053 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="e379766f-391c-4e7f-8a69-84d1b624d88b" containerName="nova-manage" Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.958096 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="64980a81-ec28-4991-924c-b47e95d3fd2e" containerName="proxy-httpd" Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.958190 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="64980a81-ec28-4991-924c-b47e95d3fd2e" containerName="ceilometer-central-agent" Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.958209 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="64980a81-ec28-4991-924c-b47e95d3fd2e" containerName="sg-core" Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.958227 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="64980a81-ec28-4991-924c-b47e95d3fd2e" containerName="ceilometer-notification-agent" Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.962540 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.965347 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 15:47:33 crc kubenswrapper[4779]: I0320 15:47:33.965772 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:33.991701 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.126535 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/216a780a-37b0-4a7a-ace0-4ead2afe215a-config-data\") pod \"ceilometer-0\" (UID: \"216a780a-37b0-4a7a-ace0-4ead2afe215a\") " pod="openstack/ceilometer-0" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.126601 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/216a780a-37b0-4a7a-ace0-4ead2afe215a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"216a780a-37b0-4a7a-ace0-4ead2afe215a\") " pod="openstack/ceilometer-0" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.126634 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/216a780a-37b0-4a7a-ace0-4ead2afe215a-run-httpd\") pod \"ceilometer-0\" (UID: \"216a780a-37b0-4a7a-ace0-4ead2afe215a\") " pod="openstack/ceilometer-0" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.126654 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/216a780a-37b0-4a7a-ace0-4ead2afe215a-scripts\") pod \"ceilometer-0\" (UID: \"216a780a-37b0-4a7a-ace0-4ead2afe215a\") " 
pod="openstack/ceilometer-0" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.126700 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/216a780a-37b0-4a7a-ace0-4ead2afe215a-log-httpd\") pod \"ceilometer-0\" (UID: \"216a780a-37b0-4a7a-ace0-4ead2afe215a\") " pod="openstack/ceilometer-0" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.126723 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/216a780a-37b0-4a7a-ace0-4ead2afe215a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"216a780a-37b0-4a7a-ace0-4ead2afe215a\") " pod="openstack/ceilometer-0" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.126845 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6qhj\" (UniqueName: \"kubernetes.io/projected/216a780a-37b0-4a7a-ace0-4ead2afe215a-kube-api-access-x6qhj\") pod \"ceilometer-0\" (UID: \"216a780a-37b0-4a7a-ace0-4ead2afe215a\") " pod="openstack/ceilometer-0" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.148412 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.148675 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="40030b72-3377-4060-9a6f-c25011e942ee" containerName="nova-api-log" containerID="cri-o://4c8dd4a2f5c38849ef9b520a8e33cb3d263b408e3ddf028f5dcbcc99f30f3d73" gracePeriod=30 Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.149404 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="40030b72-3377-4060-9a6f-c25011e942ee" containerName="nova-api-api" containerID="cri-o://3ce7bd33a3c642d108c31e0f5c47a2d0d8e37b456af36a59c6189d92ac81e44e" gracePeriod=30 Mar 
20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.207821 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.208049 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="2be8283d-3250-4acc-82f9-9547abbaf7eb" containerName="nova-scheduler-scheduler" containerID="cri-o://ff8788ad8daa49fa500be9bd22ab80e288b72246ced186231e1c4724e8ed7a82" gracePeriod=30 Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.229521 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6qhj\" (UniqueName: \"kubernetes.io/projected/216a780a-37b0-4a7a-ace0-4ead2afe215a-kube-api-access-x6qhj\") pod \"ceilometer-0\" (UID: \"216a780a-37b0-4a7a-ace0-4ead2afe215a\") " pod="openstack/ceilometer-0" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.229616 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/216a780a-37b0-4a7a-ace0-4ead2afe215a-config-data\") pod \"ceilometer-0\" (UID: \"216a780a-37b0-4a7a-ace0-4ead2afe215a\") " pod="openstack/ceilometer-0" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.229650 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/216a780a-37b0-4a7a-ace0-4ead2afe215a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"216a780a-37b0-4a7a-ace0-4ead2afe215a\") " pod="openstack/ceilometer-0" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.229679 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/216a780a-37b0-4a7a-ace0-4ead2afe215a-run-httpd\") pod \"ceilometer-0\" (UID: \"216a780a-37b0-4a7a-ace0-4ead2afe215a\") " pod="openstack/ceilometer-0" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.229699 
4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/216a780a-37b0-4a7a-ace0-4ead2afe215a-scripts\") pod \"ceilometer-0\" (UID: \"216a780a-37b0-4a7a-ace0-4ead2afe215a\") " pod="openstack/ceilometer-0" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.229741 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/216a780a-37b0-4a7a-ace0-4ead2afe215a-log-httpd\") pod \"ceilometer-0\" (UID: \"216a780a-37b0-4a7a-ace0-4ead2afe215a\") " pod="openstack/ceilometer-0" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.229764 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/216a780a-37b0-4a7a-ace0-4ead2afe215a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"216a780a-37b0-4a7a-ace0-4ead2afe215a\") " pod="openstack/ceilometer-0" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.232782 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/216a780a-37b0-4a7a-ace0-4ead2afe215a-run-httpd\") pod \"ceilometer-0\" (UID: \"216a780a-37b0-4a7a-ace0-4ead2afe215a\") " pod="openstack/ceilometer-0" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.233641 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/216a780a-37b0-4a7a-ace0-4ead2afe215a-log-httpd\") pod \"ceilometer-0\" (UID: \"216a780a-37b0-4a7a-ace0-4ead2afe215a\") " pod="openstack/ceilometer-0" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.234745 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/216a780a-37b0-4a7a-ace0-4ead2afe215a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"216a780a-37b0-4a7a-ace0-4ead2afe215a\") " 
pod="openstack/ceilometer-0" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.235329 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/216a780a-37b0-4a7a-ace0-4ead2afe215a-scripts\") pod \"ceilometer-0\" (UID: \"216a780a-37b0-4a7a-ace0-4ead2afe215a\") " pod="openstack/ceilometer-0" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.235796 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/216a780a-37b0-4a7a-ace0-4ead2afe215a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"216a780a-37b0-4a7a-ace0-4ead2afe215a\") " pod="openstack/ceilometer-0" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.236731 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/216a780a-37b0-4a7a-ace0-4ead2afe215a-config-data\") pod \"ceilometer-0\" (UID: \"216a780a-37b0-4a7a-ace0-4ead2afe215a\") " pod="openstack/ceilometer-0" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.239395 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.239929 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="eafa6891-80b8-4c90-93a5-889c82b0ca03" containerName="nova-metadata-metadata" containerID="cri-o://111a91579794bf334607b1f80b67ccbdfe253399964cc75b1c32c8aa72a47067" gracePeriod=30 Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.239860 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="eafa6891-80b8-4c90-93a5-889c82b0ca03" containerName="nova-metadata-log" containerID="cri-o://afe8f6516ad048a4992b5f473eec161cd7e7e994a5c76022b8e45012f60efef5" gracePeriod=30 Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.268776 4779 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6qhj\" (UniqueName: \"kubernetes.io/projected/216a780a-37b0-4a7a-ace0-4ead2afe215a-kube-api-access-x6qhj\") pod \"ceilometer-0\" (UID: \"216a780a-37b0-4a7a-ace0-4ead2afe215a\") " pod="openstack/ceilometer-0" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.294754 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.581745 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dzzmf" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.738758 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26c50037-e945-4f6a-8e49-20b4a969a77e-scripts\") pod \"26c50037-e945-4f6a-8e49-20b4a969a77e\" (UID: \"26c50037-e945-4f6a-8e49-20b4a969a77e\") " Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.738860 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26c50037-e945-4f6a-8e49-20b4a969a77e-config-data\") pod \"26c50037-e945-4f6a-8e49-20b4a969a77e\" (UID: \"26c50037-e945-4f6a-8e49-20b4a969a77e\") " Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.738891 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hl2pk\" (UniqueName: \"kubernetes.io/projected/26c50037-e945-4f6a-8e49-20b4a969a77e-kube-api-access-hl2pk\") pod \"26c50037-e945-4f6a-8e49-20b4a969a77e\" (UID: \"26c50037-e945-4f6a-8e49-20b4a969a77e\") " Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.738973 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26c50037-e945-4f6a-8e49-20b4a969a77e-combined-ca-bundle\") pod 
\"26c50037-e945-4f6a-8e49-20b4a969a77e\" (UID: \"26c50037-e945-4f6a-8e49-20b4a969a77e\") " Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.743415 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26c50037-e945-4f6a-8e49-20b4a969a77e-scripts" (OuterVolumeSpecName: "scripts") pod "26c50037-e945-4f6a-8e49-20b4a969a77e" (UID: "26c50037-e945-4f6a-8e49-20b4a969a77e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.744061 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26c50037-e945-4f6a-8e49-20b4a969a77e-kube-api-access-hl2pk" (OuterVolumeSpecName: "kube-api-access-hl2pk") pod "26c50037-e945-4f6a-8e49-20b4a969a77e" (UID: "26c50037-e945-4f6a-8e49-20b4a969a77e"). InnerVolumeSpecName "kube-api-access-hl2pk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.776672 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26c50037-e945-4f6a-8e49-20b4a969a77e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "26c50037-e945-4f6a-8e49-20b4a969a77e" (UID: "26c50037-e945-4f6a-8e49-20b4a969a77e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.799322 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.806593 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26c50037-e945-4f6a-8e49-20b4a969a77e-config-data" (OuterVolumeSpecName: "config-data") pod "26c50037-e945-4f6a-8e49-20b4a969a77e" (UID: "26c50037-e945-4f6a-8e49-20b4a969a77e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.836865 4779 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.837308 4779 generic.go:334] "Generic (PLEG): container finished" podID="40030b72-3377-4060-9a6f-c25011e942ee" containerID="4c8dd4a2f5c38849ef9b520a8e33cb3d263b408e3ddf028f5dcbcc99f30f3d73" exitCode=143 Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.837357 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"40030b72-3377-4060-9a6f-c25011e942ee","Type":"ContainerDied","Data":"4c8dd4a2f5c38849ef9b520a8e33cb3d263b408e3ddf028f5dcbcc99f30f3d73"} Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.841852 4779 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26c50037-e945-4f6a-8e49-20b4a969a77e-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.841878 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26c50037-e945-4f6a-8e49-20b4a969a77e-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.841888 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hl2pk\" (UniqueName: \"kubernetes.io/projected/26c50037-e945-4f6a-8e49-20b4a969a77e-kube-api-access-hl2pk\") on node \"crc\" DevicePath \"\"" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.841898 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26c50037-e945-4f6a-8e49-20b4a969a77e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.845622 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-conductor-db-sync-dzzmf" event={"ID":"26c50037-e945-4f6a-8e49-20b4a969a77e","Type":"ContainerDied","Data":"acd4f903532591bb24ba1ea698ec8ada480d6cb67d4b6d5e2893054b613c723b"} Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.845657 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acd4f903532591bb24ba1ea698ec8ada480d6cb67d4b6d5e2893054b613c723b" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.845737 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dzzmf" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.848933 4779 generic.go:334] "Generic (PLEG): container finished" podID="eafa6891-80b8-4c90-93a5-889c82b0ca03" containerID="111a91579794bf334607b1f80b67ccbdfe253399964cc75b1c32c8aa72a47067" exitCode=0 Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.848961 4779 generic.go:334] "Generic (PLEG): container finished" podID="eafa6891-80b8-4c90-93a5-889c82b0ca03" containerID="afe8f6516ad048a4992b5f473eec161cd7e7e994a5c76022b8e45012f60efef5" exitCode=143 Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.848983 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eafa6891-80b8-4c90-93a5-889c82b0ca03","Type":"ContainerDied","Data":"111a91579794bf334607b1f80b67ccbdfe253399964cc75b1c32c8aa72a47067"} Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.849006 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eafa6891-80b8-4c90-93a5-889c82b0ca03","Type":"ContainerDied","Data":"afe8f6516ad048a4992b5f473eec161cd7e7e994a5c76022b8e45012f60efef5"} Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.849017 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"eafa6891-80b8-4c90-93a5-889c82b0ca03","Type":"ContainerDied","Data":"4871a148df190ee8aeb16bae2f3a59cb01a5d78a223cf52957643f4764ee90ea"} Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.849025 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4871a148df190ee8aeb16bae2f3a59cb01a5d78a223cf52957643f4764ee90ea" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.900378 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 15:47:34 crc kubenswrapper[4779]: E0320 15:47:34.900841 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26c50037-e945-4f6a-8e49-20b4a969a77e" containerName="nova-cell1-conductor-db-sync" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.900863 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="26c50037-e945-4f6a-8e49-20b4a969a77e" containerName="nova-cell1-conductor-db-sync" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.901083 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="26c50037-e945-4f6a-8e49-20b4a969a77e" containerName="nova-cell1-conductor-db-sync" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.901770 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.903881 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.917766 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 15:47:34 crc kubenswrapper[4779]: I0320 15:47:34.940506 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 15:47:35 crc kubenswrapper[4779]: I0320 15:47:35.045624 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eafa6891-80b8-4c90-93a5-889c82b0ca03-logs\") pod \"eafa6891-80b8-4c90-93a5-889c82b0ca03\" (UID: \"eafa6891-80b8-4c90-93a5-889c82b0ca03\") " Mar 20 15:47:35 crc kubenswrapper[4779]: I0320 15:47:35.045679 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eafa6891-80b8-4c90-93a5-889c82b0ca03-config-data\") pod \"eafa6891-80b8-4c90-93a5-889c82b0ca03\" (UID: \"eafa6891-80b8-4c90-93a5-889c82b0ca03\") " Mar 20 15:47:35 crc kubenswrapper[4779]: I0320 15:47:35.045842 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eafa6891-80b8-4c90-93a5-889c82b0ca03-combined-ca-bundle\") pod \"eafa6891-80b8-4c90-93a5-889c82b0ca03\" (UID: \"eafa6891-80b8-4c90-93a5-889c82b0ca03\") " Mar 20 15:47:35 crc kubenswrapper[4779]: I0320 15:47:35.045934 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eafa6891-80b8-4c90-93a5-889c82b0ca03-nova-metadata-tls-certs\") pod \"eafa6891-80b8-4c90-93a5-889c82b0ca03\" (UID: \"eafa6891-80b8-4c90-93a5-889c82b0ca03\") " Mar 20 15:47:35 crc kubenswrapper[4779]: I0320 15:47:35.045974 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwdpm\" (UniqueName: \"kubernetes.io/projected/eafa6891-80b8-4c90-93a5-889c82b0ca03-kube-api-access-jwdpm\") pod \"eafa6891-80b8-4c90-93a5-889c82b0ca03\" (UID: \"eafa6891-80b8-4c90-93a5-889c82b0ca03\") " Mar 20 15:47:35 crc kubenswrapper[4779]: I0320 15:47:35.046261 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/54560a34-b088-437c-8fbc-66c38e38a81a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"54560a34-b088-437c-8fbc-66c38e38a81a\") " pod="openstack/nova-cell1-conductor-0" Mar 20 15:47:35 crc kubenswrapper[4779]: I0320 15:47:35.046482 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvcfr\" (UniqueName: \"kubernetes.io/projected/54560a34-b088-437c-8fbc-66c38e38a81a-kube-api-access-bvcfr\") pod \"nova-cell1-conductor-0\" (UID: \"54560a34-b088-437c-8fbc-66c38e38a81a\") " pod="openstack/nova-cell1-conductor-0" Mar 20 15:47:35 crc kubenswrapper[4779]: I0320 15:47:35.046555 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54560a34-b088-437c-8fbc-66c38e38a81a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"54560a34-b088-437c-8fbc-66c38e38a81a\") " pod="openstack/nova-cell1-conductor-0" Mar 20 15:47:35 crc kubenswrapper[4779]: I0320 15:47:35.046882 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eafa6891-80b8-4c90-93a5-889c82b0ca03-logs" (OuterVolumeSpecName: "logs") pod "eafa6891-80b8-4c90-93a5-889c82b0ca03" (UID: "eafa6891-80b8-4c90-93a5-889c82b0ca03"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:47:35 crc kubenswrapper[4779]: I0320 15:47:35.052970 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eafa6891-80b8-4c90-93a5-889c82b0ca03-kube-api-access-jwdpm" (OuterVolumeSpecName: "kube-api-access-jwdpm") pod "eafa6891-80b8-4c90-93a5-889c82b0ca03" (UID: "eafa6891-80b8-4c90-93a5-889c82b0ca03"). InnerVolumeSpecName "kube-api-access-jwdpm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:47:35 crc kubenswrapper[4779]: I0320 15:47:35.075147 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eafa6891-80b8-4c90-93a5-889c82b0ca03-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eafa6891-80b8-4c90-93a5-889c82b0ca03" (UID: "eafa6891-80b8-4c90-93a5-889c82b0ca03"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:47:35 crc kubenswrapper[4779]: I0320 15:47:35.077296 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eafa6891-80b8-4c90-93a5-889c82b0ca03-config-data" (OuterVolumeSpecName: "config-data") pod "eafa6891-80b8-4c90-93a5-889c82b0ca03" (UID: "eafa6891-80b8-4c90-93a5-889c82b0ca03"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:47:35 crc kubenswrapper[4779]: I0320 15:47:35.108628 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eafa6891-80b8-4c90-93a5-889c82b0ca03-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "eafa6891-80b8-4c90-93a5-889c82b0ca03" (UID: "eafa6891-80b8-4c90-93a5-889c82b0ca03"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:47:35 crc kubenswrapper[4779]: I0320 15:47:35.147804 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvcfr\" (UniqueName: \"kubernetes.io/projected/54560a34-b088-437c-8fbc-66c38e38a81a-kube-api-access-bvcfr\") pod \"nova-cell1-conductor-0\" (UID: \"54560a34-b088-437c-8fbc-66c38e38a81a\") " pod="openstack/nova-cell1-conductor-0" Mar 20 15:47:35 crc kubenswrapper[4779]: I0320 15:47:35.147890 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54560a34-b088-437c-8fbc-66c38e38a81a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"54560a34-b088-437c-8fbc-66c38e38a81a\") " pod="openstack/nova-cell1-conductor-0" Mar 20 15:47:35 crc kubenswrapper[4779]: I0320 15:47:35.147955 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54560a34-b088-437c-8fbc-66c38e38a81a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"54560a34-b088-437c-8fbc-66c38e38a81a\") " pod="openstack/nova-cell1-conductor-0" Mar 20 15:47:35 crc kubenswrapper[4779]: I0320 15:47:35.148120 4779 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eafa6891-80b8-4c90-93a5-889c82b0ca03-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 15:47:35 crc kubenswrapper[4779]: I0320 15:47:35.148137 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwdpm\" (UniqueName: \"kubernetes.io/projected/eafa6891-80b8-4c90-93a5-889c82b0ca03-kube-api-access-jwdpm\") on node \"crc\" DevicePath \"\"" Mar 20 15:47:35 crc kubenswrapper[4779]: I0320 15:47:35.148149 4779 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eafa6891-80b8-4c90-93a5-889c82b0ca03-logs\") on node \"crc\" 
DevicePath \"\"" Mar 20 15:47:35 crc kubenswrapper[4779]: I0320 15:47:35.148159 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eafa6891-80b8-4c90-93a5-889c82b0ca03-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:47:35 crc kubenswrapper[4779]: I0320 15:47:35.148169 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eafa6891-80b8-4c90-93a5-889c82b0ca03-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:47:35 crc kubenswrapper[4779]: I0320 15:47:35.151198 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54560a34-b088-437c-8fbc-66c38e38a81a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"54560a34-b088-437c-8fbc-66c38e38a81a\") " pod="openstack/nova-cell1-conductor-0" Mar 20 15:47:35 crc kubenswrapper[4779]: I0320 15:47:35.155385 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54560a34-b088-437c-8fbc-66c38e38a81a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"54560a34-b088-437c-8fbc-66c38e38a81a\") " pod="openstack/nova-cell1-conductor-0" Mar 20 15:47:35 crc kubenswrapper[4779]: I0320 15:47:35.164187 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvcfr\" (UniqueName: \"kubernetes.io/projected/54560a34-b088-437c-8fbc-66c38e38a81a-kube-api-access-bvcfr\") pod \"nova-cell1-conductor-0\" (UID: \"54560a34-b088-437c-8fbc-66c38e38a81a\") " pod="openstack/nova-cell1-conductor-0" Mar 20 15:47:35 crc kubenswrapper[4779]: I0320 15:47:35.257298 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:35.830721 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64980a81-ec28-4991-924c-b47e95d3fd2e" path="/var/lib/kubelet/pods/64980a81-ec28-4991-924c-b47e95d3fd2e/volumes" Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:35.863015 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:35.864505 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"216a780a-37b0-4a7a-ace0-4ead2afe215a","Type":"ContainerStarted","Data":"3dc6b2ac12ca230e6910a87e6f8c84a12d062195c027f424025ed1dcaab74644"} Mar 20 15:47:36 crc kubenswrapper[4779]: E0320 15:47:35.937200 4779 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ff8788ad8daa49fa500be9bd22ab80e288b72246ced186231e1c4724e8ed7a82" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 15:47:36 crc kubenswrapper[4779]: E0320 15:47:35.941315 4779 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ff8788ad8daa49fa500be9bd22ab80e288b72246ced186231e1c4724e8ed7a82" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 15:47:36 crc kubenswrapper[4779]: E0320 15:47:35.944201 4779 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ff8788ad8daa49fa500be9bd22ab80e288b72246ced186231e1c4724e8ed7a82" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 15:47:36 crc 
kubenswrapper[4779]: E0320 15:47:35.944243 4779 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="2be8283d-3250-4acc-82f9-9547abbaf7eb" containerName="nova-scheduler-scheduler" Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:35.998155 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.031939 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.045237 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 15:47:36 crc kubenswrapper[4779]: E0320 15:47:36.045762 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eafa6891-80b8-4c90-93a5-889c82b0ca03" containerName="nova-metadata-log" Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.045776 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="eafa6891-80b8-4c90-93a5-889c82b0ca03" containerName="nova-metadata-log" Mar 20 15:47:36 crc kubenswrapper[4779]: E0320 15:47:36.045803 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eafa6891-80b8-4c90-93a5-889c82b0ca03" containerName="nova-metadata-metadata" Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.045809 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="eafa6891-80b8-4c90-93a5-889c82b0ca03" containerName="nova-metadata-metadata" Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.046043 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="eafa6891-80b8-4c90-93a5-889c82b0ca03" containerName="nova-metadata-metadata" Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.046053 4779 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="eafa6891-80b8-4c90-93a5-889c82b0ca03" containerName="nova-metadata-log" Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.047181 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.051086 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.051347 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.060500 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.166569 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bc56880-51cd-416e-a08c-1584fd7b5091-logs\") pod \"nova-metadata-0\" (UID: \"0bc56880-51cd-416e-a08c-1584fd7b5091\") " pod="openstack/nova-metadata-0" Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.167282 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bc56880-51cd-416e-a08c-1584fd7b5091-config-data\") pod \"nova-metadata-0\" (UID: \"0bc56880-51cd-416e-a08c-1584fd7b5091\") " pod="openstack/nova-metadata-0" Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.167409 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bc56880-51cd-416e-a08c-1584fd7b5091-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0bc56880-51cd-416e-a08c-1584fd7b5091\") " pod="openstack/nova-metadata-0" Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.167435 4779 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84trj\" (UniqueName: \"kubernetes.io/projected/0bc56880-51cd-416e-a08c-1584fd7b5091-kube-api-access-84trj\") pod \"nova-metadata-0\" (UID: \"0bc56880-51cd-416e-a08c-1584fd7b5091\") " pod="openstack/nova-metadata-0" Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.167476 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bc56880-51cd-416e-a08c-1584fd7b5091-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0bc56880-51cd-416e-a08c-1584fd7b5091\") " pod="openstack/nova-metadata-0" Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.243297 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-845d6d6f59-qrvwb" Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.275340 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bc56880-51cd-416e-a08c-1584fd7b5091-logs\") pod \"nova-metadata-0\" (UID: \"0bc56880-51cd-416e-a08c-1584fd7b5091\") " pod="openstack/nova-metadata-0" Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.275417 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bc56880-51cd-416e-a08c-1584fd7b5091-config-data\") pod \"nova-metadata-0\" (UID: \"0bc56880-51cd-416e-a08c-1584fd7b5091\") " pod="openstack/nova-metadata-0" Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.275631 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bc56880-51cd-416e-a08c-1584fd7b5091-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0bc56880-51cd-416e-a08c-1584fd7b5091\") " pod="openstack/nova-metadata-0" Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 
15:47:36.275670 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84trj\" (UniqueName: \"kubernetes.io/projected/0bc56880-51cd-416e-a08c-1584fd7b5091-kube-api-access-84trj\") pod \"nova-metadata-0\" (UID: \"0bc56880-51cd-416e-a08c-1584fd7b5091\") " pod="openstack/nova-metadata-0" Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.275738 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bc56880-51cd-416e-a08c-1584fd7b5091-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0bc56880-51cd-416e-a08c-1584fd7b5091\") " pod="openstack/nova-metadata-0" Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.282216 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bc56880-51cd-416e-a08c-1584fd7b5091-config-data\") pod \"nova-metadata-0\" (UID: \"0bc56880-51cd-416e-a08c-1584fd7b5091\") " pod="openstack/nova-metadata-0" Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.282541 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bc56880-51cd-416e-a08c-1584fd7b5091-logs\") pod \"nova-metadata-0\" (UID: \"0bc56880-51cd-416e-a08c-1584fd7b5091\") " pod="openstack/nova-metadata-0" Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.285050 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bc56880-51cd-416e-a08c-1584fd7b5091-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0bc56880-51cd-416e-a08c-1584fd7b5091\") " pod="openstack/nova-metadata-0" Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.308688 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0bc56880-51cd-416e-a08c-1584fd7b5091-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0bc56880-51cd-416e-a08c-1584fd7b5091\") " pod="openstack/nova-metadata-0" Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.309139 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84trj\" (UniqueName: \"kubernetes.io/projected/0bc56880-51cd-416e-a08c-1584fd7b5091-kube-api-access-84trj\") pod \"nova-metadata-0\" (UID: \"0bc56880-51cd-416e-a08c-1584fd7b5091\") " pod="openstack/nova-metadata-0" Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.323957 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-w6689"] Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.324377 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5784cf869f-w6689" podUID="07fedf37-d83d-4c66-96f7-8b9dab543e45" containerName="dnsmasq-dns" containerID="cri-o://3c989f1298a779f1a951476723cd69c30850eb7c6bf3b8650d83df1962657960" gracePeriod=10 Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.376507 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.441566 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.877216 4779 generic.go:334] "Generic (PLEG): container finished" podID="07fedf37-d83d-4c66-96f7-8b9dab543e45" containerID="3c989f1298a779f1a951476723cd69c30850eb7c6bf3b8650d83df1962657960" exitCode=0 Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.877285 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-w6689" event={"ID":"07fedf37-d83d-4c66-96f7-8b9dab543e45","Type":"ContainerDied","Data":"3c989f1298a779f1a951476723cd69c30850eb7c6bf3b8650d83df1962657960"} Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.877712 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-w6689" event={"ID":"07fedf37-d83d-4c66-96f7-8b9dab543e45","Type":"ContainerDied","Data":"9914c8bee9158fcb8bf2860b610e5c2ccb6e7c2b5d662d90d6250d92563c712f"} Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.877731 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9914c8bee9158fcb8bf2860b610e5c2ccb6e7c2b5d662d90d6250d92563c712f" Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.881237 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"54560a34-b088-437c-8fbc-66c38e38a81a","Type":"ContainerStarted","Data":"69fe2a3ef05dfe956a8973c2f4dd03ff2388db01df388ee313311a3c165625b8"} Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.881282 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"54560a34-b088-437c-8fbc-66c38e38a81a","Type":"ContainerStarted","Data":"87bcbec689661d2d88453f556407a7ee13491488f79c6238085ac3ccda5ae264"} Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.881651 4779 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.897612 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"216a780a-37b0-4a7a-ace0-4ead2afe215a","Type":"ContainerStarted","Data":"e903046415f114170edb319a1218886240eaec9154fd8ff8d1c6b56b62433b59"}
Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.915054 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.915035821 podStartE2EDuration="2.915035821s" podCreationTimestamp="2026-03-20 15:47:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:47:36.904446206 +0000 UTC m=+1473.866962006" watchObservedRunningTime="2026-03-20 15:47:36.915035821 +0000 UTC m=+1473.877551621"
Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.936908 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-w6689"
Mar 20 15:47:36 crc kubenswrapper[4779]: I0320 15:47:36.984654 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.093185 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07fedf37-d83d-4c66-96f7-8b9dab543e45-config\") pod \"07fedf37-d83d-4c66-96f7-8b9dab543e45\" (UID: \"07fedf37-d83d-4c66-96f7-8b9dab543e45\") "
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.093314 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07fedf37-d83d-4c66-96f7-8b9dab543e45-dns-svc\") pod \"07fedf37-d83d-4c66-96f7-8b9dab543e45\" (UID: \"07fedf37-d83d-4c66-96f7-8b9dab543e45\") "
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.093369 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07fedf37-d83d-4c66-96f7-8b9dab543e45-dns-swift-storage-0\") pod \"07fedf37-d83d-4c66-96f7-8b9dab543e45\" (UID: \"07fedf37-d83d-4c66-96f7-8b9dab543e45\") "
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.093477 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlx4m\" (UniqueName: \"kubernetes.io/projected/07fedf37-d83d-4c66-96f7-8b9dab543e45-kube-api-access-tlx4m\") pod \"07fedf37-d83d-4c66-96f7-8b9dab543e45\" (UID: \"07fedf37-d83d-4c66-96f7-8b9dab543e45\") "
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.093536 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07fedf37-d83d-4c66-96f7-8b9dab543e45-ovsdbserver-sb\") pod \"07fedf37-d83d-4c66-96f7-8b9dab543e45\" (UID: \"07fedf37-d83d-4c66-96f7-8b9dab543e45\") "
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.093582 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07fedf37-d83d-4c66-96f7-8b9dab543e45-ovsdbserver-nb\") pod \"07fedf37-d83d-4c66-96f7-8b9dab543e45\" (UID: \"07fedf37-d83d-4c66-96f7-8b9dab543e45\") "
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.100231 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07fedf37-d83d-4c66-96f7-8b9dab543e45-kube-api-access-tlx4m" (OuterVolumeSpecName: "kube-api-access-tlx4m") pod "07fedf37-d83d-4c66-96f7-8b9dab543e45" (UID: "07fedf37-d83d-4c66-96f7-8b9dab543e45"). InnerVolumeSpecName "kube-api-access-tlx4m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.183198 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07fedf37-d83d-4c66-96f7-8b9dab543e45-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "07fedf37-d83d-4c66-96f7-8b9dab543e45" (UID: "07fedf37-d83d-4c66-96f7-8b9dab543e45"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.184872 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07fedf37-d83d-4c66-96f7-8b9dab543e45-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "07fedf37-d83d-4c66-96f7-8b9dab543e45" (UID: "07fedf37-d83d-4c66-96f7-8b9dab543e45"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.196098 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlx4m\" (UniqueName: \"kubernetes.io/projected/07fedf37-d83d-4c66-96f7-8b9dab543e45-kube-api-access-tlx4m\") on node \"crc\" DevicePath \"\""
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.196178 4779 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07fedf37-d83d-4c66-96f7-8b9dab543e45-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.196189 4779 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07fedf37-d83d-4c66-96f7-8b9dab543e45-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.196094 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07fedf37-d83d-4c66-96f7-8b9dab543e45-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "07fedf37-d83d-4c66-96f7-8b9dab543e45" (UID: "07fedf37-d83d-4c66-96f7-8b9dab543e45"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.210138 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07fedf37-d83d-4c66-96f7-8b9dab543e45-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "07fedf37-d83d-4c66-96f7-8b9dab543e45" (UID: "07fedf37-d83d-4c66-96f7-8b9dab543e45"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.219081 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07fedf37-d83d-4c66-96f7-8b9dab543e45-config" (OuterVolumeSpecName: "config") pod "07fedf37-d83d-4c66-96f7-8b9dab543e45" (UID: "07fedf37-d83d-4c66-96f7-8b9dab543e45"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.297490 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07fedf37-d83d-4c66-96f7-8b9dab543e45-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.297521 4779 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07fedf37-d83d-4c66-96f7-8b9dab543e45-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.297531 4779 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07fedf37-d83d-4c66-96f7-8b9dab543e45-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.782651 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.822637 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eafa6891-80b8-4c90-93a5-889c82b0ca03" path="/var/lib/kubelet/pods/eafa6891-80b8-4c90-93a5-889c82b0ca03/volumes"
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.908523 4779 generic.go:334] "Generic (PLEG): container finished" podID="40030b72-3377-4060-9a6f-c25011e942ee" containerID="3ce7bd33a3c642d108c31e0f5c47a2d0d8e37b456af36a59c6189d92ac81e44e" exitCode=0
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.908570 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"40030b72-3377-4060-9a6f-c25011e942ee","Type":"ContainerDied","Data":"3ce7bd33a3c642d108c31e0f5c47a2d0d8e37b456af36a59c6189d92ac81e44e"}
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.908618 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"40030b72-3377-4060-9a6f-c25011e942ee","Type":"ContainerDied","Data":"e15b1db2e30e5791359c5e6aaf2846d0de1f6426487ea55b1e1195841acb3eb7"}
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.908624 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.908635 4779 scope.go:117] "RemoveContainer" containerID="3ce7bd33a3c642d108c31e0f5c47a2d0d8e37b456af36a59c6189d92ac81e44e"
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.909846 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bstsn\" (UniqueName: \"kubernetes.io/projected/40030b72-3377-4060-9a6f-c25011e942ee-kube-api-access-bstsn\") pod \"40030b72-3377-4060-9a6f-c25011e942ee\" (UID: \"40030b72-3377-4060-9a6f-c25011e942ee\") "
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.909942 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40030b72-3377-4060-9a6f-c25011e942ee-logs\") pod \"40030b72-3377-4060-9a6f-c25011e942ee\" (UID: \"40030b72-3377-4060-9a6f-c25011e942ee\") "
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.909974 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40030b72-3377-4060-9a6f-c25011e942ee-combined-ca-bundle\") pod \"40030b72-3377-4060-9a6f-c25011e942ee\" (UID: \"40030b72-3377-4060-9a6f-c25011e942ee\") "
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.910020 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40030b72-3377-4060-9a6f-c25011e942ee-config-data\") pod \"40030b72-3377-4060-9a6f-c25011e942ee\" (UID: \"40030b72-3377-4060-9a6f-c25011e942ee\") "
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.911163 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40030b72-3377-4060-9a6f-c25011e942ee-logs" (OuterVolumeSpecName: "logs") pod "40030b72-3377-4060-9a6f-c25011e942ee" (UID: "40030b72-3377-4060-9a6f-c25011e942ee"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.915265 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40030b72-3377-4060-9a6f-c25011e942ee-kube-api-access-bstsn" (OuterVolumeSpecName: "kube-api-access-bstsn") pod "40030b72-3377-4060-9a6f-c25011e942ee" (UID: "40030b72-3377-4060-9a6f-c25011e942ee"). InnerVolumeSpecName "kube-api-access-bstsn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.919400 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0bc56880-51cd-416e-a08c-1584fd7b5091","Type":"ContainerStarted","Data":"c74700b8bd64f8c4618fc6dd8e8c884983ee239ea75c2962ec4b8dec07a50c97"}
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.919444 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0bc56880-51cd-416e-a08c-1584fd7b5091","Type":"ContainerStarted","Data":"531d688066602d92287c84d347ac4e65aedfd23f1b0a9b6c9a2fdb395a9e1425"}
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.919453 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0bc56880-51cd-416e-a08c-1584fd7b5091","Type":"ContainerStarted","Data":"513663f8832fa736d5238ee5ec7978a330c7b7f2aafd54d9c14aa4303709543a"}
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.929850 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-w6689"
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.930888 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"216a780a-37b0-4a7a-ace0-4ead2afe215a","Type":"ContainerStarted","Data":"cf483c2c8724d2401f9f8a90a56b2596fa37062f3e7510c9004052a9aa0d3822"}
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.930922 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"216a780a-37b0-4a7a-ace0-4ead2afe215a","Type":"ContainerStarted","Data":"bc8e41440f1b60bf9bfbe857c8bef096c60c8bdce3690e2ddc2672bc591d9f4c"}
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.933326 4779 scope.go:117] "RemoveContainer" containerID="4c8dd4a2f5c38849ef9b520a8e33cb3d263b408e3ddf028f5dcbcc99f30f3d73"
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.942042 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40030b72-3377-4060-9a6f-c25011e942ee-config-data" (OuterVolumeSpecName: "config-data") pod "40030b72-3377-4060-9a6f-c25011e942ee" (UID: "40030b72-3377-4060-9a6f-c25011e942ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.942673 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.942652382 podStartE2EDuration="1.942652382s" podCreationTimestamp="2026-03-20 15:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:47:37.939372803 +0000 UTC m=+1474.901888613" watchObservedRunningTime="2026-03-20 15:47:37.942652382 +0000 UTC m=+1474.905168182"
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.942860 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40030b72-3377-4060-9a6f-c25011e942ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40030b72-3377-4060-9a6f-c25011e942ee" (UID: "40030b72-3377-4060-9a6f-c25011e942ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.966439 4779 scope.go:117] "RemoveContainer" containerID="3ce7bd33a3c642d108c31e0f5c47a2d0d8e37b456af36a59c6189d92ac81e44e"
Mar 20 15:47:37 crc kubenswrapper[4779]: E0320 15:47:37.970247 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ce7bd33a3c642d108c31e0f5c47a2d0d8e37b456af36a59c6189d92ac81e44e\": container with ID starting with 3ce7bd33a3c642d108c31e0f5c47a2d0d8e37b456af36a59c6189d92ac81e44e not found: ID does not exist" containerID="3ce7bd33a3c642d108c31e0f5c47a2d0d8e37b456af36a59c6189d92ac81e44e"
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.970284 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ce7bd33a3c642d108c31e0f5c47a2d0d8e37b456af36a59c6189d92ac81e44e"} err="failed to get container status \"3ce7bd33a3c642d108c31e0f5c47a2d0d8e37b456af36a59c6189d92ac81e44e\": rpc error: code = NotFound desc = could not find container \"3ce7bd33a3c642d108c31e0f5c47a2d0d8e37b456af36a59c6189d92ac81e44e\": container with ID starting with 3ce7bd33a3c642d108c31e0f5c47a2d0d8e37b456af36a59c6189d92ac81e44e not found: ID does not exist"
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.970305 4779 scope.go:117] "RemoveContainer" containerID="4c8dd4a2f5c38849ef9b520a8e33cb3d263b408e3ddf028f5dcbcc99f30f3d73"
Mar 20 15:47:37 crc kubenswrapper[4779]: E0320 15:47:37.971654 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c8dd4a2f5c38849ef9b520a8e33cb3d263b408e3ddf028f5dcbcc99f30f3d73\": container with ID starting with 4c8dd4a2f5c38849ef9b520a8e33cb3d263b408e3ddf028f5dcbcc99f30f3d73 not found: ID does not exist" containerID="4c8dd4a2f5c38849ef9b520a8e33cb3d263b408e3ddf028f5dcbcc99f30f3d73"
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.971684 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c8dd4a2f5c38849ef9b520a8e33cb3d263b408e3ddf028f5dcbcc99f30f3d73"} err="failed to get container status \"4c8dd4a2f5c38849ef9b520a8e33cb3d263b408e3ddf028f5dcbcc99f30f3d73\": rpc error: code = NotFound desc = could not find container \"4c8dd4a2f5c38849ef9b520a8e33cb3d263b408e3ddf028f5dcbcc99f30f3d73\": container with ID starting with 4c8dd4a2f5c38849ef9b520a8e33cb3d263b408e3ddf028f5dcbcc99f30f3d73 not found: ID does not exist"
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.972646 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-w6689"]
Mar 20 15:47:37 crc kubenswrapper[4779]: I0320 15:47:37.982087 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-w6689"]
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.012941 4779 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40030b72-3377-4060-9a6f-c25011e942ee-logs\") on node \"crc\" DevicePath \"\""
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.012977 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40030b72-3377-4060-9a6f-c25011e942ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.012992 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40030b72-3377-4060-9a6f-c25011e942ee-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.013003 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bstsn\" (UniqueName: \"kubernetes.io/projected/40030b72-3377-4060-9a6f-c25011e942ee-kube-api-access-bstsn\") on node \"crc\" DevicePath \"\""
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.264168 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.285545 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.326205 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 20 15:47:38 crc kubenswrapper[4779]: E0320 15:47:38.326658 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40030b72-3377-4060-9a6f-c25011e942ee" containerName="nova-api-log"
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.326676 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="40030b72-3377-4060-9a6f-c25011e942ee" containerName="nova-api-log"
Mar 20 15:47:38 crc kubenswrapper[4779]: E0320 15:47:38.326687 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40030b72-3377-4060-9a6f-c25011e942ee" containerName="nova-api-api"
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.326694 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="40030b72-3377-4060-9a6f-c25011e942ee" containerName="nova-api-api"
Mar 20 15:47:38 crc kubenswrapper[4779]: E0320 15:47:38.326703 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07fedf37-d83d-4c66-96f7-8b9dab543e45" containerName="init"
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.326709 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="07fedf37-d83d-4c66-96f7-8b9dab543e45" containerName="init"
Mar 20 15:47:38 crc kubenswrapper[4779]: E0320 15:47:38.326726 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07fedf37-d83d-4c66-96f7-8b9dab543e45" containerName="dnsmasq-dns"
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.326732 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="07fedf37-d83d-4c66-96f7-8b9dab543e45" containerName="dnsmasq-dns"
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.326913 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="40030b72-3377-4060-9a6f-c25011e942ee" containerName="nova-api-api"
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.326928 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="07fedf37-d83d-4c66-96f7-8b9dab543e45" containerName="dnsmasq-dns"
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.326942 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="40030b72-3377-4060-9a6f-c25011e942ee" containerName="nova-api-log"
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.328185 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.330451 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.336076 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.419297 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c20508b-1ebb-4a45-b997-6ae1ceb24012-config-data\") pod \"nova-api-0\" (UID: \"6c20508b-1ebb-4a45-b997-6ae1ceb24012\") " pod="openstack/nova-api-0"
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.419407 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhmfs\" (UniqueName: \"kubernetes.io/projected/6c20508b-1ebb-4a45-b997-6ae1ceb24012-kube-api-access-vhmfs\") pod \"nova-api-0\" (UID: \"6c20508b-1ebb-4a45-b997-6ae1ceb24012\") " pod="openstack/nova-api-0"
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.419465 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c20508b-1ebb-4a45-b997-6ae1ceb24012-logs\") pod \"nova-api-0\" (UID: \"6c20508b-1ebb-4a45-b997-6ae1ceb24012\") " pod="openstack/nova-api-0"
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.419529 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c20508b-1ebb-4a45-b997-6ae1ceb24012-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6c20508b-1ebb-4a45-b997-6ae1ceb24012\") " pod="openstack/nova-api-0"
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.524444 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c20508b-1ebb-4a45-b997-6ae1ceb24012-config-data\") pod \"nova-api-0\" (UID: \"6c20508b-1ebb-4a45-b997-6ae1ceb24012\") " pod="openstack/nova-api-0"
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.524563 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhmfs\" (UniqueName: \"kubernetes.io/projected/6c20508b-1ebb-4a45-b997-6ae1ceb24012-kube-api-access-vhmfs\") pod \"nova-api-0\" (UID: \"6c20508b-1ebb-4a45-b997-6ae1ceb24012\") " pod="openstack/nova-api-0"
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.524608 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c20508b-1ebb-4a45-b997-6ae1ceb24012-logs\") pod \"nova-api-0\" (UID: \"6c20508b-1ebb-4a45-b997-6ae1ceb24012\") " pod="openstack/nova-api-0"
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.524659 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c20508b-1ebb-4a45-b997-6ae1ceb24012-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6c20508b-1ebb-4a45-b997-6ae1ceb24012\") " pod="openstack/nova-api-0"
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.525198 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c20508b-1ebb-4a45-b997-6ae1ceb24012-logs\") pod \"nova-api-0\" (UID: \"6c20508b-1ebb-4a45-b997-6ae1ceb24012\") " pod="openstack/nova-api-0"
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.536393 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c20508b-1ebb-4a45-b997-6ae1ceb24012-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6c20508b-1ebb-4a45-b997-6ae1ceb24012\") " pod="openstack/nova-api-0"
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.542191 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c20508b-1ebb-4a45-b997-6ae1ceb24012-config-data\") pod \"nova-api-0\" (UID: \"6c20508b-1ebb-4a45-b997-6ae1ceb24012\") " pod="openstack/nova-api-0"
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.542569 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhmfs\" (UniqueName: \"kubernetes.io/projected/6c20508b-1ebb-4a45-b997-6ae1ceb24012-kube-api-access-vhmfs\") pod \"nova-api-0\" (UID: \"6c20508b-1ebb-4a45-b997-6ae1ceb24012\") " pod="openstack/nova-api-0"
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.693625 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.830812 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.936219 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2be8283d-3250-4acc-82f9-9547abbaf7eb-config-data\") pod \"2be8283d-3250-4acc-82f9-9547abbaf7eb\" (UID: \"2be8283d-3250-4acc-82f9-9547abbaf7eb\") "
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.936476 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvmvg\" (UniqueName: \"kubernetes.io/projected/2be8283d-3250-4acc-82f9-9547abbaf7eb-kube-api-access-fvmvg\") pod \"2be8283d-3250-4acc-82f9-9547abbaf7eb\" (UID: \"2be8283d-3250-4acc-82f9-9547abbaf7eb\") "
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.938984 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2be8283d-3250-4acc-82f9-9547abbaf7eb-combined-ca-bundle\") pod \"2be8283d-3250-4acc-82f9-9547abbaf7eb\" (UID: \"2be8283d-3250-4acc-82f9-9547abbaf7eb\") "
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.945285 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2be8283d-3250-4acc-82f9-9547abbaf7eb-kube-api-access-fvmvg" (OuterVolumeSpecName: "kube-api-access-fvmvg") pod "2be8283d-3250-4acc-82f9-9547abbaf7eb" (UID: "2be8283d-3250-4acc-82f9-9547abbaf7eb"). InnerVolumeSpecName "kube-api-access-fvmvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.948082 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvmvg\" (UniqueName: \"kubernetes.io/projected/2be8283d-3250-4acc-82f9-9547abbaf7eb-kube-api-access-fvmvg\") on node \"crc\" DevicePath \"\""
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.959815 4779 generic.go:334] "Generic (PLEG): container finished" podID="2be8283d-3250-4acc-82f9-9547abbaf7eb" containerID="ff8788ad8daa49fa500be9bd22ab80e288b72246ced186231e1c4724e8ed7a82" exitCode=0
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.959913 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2be8283d-3250-4acc-82f9-9547abbaf7eb","Type":"ContainerDied","Data":"ff8788ad8daa49fa500be9bd22ab80e288b72246ced186231e1c4724e8ed7a82"}
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.959959 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2be8283d-3250-4acc-82f9-9547abbaf7eb","Type":"ContainerDied","Data":"569d8ac7e1dc612943a9ffe3eb59ee3180da1eff2b93b91f2a9409c11b63a976"}
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.959979 4779 scope.go:117] "RemoveContainer" containerID="ff8788ad8daa49fa500be9bd22ab80e288b72246ced186231e1c4724e8ed7a82"
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.960022 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.978029 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2be8283d-3250-4acc-82f9-9547abbaf7eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2be8283d-3250-4acc-82f9-9547abbaf7eb" (UID: "2be8283d-3250-4acc-82f9-9547abbaf7eb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:47:38 crc kubenswrapper[4779]: I0320 15:47:38.980247 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2be8283d-3250-4acc-82f9-9547abbaf7eb-config-data" (OuterVolumeSpecName: "config-data") pod "2be8283d-3250-4acc-82f9-9547abbaf7eb" (UID: "2be8283d-3250-4acc-82f9-9547abbaf7eb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:47:39 crc kubenswrapper[4779]: I0320 15:47:39.006795 4779 scope.go:117] "RemoveContainer" containerID="ff8788ad8daa49fa500be9bd22ab80e288b72246ced186231e1c4724e8ed7a82"
Mar 20 15:47:39 crc kubenswrapper[4779]: E0320 15:47:39.007285 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff8788ad8daa49fa500be9bd22ab80e288b72246ced186231e1c4724e8ed7a82\": container with ID starting with ff8788ad8daa49fa500be9bd22ab80e288b72246ced186231e1c4724e8ed7a82 not found: ID does not exist" containerID="ff8788ad8daa49fa500be9bd22ab80e288b72246ced186231e1c4724e8ed7a82"
Mar 20 15:47:39 crc kubenswrapper[4779]: I0320 15:47:39.007323 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff8788ad8daa49fa500be9bd22ab80e288b72246ced186231e1c4724e8ed7a82"} err="failed to get container status \"ff8788ad8daa49fa500be9bd22ab80e288b72246ced186231e1c4724e8ed7a82\": rpc error: code = NotFound desc = could not find container \"ff8788ad8daa49fa500be9bd22ab80e288b72246ced186231e1c4724e8ed7a82\": container with ID starting with ff8788ad8daa49fa500be9bd22ab80e288b72246ced186231e1c4724e8ed7a82 not found: ID does not exist"
Mar 20 15:47:39 crc kubenswrapper[4779]: I0320 15:47:39.050126 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2be8283d-3250-4acc-82f9-9547abbaf7eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 15:47:39 crc kubenswrapper[4779]: I0320 15:47:39.050164 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2be8283d-3250-4acc-82f9-9547abbaf7eb-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 15:47:39 crc kubenswrapper[4779]: I0320 15:47:39.213074 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 20 15:47:39 crc kubenswrapper[4779]: W0320 15:47:39.218318 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c20508b_1ebb_4a45_b997_6ae1ceb24012.slice/crio-520c257d12a3069580d08526996a9851028d981db5107a9dd0d11b5626129c4b WatchSource:0}: Error finding container 520c257d12a3069580d08526996a9851028d981db5107a9dd0d11b5626129c4b: Status 404 returned error can't find the container with id 520c257d12a3069580d08526996a9851028d981db5107a9dd0d11b5626129c4b
Mar 20 15:47:39 crc kubenswrapper[4779]: I0320 15:47:39.316651 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 15:47:39 crc kubenswrapper[4779]: I0320 15:47:39.338340 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 15:47:39 crc kubenswrapper[4779]: I0320 15:47:39.372270 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 15:47:39 crc kubenswrapper[4779]: E0320 15:47:39.372817 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2be8283d-3250-4acc-82f9-9547abbaf7eb" containerName="nova-scheduler-scheduler"
Mar 20 15:47:39 crc kubenswrapper[4779]: I0320 15:47:39.372842 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="2be8283d-3250-4acc-82f9-9547abbaf7eb" containerName="nova-scheduler-scheduler"
Mar 20 15:47:39 crc kubenswrapper[4779]: I0320 15:47:39.373090 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="2be8283d-3250-4acc-82f9-9547abbaf7eb" containerName="nova-scheduler-scheduler"
Mar 20 15:47:39 crc kubenswrapper[4779]: I0320 15:47:39.374683 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 20 15:47:39 crc kubenswrapper[4779]: I0320 15:47:39.376836 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 20 15:47:39 crc kubenswrapper[4779]: I0320 15:47:39.396646 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 15:47:39 crc kubenswrapper[4779]: I0320 15:47:39.456762 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57169c3f-afba-425a-9c80-a7698c0151ce-config-data\") pod \"nova-scheduler-0\" (UID: \"57169c3f-afba-425a-9c80-a7698c0151ce\") " pod="openstack/nova-scheduler-0"
Mar 20 15:47:39 crc kubenswrapper[4779]: I0320 15:47:39.456843 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfndd\" (UniqueName: \"kubernetes.io/projected/57169c3f-afba-425a-9c80-a7698c0151ce-kube-api-access-lfndd\") pod \"nova-scheduler-0\" (UID: \"57169c3f-afba-425a-9c80-a7698c0151ce\") " pod="openstack/nova-scheduler-0"
Mar 20 15:47:39 crc kubenswrapper[4779]: I0320 15:47:39.456923 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57169c3f-afba-425a-9c80-a7698c0151ce-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"57169c3f-afba-425a-9c80-a7698c0151ce\") " pod="openstack/nova-scheduler-0"
Mar 20 15:47:39 crc kubenswrapper[4779]: I0320 15:47:39.559155 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57169c3f-afba-425a-9c80-a7698c0151ce-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"57169c3f-afba-425a-9c80-a7698c0151ce\") " pod="openstack/nova-scheduler-0"
Mar 20 15:47:39 crc kubenswrapper[4779]: I0320 15:47:39.559455 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57169c3f-afba-425a-9c80-a7698c0151ce-config-data\") pod \"nova-scheduler-0\" (UID: \"57169c3f-afba-425a-9c80-a7698c0151ce\") " pod="openstack/nova-scheduler-0"
Mar 20 15:47:39 crc kubenswrapper[4779]: I0320 15:47:39.559757 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfndd\" (UniqueName: \"kubernetes.io/projected/57169c3f-afba-425a-9c80-a7698c0151ce-kube-api-access-lfndd\") pod \"nova-scheduler-0\" (UID: \"57169c3f-afba-425a-9c80-a7698c0151ce\") " pod="openstack/nova-scheduler-0"
Mar 20 15:47:39 crc kubenswrapper[4779]: I0320 15:47:39.562975 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57169c3f-afba-425a-9c80-a7698c0151ce-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"57169c3f-afba-425a-9c80-a7698c0151ce\") " pod="openstack/nova-scheduler-0"
Mar 20 15:47:39 crc kubenswrapper[4779]: I0320 15:47:39.563911 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57169c3f-afba-425a-9c80-a7698c0151ce-config-data\") pod \"nova-scheduler-0\" (UID: \"57169c3f-afba-425a-9c80-a7698c0151ce\") " pod="openstack/nova-scheduler-0"
Mar 20 15:47:39 crc kubenswrapper[4779]: I0320 15:47:39.577784 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfndd\" (UniqueName: \"kubernetes.io/projected/57169c3f-afba-425a-9c80-a7698c0151ce-kube-api-access-lfndd\") pod \"nova-scheduler-0\" (UID: \"57169c3f-afba-425a-9c80-a7698c0151ce\") " pod="openstack/nova-scheduler-0"
Mar 20 15:47:39 crc kubenswrapper[4779]: I0320 15:47:39.696391 4779 util.go:30] "No sandbox for pod can be
found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 15:47:39 crc kubenswrapper[4779]: I0320 15:47:39.834294 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07fedf37-d83d-4c66-96f7-8b9dab543e45" path="/var/lib/kubelet/pods/07fedf37-d83d-4c66-96f7-8b9dab543e45/volumes" Mar 20 15:47:39 crc kubenswrapper[4779]: I0320 15:47:39.836274 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2be8283d-3250-4acc-82f9-9547abbaf7eb" path="/var/lib/kubelet/pods/2be8283d-3250-4acc-82f9-9547abbaf7eb/volumes" Mar 20 15:47:39 crc kubenswrapper[4779]: I0320 15:47:39.837449 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-f7dnw" podUID="cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978" containerName="registry-server" probeResult="failure" output=< Mar 20 15:47:39 crc kubenswrapper[4779]: timeout: failed to connect service ":50051" within 1s Mar 20 15:47:39 crc kubenswrapper[4779]: > Mar 20 15:47:39 crc kubenswrapper[4779]: I0320 15:47:39.841416 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40030b72-3377-4060-9a6f-c25011e942ee" path="/var/lib/kubelet/pods/40030b72-3377-4060-9a6f-c25011e942ee/volumes" Mar 20 15:47:40 crc kubenswrapper[4779]: I0320 15:47:40.005977 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6c20508b-1ebb-4a45-b997-6ae1ceb24012","Type":"ContainerStarted","Data":"935040be8f380b0f081a6d44c621516e036d0740c76b6e174634c1dfc1486838"} Mar 20 15:47:40 crc kubenswrapper[4779]: I0320 15:47:40.006025 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6c20508b-1ebb-4a45-b997-6ae1ceb24012","Type":"ContainerStarted","Data":"09d261edb1683cc975e685ebd15d0c5cfa60bccf770d34c0247f457c218978f6"} Mar 20 15:47:40 crc kubenswrapper[4779]: I0320 15:47:40.006035 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"6c20508b-1ebb-4a45-b997-6ae1ceb24012","Type":"ContainerStarted","Data":"520c257d12a3069580d08526996a9851028d981db5107a9dd0d11b5626129c4b"} Mar 20 15:47:40 crc kubenswrapper[4779]: I0320 15:47:40.033070 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.033048343 podStartE2EDuration="2.033048343s" podCreationTimestamp="2026-03-20 15:47:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:47:40.030595984 +0000 UTC m=+1476.993111784" watchObservedRunningTime="2026-03-20 15:47:40.033048343 +0000 UTC m=+1476.995564143" Mar 20 15:47:40 crc kubenswrapper[4779]: I0320 15:47:40.157699 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 15:47:41 crc kubenswrapper[4779]: I0320 15:47:41.043825 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"57169c3f-afba-425a-9c80-a7698c0151ce","Type":"ContainerStarted","Data":"307555d734d8c31719459a6fab20efd0f6af775f1498d19e9deb098d8b393047"} Mar 20 15:47:41 crc kubenswrapper[4779]: I0320 15:47:41.044315 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"57169c3f-afba-425a-9c80-a7698c0151ce","Type":"ContainerStarted","Data":"cca43740745b6fd0b4b203f5c1aa0d64dcd210b976fbfd52da6f5fa7ac54e9d4"} Mar 20 15:47:41 crc kubenswrapper[4779]: I0320 15:47:41.049608 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"216a780a-37b0-4a7a-ace0-4ead2afe215a","Type":"ContainerStarted","Data":"e7600344741d6dc3ed75ca90f1845a710f5572a5801ed827e52067cfcdeac8a1"} Mar 20 15:47:41 crc kubenswrapper[4779]: I0320 15:47:41.049787 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 15:47:41 crc kubenswrapper[4779]: I0320 15:47:41.071480 4779 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.071415414 podStartE2EDuration="2.071415414s" podCreationTimestamp="2026-03-20 15:47:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:47:41.065157062 +0000 UTC m=+1478.027672862" watchObservedRunningTime="2026-03-20 15:47:41.071415414 +0000 UTC m=+1478.033931214" Mar 20 15:47:41 crc kubenswrapper[4779]: I0320 15:47:41.091374 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.858331668 podStartE2EDuration="8.091352655s" podCreationTimestamp="2026-03-20 15:47:33 +0000 UTC" firstStartedPulling="2026-03-20 15:47:34.836652141 +0000 UTC m=+1471.799167941" lastFinishedPulling="2026-03-20 15:47:40.069673128 +0000 UTC m=+1477.032188928" observedRunningTime="2026-03-20 15:47:41.083800692 +0000 UTC m=+1478.046316492" watchObservedRunningTime="2026-03-20 15:47:41.091352655 +0000 UTC m=+1478.053868455" Mar 20 15:47:41 crc kubenswrapper[4779]: I0320 15:47:41.733755 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5784cf869f-w6689" podUID="07fedf37-d83d-4c66-96f7-8b9dab543e45" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.183:5353: i/o timeout" Mar 20 15:47:44 crc kubenswrapper[4779]: I0320 15:47:44.696984 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 15:47:45 crc kubenswrapper[4779]: I0320 15:47:45.287972 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 20 15:47:46 crc kubenswrapper[4779]: I0320 15:47:46.377738 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 15:47:46 crc kubenswrapper[4779]: I0320 15:47:46.378048 4779 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 15:47:47 crc kubenswrapper[4779]: I0320 15:47:47.393285 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0bc56880-51cd-416e-a08c-1584fd7b5091" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.219:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 15:47:47 crc kubenswrapper[4779]: I0320 15:47:47.393304 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0bc56880-51cd-416e-a08c-1584fd7b5091" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.219:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 15:47:48 crc kubenswrapper[4779]: I0320 15:47:48.694763 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 15:47:48 crc kubenswrapper[4779]: I0320 15:47:48.694827 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 15:47:49 crc kubenswrapper[4779]: I0320 15:47:49.697685 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 15:47:49 crc kubenswrapper[4779]: I0320 15:47:49.727457 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 15:47:49 crc kubenswrapper[4779]: I0320 15:47:49.778399 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6c20508b-1ebb-4a45-b997-6ae1ceb24012" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.220:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 15:47:49 crc kubenswrapper[4779]: I0320 15:47:49.778390 4779 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/nova-api-0" podUID="6c20508b-1ebb-4a45-b997-6ae1ceb24012" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.220:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 15:47:49 crc kubenswrapper[4779]: I0320 15:47:49.805575 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-f7dnw" podUID="cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978" containerName="registry-server" probeResult="failure" output=< Mar 20 15:47:49 crc kubenswrapper[4779]: timeout: failed to connect service ":50051" within 1s Mar 20 15:47:49 crc kubenswrapper[4779]: > Mar 20 15:47:50 crc kubenswrapper[4779]: I0320 15:47:50.148366 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 15:47:54 crc kubenswrapper[4779]: I0320 15:47:54.377143 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 15:47:54 crc kubenswrapper[4779]: I0320 15:47:54.377516 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 15:47:56 crc kubenswrapper[4779]: I0320 15:47:56.382539 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 15:47:56 crc kubenswrapper[4779]: I0320 15:47:56.383307 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 15:47:56 crc kubenswrapper[4779]: I0320 15:47:56.388162 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 15:47:56 crc kubenswrapper[4779]: I0320 15:47:56.389486 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 15:47:56 crc kubenswrapper[4779]: I0320 15:47:56.693955 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-api-0" Mar 20 15:47:56 crc kubenswrapper[4779]: I0320 15:47:56.694298 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.128364 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.200435 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.200528 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9e97b0f9-1415-425f-822b-8a3a9646a534","Type":"ContainerDied","Data":"f41876be69eae02d6113443286a2f1d0fd11f8984c31c834df15aa4040a92390"} Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.200595 4779 scope.go:117] "RemoveContainer" containerID="f41876be69eae02d6113443286a2f1d0fd11f8984c31c834df15aa4040a92390" Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.200548 4779 generic.go:334] "Generic (PLEG): container finished" podID="9e97b0f9-1415-425f-822b-8a3a9646a534" containerID="f41876be69eae02d6113443286a2f1d0fd11f8984c31c834df15aa4040a92390" exitCode=137 Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.200923 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9e97b0f9-1415-425f-822b-8a3a9646a534","Type":"ContainerDied","Data":"ac4dc967f99fcd77695349d2cd1e3290765bc35e1efb7d2708355b4d5b23f5fb"} Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.232598 4779 scope.go:117] "RemoveContainer" containerID="f41876be69eae02d6113443286a2f1d0fd11f8984c31c834df15aa4040a92390" Mar 20 15:47:58 crc kubenswrapper[4779]: E0320 15:47:58.233343 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f41876be69eae02d6113443286a2f1d0fd11f8984c31c834df15aa4040a92390\": container with ID starting with f41876be69eae02d6113443286a2f1d0fd11f8984c31c834df15aa4040a92390 not found: ID does not exist" containerID="f41876be69eae02d6113443286a2f1d0fd11f8984c31c834df15aa4040a92390" Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.233386 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f41876be69eae02d6113443286a2f1d0fd11f8984c31c834df15aa4040a92390"} err="failed to get container status \"f41876be69eae02d6113443286a2f1d0fd11f8984c31c834df15aa4040a92390\": rpc error: code = NotFound desc = could not find container \"f41876be69eae02d6113443286a2f1d0fd11f8984c31c834df15aa4040a92390\": container with ID starting with f41876be69eae02d6113443286a2f1d0fd11f8984c31c834df15aa4040a92390 not found: ID does not exist" Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.236444 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9qbd\" (UniqueName: \"kubernetes.io/projected/9e97b0f9-1415-425f-822b-8a3a9646a534-kube-api-access-f9qbd\") pod \"9e97b0f9-1415-425f-822b-8a3a9646a534\" (UID: \"9e97b0f9-1415-425f-822b-8a3a9646a534\") " Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.236538 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e97b0f9-1415-425f-822b-8a3a9646a534-config-data\") pod \"9e97b0f9-1415-425f-822b-8a3a9646a534\" (UID: \"9e97b0f9-1415-425f-822b-8a3a9646a534\") " Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.237651 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e97b0f9-1415-425f-822b-8a3a9646a534-combined-ca-bundle\") pod \"9e97b0f9-1415-425f-822b-8a3a9646a534\" (UID: \"9e97b0f9-1415-425f-822b-8a3a9646a534\") " Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 
15:47:58.242264 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e97b0f9-1415-425f-822b-8a3a9646a534-kube-api-access-f9qbd" (OuterVolumeSpecName: "kube-api-access-f9qbd") pod "9e97b0f9-1415-425f-822b-8a3a9646a534" (UID: "9e97b0f9-1415-425f-822b-8a3a9646a534"). InnerVolumeSpecName "kube-api-access-f9qbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.262858 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e97b0f9-1415-425f-822b-8a3a9646a534-config-data" (OuterVolumeSpecName: "config-data") pod "9e97b0f9-1415-425f-822b-8a3a9646a534" (UID: "9e97b0f9-1415-425f-822b-8a3a9646a534"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.267329 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e97b0f9-1415-425f-822b-8a3a9646a534-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e97b0f9-1415-425f-822b-8a3a9646a534" (UID: "9e97b0f9-1415-425f-822b-8a3a9646a534"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.340626 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9qbd\" (UniqueName: \"kubernetes.io/projected/9e97b0f9-1415-425f-822b-8a3a9646a534-kube-api-access-f9qbd\") on node \"crc\" DevicePath \"\"" Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.340677 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e97b0f9-1415-425f-822b-8a3a9646a534-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.340690 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e97b0f9-1415-425f-822b-8a3a9646a534-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.543886 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.557467 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.569809 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 15:47:58 crc kubenswrapper[4779]: E0320 15:47:58.570375 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e97b0f9-1415-425f-822b-8a3a9646a534" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.570397 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e97b0f9-1415-425f-822b-8a3a9646a534" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.570653 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e97b0f9-1415-425f-822b-8a3a9646a534" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 
15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.571514 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.573846 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.574821 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.577867 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.599881 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.697598 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.698229 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.700977 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.748481 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f67aa28-790a-4491-bac4-a64e60acf7ef-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f67aa28-790a-4491-bac4-a64e60acf7ef\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.748574 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4f67aa28-790a-4491-bac4-a64e60acf7ef-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f67aa28-790a-4491-bac4-a64e60acf7ef\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.748611 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f67aa28-790a-4491-bac4-a64e60acf7ef-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f67aa28-790a-4491-bac4-a64e60acf7ef\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.748642 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9khs\" (UniqueName: \"kubernetes.io/projected/4f67aa28-790a-4491-bac4-a64e60acf7ef-kube-api-access-j9khs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f67aa28-790a-4491-bac4-a64e60acf7ef\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.748755 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f67aa28-790a-4491-bac4-a64e60acf7ef-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f67aa28-790a-4491-bac4-a64e60acf7ef\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.850761 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f67aa28-790a-4491-bac4-a64e60acf7ef-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f67aa28-790a-4491-bac4-a64e60acf7ef\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.851157 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4f67aa28-790a-4491-bac4-a64e60acf7ef-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f67aa28-790a-4491-bac4-a64e60acf7ef\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.851298 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f67aa28-790a-4491-bac4-a64e60acf7ef-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f67aa28-790a-4491-bac4-a64e60acf7ef\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.851354 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f67aa28-790a-4491-bac4-a64e60acf7ef-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f67aa28-790a-4491-bac4-a64e60acf7ef\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.851401 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9khs\" (UniqueName: \"kubernetes.io/projected/4f67aa28-790a-4491-bac4-a64e60acf7ef-kube-api-access-j9khs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f67aa28-790a-4491-bac4-a64e60acf7ef\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.855916 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f67aa28-790a-4491-bac4-a64e60acf7ef-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f67aa28-790a-4491-bac4-a64e60acf7ef\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.856504 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f67aa28-790a-4491-bac4-a64e60acf7ef-config-data\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"4f67aa28-790a-4491-bac4-a64e60acf7ef\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.859732 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f67aa28-790a-4491-bac4-a64e60acf7ef-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f67aa28-790a-4491-bac4-a64e60acf7ef\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.861735 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f67aa28-790a-4491-bac4-a64e60acf7ef-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f67aa28-790a-4491-bac4-a64e60acf7ef\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.872952 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9khs\" (UniqueName: \"kubernetes.io/projected/4f67aa28-790a-4491-bac4-a64e60acf7ef-kube-api-access-j9khs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f67aa28-790a-4491-bac4-a64e60acf7ef\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:47:58 crc kubenswrapper[4779]: I0320 15:47:58.898096 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:47:59 crc kubenswrapper[4779]: I0320 15:47:59.220345 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 15:47:59 crc kubenswrapper[4779]: I0320 15:47:59.400429 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 15:47:59 crc kubenswrapper[4779]: I0320 15:47:59.500164 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-l62ft"] Mar 20 15:47:59 crc kubenswrapper[4779]: I0320 15:47:59.501951 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-l62ft" Mar 20 15:47:59 crc kubenswrapper[4779]: I0320 15:47:59.543239 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-l62ft"] Mar 20 15:47:59 crc kubenswrapper[4779]: I0320 15:47:59.693030 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/faefbbf5-3a2e-4938-821d-c5737c066de5-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-l62ft\" (UID: \"faefbbf5-3a2e-4938-821d-c5737c066de5\") " pod="openstack/dnsmasq-dns-59cf4bdb65-l62ft" Mar 20 15:47:59 crc kubenswrapper[4779]: I0320 15:47:59.693359 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/faefbbf5-3a2e-4938-821d-c5737c066de5-config\") pod \"dnsmasq-dns-59cf4bdb65-l62ft\" (UID: \"faefbbf5-3a2e-4938-821d-c5737c066de5\") " pod="openstack/dnsmasq-dns-59cf4bdb65-l62ft" Mar 20 15:47:59 crc kubenswrapper[4779]: I0320 15:47:59.693418 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj2tr\" (UniqueName: \"kubernetes.io/projected/faefbbf5-3a2e-4938-821d-c5737c066de5-kube-api-access-xj2tr\") pod 
\"dnsmasq-dns-59cf4bdb65-l62ft\" (UID: \"faefbbf5-3a2e-4938-821d-c5737c066de5\") " pod="openstack/dnsmasq-dns-59cf4bdb65-l62ft" Mar 20 15:47:59 crc kubenswrapper[4779]: I0320 15:47:59.693496 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/faefbbf5-3a2e-4938-821d-c5737c066de5-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-l62ft\" (UID: \"faefbbf5-3a2e-4938-821d-c5737c066de5\") " pod="openstack/dnsmasq-dns-59cf4bdb65-l62ft" Mar 20 15:47:59 crc kubenswrapper[4779]: I0320 15:47:59.693690 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/faefbbf5-3a2e-4938-821d-c5737c066de5-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-l62ft\" (UID: \"faefbbf5-3a2e-4938-821d-c5737c066de5\") " pod="openstack/dnsmasq-dns-59cf4bdb65-l62ft" Mar 20 15:47:59 crc kubenswrapper[4779]: I0320 15:47:59.693797 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/faefbbf5-3a2e-4938-821d-c5737c066de5-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-l62ft\" (UID: \"faefbbf5-3a2e-4938-821d-c5737c066de5\") " pod="openstack/dnsmasq-dns-59cf4bdb65-l62ft" Mar 20 15:47:59 crc kubenswrapper[4779]: I0320 15:47:59.795084 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/faefbbf5-3a2e-4938-821d-c5737c066de5-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-l62ft\" (UID: \"faefbbf5-3a2e-4938-821d-c5737c066de5\") " pod="openstack/dnsmasq-dns-59cf4bdb65-l62ft" Mar 20 15:47:59 crc kubenswrapper[4779]: I0320 15:47:59.795143 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/faefbbf5-3a2e-4938-821d-c5737c066de5-config\") pod 
\"dnsmasq-dns-59cf4bdb65-l62ft\" (UID: \"faefbbf5-3a2e-4938-821d-c5737c066de5\") " pod="openstack/dnsmasq-dns-59cf4bdb65-l62ft" Mar 20 15:47:59 crc kubenswrapper[4779]: I0320 15:47:59.795214 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj2tr\" (UniqueName: \"kubernetes.io/projected/faefbbf5-3a2e-4938-821d-c5737c066de5-kube-api-access-xj2tr\") pod \"dnsmasq-dns-59cf4bdb65-l62ft\" (UID: \"faefbbf5-3a2e-4938-821d-c5737c066de5\") " pod="openstack/dnsmasq-dns-59cf4bdb65-l62ft" Mar 20 15:47:59 crc kubenswrapper[4779]: I0320 15:47:59.795285 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/faefbbf5-3a2e-4938-821d-c5737c066de5-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-l62ft\" (UID: \"faefbbf5-3a2e-4938-821d-c5737c066de5\") " pod="openstack/dnsmasq-dns-59cf4bdb65-l62ft" Mar 20 15:47:59 crc kubenswrapper[4779]: I0320 15:47:59.795351 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/faefbbf5-3a2e-4938-821d-c5737c066de5-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-l62ft\" (UID: \"faefbbf5-3a2e-4938-821d-c5737c066de5\") " pod="openstack/dnsmasq-dns-59cf4bdb65-l62ft" Mar 20 15:47:59 crc kubenswrapper[4779]: I0320 15:47:59.795395 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/faefbbf5-3a2e-4938-821d-c5737c066de5-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-l62ft\" (UID: \"faefbbf5-3a2e-4938-821d-c5737c066de5\") " pod="openstack/dnsmasq-dns-59cf4bdb65-l62ft" Mar 20 15:47:59 crc kubenswrapper[4779]: I0320 15:47:59.796274 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/faefbbf5-3a2e-4938-821d-c5737c066de5-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-l62ft\" (UID: 
\"faefbbf5-3a2e-4938-821d-c5737c066de5\") " pod="openstack/dnsmasq-dns-59cf4bdb65-l62ft" Mar 20 15:47:59 crc kubenswrapper[4779]: I0320 15:47:59.796311 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/faefbbf5-3a2e-4938-821d-c5737c066de5-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-l62ft\" (UID: \"faefbbf5-3a2e-4938-821d-c5737c066de5\") " pod="openstack/dnsmasq-dns-59cf4bdb65-l62ft" Mar 20 15:47:59 crc kubenswrapper[4779]: I0320 15:47:59.796375 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/faefbbf5-3a2e-4938-821d-c5737c066de5-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-l62ft\" (UID: \"faefbbf5-3a2e-4938-821d-c5737c066de5\") " pod="openstack/dnsmasq-dns-59cf4bdb65-l62ft" Mar 20 15:47:59 crc kubenswrapper[4779]: I0320 15:47:59.796415 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/faefbbf5-3a2e-4938-821d-c5737c066de5-config\") pod \"dnsmasq-dns-59cf4bdb65-l62ft\" (UID: \"faefbbf5-3a2e-4938-821d-c5737c066de5\") " pod="openstack/dnsmasq-dns-59cf4bdb65-l62ft" Mar 20 15:47:59 crc kubenswrapper[4779]: I0320 15:47:59.796812 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/faefbbf5-3a2e-4938-821d-c5737c066de5-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-l62ft\" (UID: \"faefbbf5-3a2e-4938-821d-c5737c066de5\") " pod="openstack/dnsmasq-dns-59cf4bdb65-l62ft" Mar 20 15:47:59 crc kubenswrapper[4779]: I0320 15:47:59.811266 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-f7dnw" podUID="cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978" containerName="registry-server" probeResult="failure" output=< Mar 20 15:47:59 crc kubenswrapper[4779]: timeout: failed to connect service ":50051" within 1s Mar 20 15:47:59 crc 
kubenswrapper[4779]: > Mar 20 15:47:59 crc kubenswrapper[4779]: I0320 15:47:59.814654 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj2tr\" (UniqueName: \"kubernetes.io/projected/faefbbf5-3a2e-4938-821d-c5737c066de5-kube-api-access-xj2tr\") pod \"dnsmasq-dns-59cf4bdb65-l62ft\" (UID: \"faefbbf5-3a2e-4938-821d-c5737c066de5\") " pod="openstack/dnsmasq-dns-59cf4bdb65-l62ft" Mar 20 15:47:59 crc kubenswrapper[4779]: I0320 15:47:59.820972 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e97b0f9-1415-425f-822b-8a3a9646a534" path="/var/lib/kubelet/pods/9e97b0f9-1415-425f-822b-8a3a9646a534/volumes" Mar 20 15:47:59 crc kubenswrapper[4779]: I0320 15:47:59.866834 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-l62ft" Mar 20 15:48:00 crc kubenswrapper[4779]: I0320 15:48:00.136068 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567028-jsjd7"] Mar 20 15:48:00 crc kubenswrapper[4779]: I0320 15:48:00.137949 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567028-jsjd7" Mar 20 15:48:00 crc kubenswrapper[4779]: I0320 15:48:00.144529 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-92t9d" Mar 20 15:48:00 crc kubenswrapper[4779]: I0320 15:48:00.144717 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:48:00 crc kubenswrapper[4779]: I0320 15:48:00.145233 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:48:00 crc kubenswrapper[4779]: I0320 15:48:00.168377 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567028-jsjd7"] Mar 20 15:48:00 crc kubenswrapper[4779]: I0320 15:48:00.214288 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l55s\" (UniqueName: \"kubernetes.io/projected/6c7e4dbe-44fb-49cf-93c9-da8e6c247d01-kube-api-access-5l55s\") pod \"auto-csr-approver-29567028-jsjd7\" (UID: \"6c7e4dbe-44fb-49cf-93c9-da8e6c247d01\") " pod="openshift-infra/auto-csr-approver-29567028-jsjd7" Mar 20 15:48:00 crc kubenswrapper[4779]: I0320 15:48:00.233076 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4f67aa28-790a-4491-bac4-a64e60acf7ef","Type":"ContainerStarted","Data":"385dcc4e9766f65d427e0ca1b356b1aa23e9b651cafc0a8c44b74bbda5e41682"} Mar 20 15:48:00 crc kubenswrapper[4779]: I0320 15:48:00.233140 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4f67aa28-790a-4491-bac4-a64e60acf7ef","Type":"ContainerStarted","Data":"27727c13d69b82309e859dacab3a6c322b29cbc7f2019b4446d7297a7fb23fd3"} Mar 20 15:48:00 crc kubenswrapper[4779]: I0320 15:48:00.266691 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.266670334 podStartE2EDuration="2.266670334s" podCreationTimestamp="2026-03-20 15:47:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:48:00.254443736 +0000 UTC m=+1497.216959536" watchObservedRunningTime="2026-03-20 15:48:00.266670334 +0000 UTC m=+1497.229186154" Mar 20 15:48:00 crc kubenswrapper[4779]: I0320 15:48:00.316236 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l55s\" (UniqueName: \"kubernetes.io/projected/6c7e4dbe-44fb-49cf-93c9-da8e6c247d01-kube-api-access-5l55s\") pod \"auto-csr-approver-29567028-jsjd7\" (UID: \"6c7e4dbe-44fb-49cf-93c9-da8e6c247d01\") " pod="openshift-infra/auto-csr-approver-29567028-jsjd7" Mar 20 15:48:00 crc kubenswrapper[4779]: I0320 15:48:00.336637 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l55s\" (UniqueName: \"kubernetes.io/projected/6c7e4dbe-44fb-49cf-93c9-da8e6c247d01-kube-api-access-5l55s\") pod \"auto-csr-approver-29567028-jsjd7\" (UID: \"6c7e4dbe-44fb-49cf-93c9-da8e6c247d01\") " pod="openshift-infra/auto-csr-approver-29567028-jsjd7" Mar 20 15:48:00 crc kubenswrapper[4779]: I0320 15:48:00.400246 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-l62ft"] Mar 20 15:48:00 crc kubenswrapper[4779]: I0320 15:48:00.459696 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567028-jsjd7" Mar 20 15:48:01 crc kubenswrapper[4779]: I0320 15:48:01.000266 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567028-jsjd7"] Mar 20 15:48:01 crc kubenswrapper[4779]: W0320 15:48:01.004349 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c7e4dbe_44fb_49cf_93c9_da8e6c247d01.slice/crio-42e43bb1344fd3bd7dfc91f2b523a426e112a27ae5ac8c0ea1aa2cf4ffd7d70d WatchSource:0}: Error finding container 42e43bb1344fd3bd7dfc91f2b523a426e112a27ae5ac8c0ea1aa2cf4ffd7d70d: Status 404 returned error can't find the container with id 42e43bb1344fd3bd7dfc91f2b523a426e112a27ae5ac8c0ea1aa2cf4ffd7d70d Mar 20 15:48:01 crc kubenswrapper[4779]: I0320 15:48:01.247151 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567028-jsjd7" event={"ID":"6c7e4dbe-44fb-49cf-93c9-da8e6c247d01","Type":"ContainerStarted","Data":"42e43bb1344fd3bd7dfc91f2b523a426e112a27ae5ac8c0ea1aa2cf4ffd7d70d"} Mar 20 15:48:01 crc kubenswrapper[4779]: I0320 15:48:01.254752 4779 generic.go:334] "Generic (PLEG): container finished" podID="faefbbf5-3a2e-4938-821d-c5737c066de5" containerID="1ba3b85913f4db008c02d32f3870cd8e04888a09967b2c9cc6091e21377e1240" exitCode=0 Mar 20 15:48:01 crc kubenswrapper[4779]: I0320 15:48:01.255851 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-l62ft" event={"ID":"faefbbf5-3a2e-4938-821d-c5737c066de5","Type":"ContainerDied","Data":"1ba3b85913f4db008c02d32f3870cd8e04888a09967b2c9cc6091e21377e1240"} Mar 20 15:48:01 crc kubenswrapper[4779]: I0320 15:48:01.255875 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-l62ft" 
event={"ID":"faefbbf5-3a2e-4938-821d-c5737c066de5","Type":"ContainerStarted","Data":"9b0581edb29e1d0a13a2e2a01d5c019938a125e54ab3a0d72ecf2a226551e0df"} Mar 20 15:48:01 crc kubenswrapper[4779]: I0320 15:48:01.820506 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:48:01 crc kubenswrapper[4779]: I0320 15:48:01.821211 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="216a780a-37b0-4a7a-ace0-4ead2afe215a" containerName="ceilometer-central-agent" containerID="cri-o://e903046415f114170edb319a1218886240eaec9154fd8ff8d1c6b56b62433b59" gracePeriod=30 Mar 20 15:48:01 crc kubenswrapper[4779]: I0320 15:48:01.821270 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="216a780a-37b0-4a7a-ace0-4ead2afe215a" containerName="sg-core" containerID="cri-o://cf483c2c8724d2401f9f8a90a56b2596fa37062f3e7510c9004052a9aa0d3822" gracePeriod=30 Mar 20 15:48:01 crc kubenswrapper[4779]: I0320 15:48:01.821297 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="216a780a-37b0-4a7a-ace0-4ead2afe215a" containerName="ceilometer-notification-agent" containerID="cri-o://bc8e41440f1b60bf9bfbe857c8bef096c60c8bdce3690e2ddc2672bc591d9f4c" gracePeriod=30 Mar 20 15:48:01 crc kubenswrapper[4779]: I0320 15:48:01.821301 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="216a780a-37b0-4a7a-ace0-4ead2afe215a" containerName="proxy-httpd" containerID="cri-o://e7600344741d6dc3ed75ca90f1845a710f5572a5801ed827e52067cfcdeac8a1" gracePeriod=30 Mar 20 15:48:01 crc kubenswrapper[4779]: I0320 15:48:01.826016 4779 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="216a780a-37b0-4a7a-ace0-4ead2afe215a" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.217:3000/\": EOF" Mar 20 
15:48:02 crc kubenswrapper[4779]: I0320 15:48:02.117149 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 15:48:02 crc kubenswrapper[4779]: I0320 15:48:02.272125 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-l62ft" event={"ID":"faefbbf5-3a2e-4938-821d-c5737c066de5","Type":"ContainerStarted","Data":"37ee607ade32a3a551d4da84eb72ab44b0a30e0ebbd98cf77c258470d9127201"} Mar 20 15:48:02 crc kubenswrapper[4779]: I0320 15:48:02.274849 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59cf4bdb65-l62ft" Mar 20 15:48:02 crc kubenswrapper[4779]: I0320 15:48:02.279390 4779 generic.go:334] "Generic (PLEG): container finished" podID="216a780a-37b0-4a7a-ace0-4ead2afe215a" containerID="e7600344741d6dc3ed75ca90f1845a710f5572a5801ed827e52067cfcdeac8a1" exitCode=0 Mar 20 15:48:02 crc kubenswrapper[4779]: I0320 15:48:02.279422 4779 generic.go:334] "Generic (PLEG): container finished" podID="216a780a-37b0-4a7a-ace0-4ead2afe215a" containerID="cf483c2c8724d2401f9f8a90a56b2596fa37062f3e7510c9004052a9aa0d3822" exitCode=2 Mar 20 15:48:02 crc kubenswrapper[4779]: I0320 15:48:02.279458 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"216a780a-37b0-4a7a-ace0-4ead2afe215a","Type":"ContainerDied","Data":"e7600344741d6dc3ed75ca90f1845a710f5572a5801ed827e52067cfcdeac8a1"} Mar 20 15:48:02 crc kubenswrapper[4779]: I0320 15:48:02.279497 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"216a780a-37b0-4a7a-ace0-4ead2afe215a","Type":"ContainerDied","Data":"cf483c2c8724d2401f9f8a90a56b2596fa37062f3e7510c9004052a9aa0d3822"} Mar 20 15:48:02 crc kubenswrapper[4779]: I0320 15:48:02.279623 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6c20508b-1ebb-4a45-b997-6ae1ceb24012" containerName="nova-api-log" 
containerID="cri-o://09d261edb1683cc975e685ebd15d0c5cfa60bccf770d34c0247f457c218978f6" gracePeriod=30 Mar 20 15:48:02 crc kubenswrapper[4779]: I0320 15:48:02.279650 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6c20508b-1ebb-4a45-b997-6ae1ceb24012" containerName="nova-api-api" containerID="cri-o://935040be8f380b0f081a6d44c621516e036d0740c76b6e174634c1dfc1486838" gracePeriod=30 Mar 20 15:48:02 crc kubenswrapper[4779]: I0320 15:48:02.302179 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59cf4bdb65-l62ft" podStartSLOduration=3.3021556 podStartE2EDuration="3.3021556s" podCreationTimestamp="2026-03-20 15:47:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:48:02.289190675 +0000 UTC m=+1499.251706475" watchObservedRunningTime="2026-03-20 15:48:02.3021556 +0000 UTC m=+1499.264671490" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.155858 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.292333 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6qhj\" (UniqueName: \"kubernetes.io/projected/216a780a-37b0-4a7a-ace0-4ead2afe215a-kube-api-access-x6qhj\") pod \"216a780a-37b0-4a7a-ace0-4ead2afe215a\" (UID: \"216a780a-37b0-4a7a-ace0-4ead2afe215a\") " Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.292525 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/216a780a-37b0-4a7a-ace0-4ead2afe215a-combined-ca-bundle\") pod \"216a780a-37b0-4a7a-ace0-4ead2afe215a\" (UID: \"216a780a-37b0-4a7a-ace0-4ead2afe215a\") " Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.292596 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/216a780a-37b0-4a7a-ace0-4ead2afe215a-config-data\") pod \"216a780a-37b0-4a7a-ace0-4ead2afe215a\" (UID: \"216a780a-37b0-4a7a-ace0-4ead2afe215a\") " Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.292620 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/216a780a-37b0-4a7a-ace0-4ead2afe215a-sg-core-conf-yaml\") pod \"216a780a-37b0-4a7a-ace0-4ead2afe215a\" (UID: \"216a780a-37b0-4a7a-ace0-4ead2afe215a\") " Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.292690 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/216a780a-37b0-4a7a-ace0-4ead2afe215a-run-httpd\") pod \"216a780a-37b0-4a7a-ace0-4ead2afe215a\" (UID: \"216a780a-37b0-4a7a-ace0-4ead2afe215a\") " Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.292703 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.292765 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/216a780a-37b0-4a7a-ace0-4ead2afe215a-scripts\") pod \"216a780a-37b0-4a7a-ace0-4ead2afe215a\" (UID: \"216a780a-37b0-4a7a-ace0-4ead2afe215a\") " Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.292844 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/216a780a-37b0-4a7a-ace0-4ead2afe215a-log-httpd\") pod \"216a780a-37b0-4a7a-ace0-4ead2afe215a\" (UID: \"216a780a-37b0-4a7a-ace0-4ead2afe215a\") " Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.293194 4779 generic.go:334] "Generic (PLEG): container finished" podID="216a780a-37b0-4a7a-ace0-4ead2afe215a" containerID="bc8e41440f1b60bf9bfbe857c8bef096c60c8bdce3690e2ddc2672bc591d9f4c" exitCode=0 Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.293243 4779 generic.go:334] "Generic (PLEG): container finished" podID="216a780a-37b0-4a7a-ace0-4ead2afe215a" containerID="e903046415f114170edb319a1218886240eaec9154fd8ff8d1c6b56b62433b59" exitCode=0 Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.293363 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"216a780a-37b0-4a7a-ace0-4ead2afe215a","Type":"ContainerDied","Data":"bc8e41440f1b60bf9bfbe857c8bef096c60c8bdce3690e2ddc2672bc591d9f4c"} Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.293395 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"216a780a-37b0-4a7a-ace0-4ead2afe215a","Type":"ContainerDied","Data":"e903046415f114170edb319a1218886240eaec9154fd8ff8d1c6b56b62433b59"} Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.293409 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"216a780a-37b0-4a7a-ace0-4ead2afe215a","Type":"ContainerDied","Data":"3dc6b2ac12ca230e6910a87e6f8c84a12d062195c027f424025ed1dcaab74644"} Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.293426 4779 scope.go:117] "RemoveContainer" containerID="e7600344741d6dc3ed75ca90f1845a710f5572a5801ed827e52067cfcdeac8a1" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.293616 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/216a780a-37b0-4a7a-ace0-4ead2afe215a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "216a780a-37b0-4a7a-ace0-4ead2afe215a" (UID: "216a780a-37b0-4a7a-ace0-4ead2afe215a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.294858 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/216a780a-37b0-4a7a-ace0-4ead2afe215a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "216a780a-37b0-4a7a-ace0-4ead2afe215a" (UID: "216a780a-37b0-4a7a-ace0-4ead2afe215a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.297072 4779 generic.go:334] "Generic (PLEG): container finished" podID="6c7e4dbe-44fb-49cf-93c9-da8e6c247d01" containerID="0a7526452a975ab7b35d0acdaa562228f1c087830abbfed42ea3aad56d598b87" exitCode=0 Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.297149 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567028-jsjd7" event={"ID":"6c7e4dbe-44fb-49cf-93c9-da8e6c247d01","Type":"ContainerDied","Data":"0a7526452a975ab7b35d0acdaa562228f1c087830abbfed42ea3aad56d598b87"} Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.298199 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/216a780a-37b0-4a7a-ace0-4ead2afe215a-kube-api-access-x6qhj" (OuterVolumeSpecName: "kube-api-access-x6qhj") pod "216a780a-37b0-4a7a-ace0-4ead2afe215a" (UID: "216a780a-37b0-4a7a-ace0-4ead2afe215a"). InnerVolumeSpecName "kube-api-access-x6qhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.299183 4779 generic.go:334] "Generic (PLEG): container finished" podID="6c20508b-1ebb-4a45-b997-6ae1ceb24012" containerID="09d261edb1683cc975e685ebd15d0c5cfa60bccf770d34c0247f457c218978f6" exitCode=143 Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.299942 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6c20508b-1ebb-4a45-b997-6ae1ceb24012","Type":"ContainerDied","Data":"09d261edb1683cc975e685ebd15d0c5cfa60bccf770d34c0247f457c218978f6"} Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.308641 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/216a780a-37b0-4a7a-ace0-4ead2afe215a-scripts" (OuterVolumeSpecName: "scripts") pod "216a780a-37b0-4a7a-ace0-4ead2afe215a" (UID: "216a780a-37b0-4a7a-ace0-4ead2afe215a"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.370215 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/216a780a-37b0-4a7a-ace0-4ead2afe215a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "216a780a-37b0-4a7a-ace0-4ead2afe215a" (UID: "216a780a-37b0-4a7a-ace0-4ead2afe215a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.394834 4779 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/216a780a-37b0-4a7a-ace0-4ead2afe215a-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.394860 4779 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/216a780a-37b0-4a7a-ace0-4ead2afe215a-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.394869 4779 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/216a780a-37b0-4a7a-ace0-4ead2afe215a-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.394878 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6qhj\" (UniqueName: \"kubernetes.io/projected/216a780a-37b0-4a7a-ace0-4ead2afe215a-kube-api-access-x6qhj\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.394890 4779 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/216a780a-37b0-4a7a-ace0-4ead2afe215a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.417525 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/216a780a-37b0-4a7a-ace0-4ead2afe215a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "216a780a-37b0-4a7a-ace0-4ead2afe215a" (UID: "216a780a-37b0-4a7a-ace0-4ead2afe215a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.424662 4779 scope.go:117] "RemoveContainer" containerID="cf483c2c8724d2401f9f8a90a56b2596fa37062f3e7510c9004052a9aa0d3822" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.452318 4779 scope.go:117] "RemoveContainer" containerID="bc8e41440f1b60bf9bfbe857c8bef096c60c8bdce3690e2ddc2672bc591d9f4c" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.453505 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/216a780a-37b0-4a7a-ace0-4ead2afe215a-config-data" (OuterVolumeSpecName: "config-data") pod "216a780a-37b0-4a7a-ace0-4ead2afe215a" (UID: "216a780a-37b0-4a7a-ace0-4ead2afe215a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.471703 4779 scope.go:117] "RemoveContainer" containerID="e903046415f114170edb319a1218886240eaec9154fd8ff8d1c6b56b62433b59" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.494850 4779 scope.go:117] "RemoveContainer" containerID="e7600344741d6dc3ed75ca90f1845a710f5572a5801ed827e52067cfcdeac8a1" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.497282 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/216a780a-37b0-4a7a-ace0-4ead2afe215a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.498176 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/216a780a-37b0-4a7a-ace0-4ead2afe215a-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:03 crc kubenswrapper[4779]: E0320 15:48:03.498615 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7600344741d6dc3ed75ca90f1845a710f5572a5801ed827e52067cfcdeac8a1\": container with ID starting with e7600344741d6dc3ed75ca90f1845a710f5572a5801ed827e52067cfcdeac8a1 not found: ID does not exist" containerID="e7600344741d6dc3ed75ca90f1845a710f5572a5801ed827e52067cfcdeac8a1" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.498663 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7600344741d6dc3ed75ca90f1845a710f5572a5801ed827e52067cfcdeac8a1"} err="failed to get container status \"e7600344741d6dc3ed75ca90f1845a710f5572a5801ed827e52067cfcdeac8a1\": rpc error: code = NotFound desc = could not find container \"e7600344741d6dc3ed75ca90f1845a710f5572a5801ed827e52067cfcdeac8a1\": container with ID starting with e7600344741d6dc3ed75ca90f1845a710f5572a5801ed827e52067cfcdeac8a1 not found: ID does not 
exist" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.498691 4779 scope.go:117] "RemoveContainer" containerID="cf483c2c8724d2401f9f8a90a56b2596fa37062f3e7510c9004052a9aa0d3822" Mar 20 15:48:03 crc kubenswrapper[4779]: E0320 15:48:03.499690 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf483c2c8724d2401f9f8a90a56b2596fa37062f3e7510c9004052a9aa0d3822\": container with ID starting with cf483c2c8724d2401f9f8a90a56b2596fa37062f3e7510c9004052a9aa0d3822 not found: ID does not exist" containerID="cf483c2c8724d2401f9f8a90a56b2596fa37062f3e7510c9004052a9aa0d3822" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.499733 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf483c2c8724d2401f9f8a90a56b2596fa37062f3e7510c9004052a9aa0d3822"} err="failed to get container status \"cf483c2c8724d2401f9f8a90a56b2596fa37062f3e7510c9004052a9aa0d3822\": rpc error: code = NotFound desc = could not find container \"cf483c2c8724d2401f9f8a90a56b2596fa37062f3e7510c9004052a9aa0d3822\": container with ID starting with cf483c2c8724d2401f9f8a90a56b2596fa37062f3e7510c9004052a9aa0d3822 not found: ID does not exist" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.499761 4779 scope.go:117] "RemoveContainer" containerID="bc8e41440f1b60bf9bfbe857c8bef096c60c8bdce3690e2ddc2672bc591d9f4c" Mar 20 15:48:03 crc kubenswrapper[4779]: E0320 15:48:03.500400 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc8e41440f1b60bf9bfbe857c8bef096c60c8bdce3690e2ddc2672bc591d9f4c\": container with ID starting with bc8e41440f1b60bf9bfbe857c8bef096c60c8bdce3690e2ddc2672bc591d9f4c not found: ID does not exist" containerID="bc8e41440f1b60bf9bfbe857c8bef096c60c8bdce3690e2ddc2672bc591d9f4c" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.500445 4779 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc8e41440f1b60bf9bfbe857c8bef096c60c8bdce3690e2ddc2672bc591d9f4c"} err="failed to get container status \"bc8e41440f1b60bf9bfbe857c8bef096c60c8bdce3690e2ddc2672bc591d9f4c\": rpc error: code = NotFound desc = could not find container \"bc8e41440f1b60bf9bfbe857c8bef096c60c8bdce3690e2ddc2672bc591d9f4c\": container with ID starting with bc8e41440f1b60bf9bfbe857c8bef096c60c8bdce3690e2ddc2672bc591d9f4c not found: ID does not exist" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.500467 4779 scope.go:117] "RemoveContainer" containerID="e903046415f114170edb319a1218886240eaec9154fd8ff8d1c6b56b62433b59" Mar 20 15:48:03 crc kubenswrapper[4779]: E0320 15:48:03.502836 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e903046415f114170edb319a1218886240eaec9154fd8ff8d1c6b56b62433b59\": container with ID starting with e903046415f114170edb319a1218886240eaec9154fd8ff8d1c6b56b62433b59 not found: ID does not exist" containerID="e903046415f114170edb319a1218886240eaec9154fd8ff8d1c6b56b62433b59" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.502867 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e903046415f114170edb319a1218886240eaec9154fd8ff8d1c6b56b62433b59"} err="failed to get container status \"e903046415f114170edb319a1218886240eaec9154fd8ff8d1c6b56b62433b59\": rpc error: code = NotFound desc = could not find container \"e903046415f114170edb319a1218886240eaec9154fd8ff8d1c6b56b62433b59\": container with ID starting with e903046415f114170edb319a1218886240eaec9154fd8ff8d1c6b56b62433b59 not found: ID does not exist" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.502886 4779 scope.go:117] "RemoveContainer" containerID="e7600344741d6dc3ed75ca90f1845a710f5572a5801ed827e52067cfcdeac8a1" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.503147 4779 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7600344741d6dc3ed75ca90f1845a710f5572a5801ed827e52067cfcdeac8a1"} err="failed to get container status \"e7600344741d6dc3ed75ca90f1845a710f5572a5801ed827e52067cfcdeac8a1\": rpc error: code = NotFound desc = could not find container \"e7600344741d6dc3ed75ca90f1845a710f5572a5801ed827e52067cfcdeac8a1\": container with ID starting with e7600344741d6dc3ed75ca90f1845a710f5572a5801ed827e52067cfcdeac8a1 not found: ID does not exist" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.503166 4779 scope.go:117] "RemoveContainer" containerID="cf483c2c8724d2401f9f8a90a56b2596fa37062f3e7510c9004052a9aa0d3822" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.503421 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf483c2c8724d2401f9f8a90a56b2596fa37062f3e7510c9004052a9aa0d3822"} err="failed to get container status \"cf483c2c8724d2401f9f8a90a56b2596fa37062f3e7510c9004052a9aa0d3822\": rpc error: code = NotFound desc = could not find container \"cf483c2c8724d2401f9f8a90a56b2596fa37062f3e7510c9004052a9aa0d3822\": container with ID starting with cf483c2c8724d2401f9f8a90a56b2596fa37062f3e7510c9004052a9aa0d3822 not found: ID does not exist" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.503443 4779 scope.go:117] "RemoveContainer" containerID="bc8e41440f1b60bf9bfbe857c8bef096c60c8bdce3690e2ddc2672bc591d9f4c" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.503623 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc8e41440f1b60bf9bfbe857c8bef096c60c8bdce3690e2ddc2672bc591d9f4c"} err="failed to get container status \"bc8e41440f1b60bf9bfbe857c8bef096c60c8bdce3690e2ddc2672bc591d9f4c\": rpc error: code = NotFound desc = could not find container \"bc8e41440f1b60bf9bfbe857c8bef096c60c8bdce3690e2ddc2672bc591d9f4c\": container with ID starting with 
bc8e41440f1b60bf9bfbe857c8bef096c60c8bdce3690e2ddc2672bc591d9f4c not found: ID does not exist" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.503641 4779 scope.go:117] "RemoveContainer" containerID="e903046415f114170edb319a1218886240eaec9154fd8ff8d1c6b56b62433b59" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.503819 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e903046415f114170edb319a1218886240eaec9154fd8ff8d1c6b56b62433b59"} err="failed to get container status \"e903046415f114170edb319a1218886240eaec9154fd8ff8d1c6b56b62433b59\": rpc error: code = NotFound desc = could not find container \"e903046415f114170edb319a1218886240eaec9154fd8ff8d1c6b56b62433b59\": container with ID starting with e903046415f114170edb319a1218886240eaec9154fd8ff8d1c6b56b62433b59 not found: ID does not exist" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.629280 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.640394 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.654727 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:48:03 crc kubenswrapper[4779]: E0320 15:48:03.655294 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="216a780a-37b0-4a7a-ace0-4ead2afe215a" containerName="ceilometer-notification-agent" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.655321 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="216a780a-37b0-4a7a-ace0-4ead2afe215a" containerName="ceilometer-notification-agent" Mar 20 15:48:03 crc kubenswrapper[4779]: E0320 15:48:03.655338 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="216a780a-37b0-4a7a-ace0-4ead2afe215a" containerName="sg-core" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.655347 4779 
state_mem.go:107] "Deleted CPUSet assignment" podUID="216a780a-37b0-4a7a-ace0-4ead2afe215a" containerName="sg-core" Mar 20 15:48:03 crc kubenswrapper[4779]: E0320 15:48:03.655361 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="216a780a-37b0-4a7a-ace0-4ead2afe215a" containerName="proxy-httpd" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.655369 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="216a780a-37b0-4a7a-ace0-4ead2afe215a" containerName="proxy-httpd" Mar 20 15:48:03 crc kubenswrapper[4779]: E0320 15:48:03.655392 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="216a780a-37b0-4a7a-ace0-4ead2afe215a" containerName="ceilometer-central-agent" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.655400 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="216a780a-37b0-4a7a-ace0-4ead2afe215a" containerName="ceilometer-central-agent" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.655638 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="216a780a-37b0-4a7a-ace0-4ead2afe215a" containerName="proxy-httpd" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.655661 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="216a780a-37b0-4a7a-ace0-4ead2afe215a" containerName="ceilometer-notification-agent" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.655679 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="216a780a-37b0-4a7a-ace0-4ead2afe215a" containerName="sg-core" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.655695 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="216a780a-37b0-4a7a-ace0-4ead2afe215a" containerName="ceilometer-central-agent" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.658200 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.661507 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.661708 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.666083 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.802095 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/000eb24b-2a93-42f6-ad70-2738972d9ba0-scripts\") pod \"ceilometer-0\" (UID: \"000eb24b-2a93-42f6-ad70-2738972d9ba0\") " pod="openstack/ceilometer-0" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.802203 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swppb\" (UniqueName: \"kubernetes.io/projected/000eb24b-2a93-42f6-ad70-2738972d9ba0-kube-api-access-swppb\") pod \"ceilometer-0\" (UID: \"000eb24b-2a93-42f6-ad70-2738972d9ba0\") " pod="openstack/ceilometer-0" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.802249 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/000eb24b-2a93-42f6-ad70-2738972d9ba0-log-httpd\") pod \"ceilometer-0\" (UID: \"000eb24b-2a93-42f6-ad70-2738972d9ba0\") " pod="openstack/ceilometer-0" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.802303 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/000eb24b-2a93-42f6-ad70-2738972d9ba0-config-data\") pod \"ceilometer-0\" (UID: \"000eb24b-2a93-42f6-ad70-2738972d9ba0\") " 
pod="openstack/ceilometer-0" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.802356 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/000eb24b-2a93-42f6-ad70-2738972d9ba0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"000eb24b-2a93-42f6-ad70-2738972d9ba0\") " pod="openstack/ceilometer-0" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.802378 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/000eb24b-2a93-42f6-ad70-2738972d9ba0-run-httpd\") pod \"ceilometer-0\" (UID: \"000eb24b-2a93-42f6-ad70-2738972d9ba0\") " pod="openstack/ceilometer-0" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.802460 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/000eb24b-2a93-42f6-ad70-2738972d9ba0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"000eb24b-2a93-42f6-ad70-2738972d9ba0\") " pod="openstack/ceilometer-0" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.825556 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="216a780a-37b0-4a7a-ace0-4ead2afe215a" path="/var/lib/kubelet/pods/216a780a-37b0-4a7a-ace0-4ead2afe215a/volumes" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.899072 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.903901 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/000eb24b-2a93-42f6-ad70-2738972d9ba0-config-data\") pod \"ceilometer-0\" (UID: \"000eb24b-2a93-42f6-ad70-2738972d9ba0\") " pod="openstack/ceilometer-0" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.903981 4779 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/000eb24b-2a93-42f6-ad70-2738972d9ba0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"000eb24b-2a93-42f6-ad70-2738972d9ba0\") " pod="openstack/ceilometer-0" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.904005 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/000eb24b-2a93-42f6-ad70-2738972d9ba0-run-httpd\") pod \"ceilometer-0\" (UID: \"000eb24b-2a93-42f6-ad70-2738972d9ba0\") " pod="openstack/ceilometer-0" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.904056 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/000eb24b-2a93-42f6-ad70-2738972d9ba0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"000eb24b-2a93-42f6-ad70-2738972d9ba0\") " pod="openstack/ceilometer-0" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.904141 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/000eb24b-2a93-42f6-ad70-2738972d9ba0-scripts\") pod \"ceilometer-0\" (UID: \"000eb24b-2a93-42f6-ad70-2738972d9ba0\") " pod="openstack/ceilometer-0" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.904213 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swppb\" (UniqueName: \"kubernetes.io/projected/000eb24b-2a93-42f6-ad70-2738972d9ba0-kube-api-access-swppb\") pod \"ceilometer-0\" (UID: \"000eb24b-2a93-42f6-ad70-2738972d9ba0\") " pod="openstack/ceilometer-0" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.904257 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/000eb24b-2a93-42f6-ad70-2738972d9ba0-log-httpd\") pod \"ceilometer-0\" (UID: 
\"000eb24b-2a93-42f6-ad70-2738972d9ba0\") " pod="openstack/ceilometer-0" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.904713 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/000eb24b-2a93-42f6-ad70-2738972d9ba0-run-httpd\") pod \"ceilometer-0\" (UID: \"000eb24b-2a93-42f6-ad70-2738972d9ba0\") " pod="openstack/ceilometer-0" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.904750 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/000eb24b-2a93-42f6-ad70-2738972d9ba0-log-httpd\") pod \"ceilometer-0\" (UID: \"000eb24b-2a93-42f6-ad70-2738972d9ba0\") " pod="openstack/ceilometer-0" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.906643 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.906650 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.908624 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/000eb24b-2a93-42f6-ad70-2738972d9ba0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"000eb24b-2a93-42f6-ad70-2738972d9ba0\") " pod="openstack/ceilometer-0" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.917968 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/000eb24b-2a93-42f6-ad70-2738972d9ba0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"000eb24b-2a93-42f6-ad70-2738972d9ba0\") " pod="openstack/ceilometer-0" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.918304 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/000eb24b-2a93-42f6-ad70-2738972d9ba0-config-data\") pod \"ceilometer-0\" (UID: \"000eb24b-2a93-42f6-ad70-2738972d9ba0\") " pod="openstack/ceilometer-0" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.919475 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/000eb24b-2a93-42f6-ad70-2738972d9ba0-scripts\") pod \"ceilometer-0\" (UID: \"000eb24b-2a93-42f6-ad70-2738972d9ba0\") " pod="openstack/ceilometer-0" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.922664 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swppb\" (UniqueName: \"kubernetes.io/projected/000eb24b-2a93-42f6-ad70-2738972d9ba0-kube-api-access-swppb\") pod \"ceilometer-0\" (UID: \"000eb24b-2a93-42f6-ad70-2738972d9ba0\") " pod="openstack/ceilometer-0" Mar 20 15:48:03 crc kubenswrapper[4779]: I0320 15:48:03.973952 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:48:04 crc kubenswrapper[4779]: I0320 15:48:04.265453 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:48:04 crc kubenswrapper[4779]: I0320 15:48:04.435860 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:48:04 crc kubenswrapper[4779]: I0320 15:48:04.678810 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567028-jsjd7" Mar 20 15:48:04 crc kubenswrapper[4779]: I0320 15:48:04.738479 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5l55s\" (UniqueName: \"kubernetes.io/projected/6c7e4dbe-44fb-49cf-93c9-da8e6c247d01-kube-api-access-5l55s\") pod \"6c7e4dbe-44fb-49cf-93c9-da8e6c247d01\" (UID: \"6c7e4dbe-44fb-49cf-93c9-da8e6c247d01\") " Mar 20 15:48:04 crc kubenswrapper[4779]: I0320 15:48:04.744282 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c7e4dbe-44fb-49cf-93c9-da8e6c247d01-kube-api-access-5l55s" (OuterVolumeSpecName: "kube-api-access-5l55s") pod "6c7e4dbe-44fb-49cf-93c9-da8e6c247d01" (UID: "6c7e4dbe-44fb-49cf-93c9-da8e6c247d01"). InnerVolumeSpecName "kube-api-access-5l55s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:48:04 crc kubenswrapper[4779]: I0320 15:48:04.842050 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l55s\" (UniqueName: \"kubernetes.io/projected/6c7e4dbe-44fb-49cf-93c9-da8e6c247d01-kube-api-access-5l55s\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:05 crc kubenswrapper[4779]: I0320 15:48:05.322594 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"000eb24b-2a93-42f6-ad70-2738972d9ba0","Type":"ContainerStarted","Data":"e51bbccd6eee280bead8cd02e1082776920e4ad1d437b9184a1de68d901bd189"} Mar 20 15:48:05 crc kubenswrapper[4779]: I0320 15:48:05.323286 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"000eb24b-2a93-42f6-ad70-2738972d9ba0","Type":"ContainerStarted","Data":"fc398b60acb15a75ad203792866b9ed7f69965116246b927aab8883ba0e03eaf"} Mar 20 15:48:05 crc kubenswrapper[4779]: I0320 15:48:05.324882 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567028-jsjd7" 
event={"ID":"6c7e4dbe-44fb-49cf-93c9-da8e6c247d01","Type":"ContainerDied","Data":"42e43bb1344fd3bd7dfc91f2b523a426e112a27ae5ac8c0ea1aa2cf4ffd7d70d"} Mar 20 15:48:05 crc kubenswrapper[4779]: I0320 15:48:05.324948 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42e43bb1344fd3bd7dfc91f2b523a426e112a27ae5ac8c0ea1aa2cf4ffd7d70d" Mar 20 15:48:05 crc kubenswrapper[4779]: I0320 15:48:05.324972 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567028-jsjd7" Mar 20 15:48:05 crc kubenswrapper[4779]: I0320 15:48:05.752374 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567022-8lmhg"] Mar 20 15:48:05 crc kubenswrapper[4779]: I0320 15:48:05.763016 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567022-8lmhg"] Mar 20 15:48:05 crc kubenswrapper[4779]: I0320 15:48:05.830606 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 15:48:05 crc kubenswrapper[4779]: I0320 15:48:05.833822 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df06bf8e-6468-4c2e-bb61-2aa16f6a7caa" path="/var/lib/kubelet/pods/df06bf8e-6468-4c2e-bb61-2aa16f6a7caa/volumes" Mar 20 15:48:05 crc kubenswrapper[4779]: I0320 15:48:05.858674 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c20508b-1ebb-4a45-b997-6ae1ceb24012-combined-ca-bundle\") pod \"6c20508b-1ebb-4a45-b997-6ae1ceb24012\" (UID: \"6c20508b-1ebb-4a45-b997-6ae1ceb24012\") " Mar 20 15:48:05 crc kubenswrapper[4779]: I0320 15:48:05.858765 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhmfs\" (UniqueName: \"kubernetes.io/projected/6c20508b-1ebb-4a45-b997-6ae1ceb24012-kube-api-access-vhmfs\") pod \"6c20508b-1ebb-4a45-b997-6ae1ceb24012\" (UID: \"6c20508b-1ebb-4a45-b997-6ae1ceb24012\") " Mar 20 15:48:05 crc kubenswrapper[4779]: I0320 15:48:05.858867 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c20508b-1ebb-4a45-b997-6ae1ceb24012-logs\") pod \"6c20508b-1ebb-4a45-b997-6ae1ceb24012\" (UID: \"6c20508b-1ebb-4a45-b997-6ae1ceb24012\") " Mar 20 15:48:05 crc kubenswrapper[4779]: I0320 15:48:05.858902 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c20508b-1ebb-4a45-b997-6ae1ceb24012-config-data\") pod \"6c20508b-1ebb-4a45-b997-6ae1ceb24012\" (UID: \"6c20508b-1ebb-4a45-b997-6ae1ceb24012\") " Mar 20 15:48:05 crc kubenswrapper[4779]: I0320 15:48:05.861748 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c20508b-1ebb-4a45-b997-6ae1ceb24012-logs" (OuterVolumeSpecName: "logs") pod "6c20508b-1ebb-4a45-b997-6ae1ceb24012" (UID: 
"6c20508b-1ebb-4a45-b997-6ae1ceb24012"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:48:05 crc kubenswrapper[4779]: I0320 15:48:05.881762 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c20508b-1ebb-4a45-b997-6ae1ceb24012-kube-api-access-vhmfs" (OuterVolumeSpecName: "kube-api-access-vhmfs") pod "6c20508b-1ebb-4a45-b997-6ae1ceb24012" (UID: "6c20508b-1ebb-4a45-b997-6ae1ceb24012"). InnerVolumeSpecName "kube-api-access-vhmfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:48:05 crc kubenswrapper[4779]: I0320 15:48:05.914496 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c20508b-1ebb-4a45-b997-6ae1ceb24012-config-data" (OuterVolumeSpecName: "config-data") pod "6c20508b-1ebb-4a45-b997-6ae1ceb24012" (UID: "6c20508b-1ebb-4a45-b997-6ae1ceb24012"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:48:05 crc kubenswrapper[4779]: I0320 15:48:05.922096 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c20508b-1ebb-4a45-b997-6ae1ceb24012-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c20508b-1ebb-4a45-b997-6ae1ceb24012" (UID: "6c20508b-1ebb-4a45-b997-6ae1ceb24012"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:48:05 crc kubenswrapper[4779]: I0320 15:48:05.953562 4779 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podeafa6891-80b8-4c90-93a5-889c82b0ca03"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podeafa6891-80b8-4c90-93a5-889c82b0ca03] : Timed out while waiting for systemd to remove kubepods-besteffort-podeafa6891_80b8_4c90_93a5_889c82b0ca03.slice" Mar 20 15:48:05 crc kubenswrapper[4779]: I0320 15:48:05.962625 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c20508b-1ebb-4a45-b997-6ae1ceb24012-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:05 crc kubenswrapper[4779]: I0320 15:48:05.962656 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c20508b-1ebb-4a45-b997-6ae1ceb24012-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:05 crc kubenswrapper[4779]: I0320 15:48:05.962669 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhmfs\" (UniqueName: \"kubernetes.io/projected/6c20508b-1ebb-4a45-b997-6ae1ceb24012-kube-api-access-vhmfs\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:05 crc kubenswrapper[4779]: I0320 15:48:05.962679 4779 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c20508b-1ebb-4a45-b997-6ae1ceb24012-logs\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.338894 4779 generic.go:334] "Generic (PLEG): container finished" podID="6c20508b-1ebb-4a45-b997-6ae1ceb24012" containerID="935040be8f380b0f081a6d44c621516e036d0740c76b6e174634c1dfc1486838" exitCode=0 Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.339242 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"6c20508b-1ebb-4a45-b997-6ae1ceb24012","Type":"ContainerDied","Data":"935040be8f380b0f081a6d44c621516e036d0740c76b6e174634c1dfc1486838"} Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.339271 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6c20508b-1ebb-4a45-b997-6ae1ceb24012","Type":"ContainerDied","Data":"520c257d12a3069580d08526996a9851028d981db5107a9dd0d11b5626129c4b"} Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.339287 4779 scope.go:117] "RemoveContainer" containerID="935040be8f380b0f081a6d44c621516e036d0740c76b6e174634c1dfc1486838" Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.339392 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.348033 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"000eb24b-2a93-42f6-ad70-2738972d9ba0","Type":"ContainerStarted","Data":"c246bcde73bda6bc7b94cf1b319fb0be23c59e2d8f21af60c432e7bbd65e2d35"} Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.497954 4779 scope.go:117] "RemoveContainer" containerID="09d261edb1683cc975e685ebd15d0c5cfa60bccf770d34c0247f457c218978f6" Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.531065 4779 scope.go:117] "RemoveContainer" containerID="935040be8f380b0f081a6d44c621516e036d0740c76b6e174634c1dfc1486838" Mar 20 15:48:06 crc kubenswrapper[4779]: E0320 15:48:06.531825 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"935040be8f380b0f081a6d44c621516e036d0740c76b6e174634c1dfc1486838\": container with ID starting with 935040be8f380b0f081a6d44c621516e036d0740c76b6e174634c1dfc1486838 not found: ID does not exist" containerID="935040be8f380b0f081a6d44c621516e036d0740c76b6e174634c1dfc1486838" Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.531858 4779 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"935040be8f380b0f081a6d44c621516e036d0740c76b6e174634c1dfc1486838"} err="failed to get container status \"935040be8f380b0f081a6d44c621516e036d0740c76b6e174634c1dfc1486838\": rpc error: code = NotFound desc = could not find container \"935040be8f380b0f081a6d44c621516e036d0740c76b6e174634c1dfc1486838\": container with ID starting with 935040be8f380b0f081a6d44c621516e036d0740c76b6e174634c1dfc1486838 not found: ID does not exist" Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.531878 4779 scope.go:117] "RemoveContainer" containerID="09d261edb1683cc975e685ebd15d0c5cfa60bccf770d34c0247f457c218978f6" Mar 20 15:48:06 crc kubenswrapper[4779]: E0320 15:48:06.532253 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09d261edb1683cc975e685ebd15d0c5cfa60bccf770d34c0247f457c218978f6\": container with ID starting with 09d261edb1683cc975e685ebd15d0c5cfa60bccf770d34c0247f457c218978f6 not found: ID does not exist" containerID="09d261edb1683cc975e685ebd15d0c5cfa60bccf770d34c0247f457c218978f6" Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.532300 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09d261edb1683cc975e685ebd15d0c5cfa60bccf770d34c0247f457c218978f6"} err="failed to get container status \"09d261edb1683cc975e685ebd15d0c5cfa60bccf770d34c0247f457c218978f6\": rpc error: code = NotFound desc = could not find container \"09d261edb1683cc975e685ebd15d0c5cfa60bccf770d34c0247f457c218978f6\": container with ID starting with 09d261edb1683cc975e685ebd15d0c5cfa60bccf770d34c0247f457c218978f6 not found: ID does not exist" Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.534188 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.546979 4779 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.557732 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 15:48:06 crc kubenswrapper[4779]: E0320 15:48:06.559153 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c20508b-1ebb-4a45-b997-6ae1ceb24012" containerName="nova-api-api" Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.559173 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c20508b-1ebb-4a45-b997-6ae1ceb24012" containerName="nova-api-api" Mar 20 15:48:06 crc kubenswrapper[4779]: E0320 15:48:06.559185 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c20508b-1ebb-4a45-b997-6ae1ceb24012" containerName="nova-api-log" Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.559190 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c20508b-1ebb-4a45-b997-6ae1ceb24012" containerName="nova-api-log" Mar 20 15:48:06 crc kubenswrapper[4779]: E0320 15:48:06.559203 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c7e4dbe-44fb-49cf-93c9-da8e6c247d01" containerName="oc" Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.559209 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c7e4dbe-44fb-49cf-93c9-da8e6c247d01" containerName="oc" Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.559406 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c7e4dbe-44fb-49cf-93c9-da8e6c247d01" containerName="oc" Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.559454 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c20508b-1ebb-4a45-b997-6ae1ceb24012" containerName="nova-api-log" Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.559543 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c20508b-1ebb-4a45-b997-6ae1ceb24012" containerName="nova-api-api" Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 
15:48:06.562943 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.565971 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.566329 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.566418 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.572249 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.675406 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e66aec5-6c2e-4935-9539-e5d3fc76ef5d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0e66aec5-6c2e-4935-9539-e5d3fc76ef5d\") " pod="openstack/nova-api-0" Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.675477 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e66aec5-6c2e-4935-9539-e5d3fc76ef5d-config-data\") pod \"nova-api-0\" (UID: \"0e66aec5-6c2e-4935-9539-e5d3fc76ef5d\") " pod="openstack/nova-api-0" Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.675544 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e66aec5-6c2e-4935-9539-e5d3fc76ef5d-logs\") pod \"nova-api-0\" (UID: \"0e66aec5-6c2e-4935-9539-e5d3fc76ef5d\") " pod="openstack/nova-api-0" Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.675746 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e66aec5-6c2e-4935-9539-e5d3fc76ef5d-public-tls-certs\") pod \"nova-api-0\" (UID: \"0e66aec5-6c2e-4935-9539-e5d3fc76ef5d\") " pod="openstack/nova-api-0" Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.675787 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w85s5\" (UniqueName: \"kubernetes.io/projected/0e66aec5-6c2e-4935-9539-e5d3fc76ef5d-kube-api-access-w85s5\") pod \"nova-api-0\" (UID: \"0e66aec5-6c2e-4935-9539-e5d3fc76ef5d\") " pod="openstack/nova-api-0" Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.675878 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e66aec5-6c2e-4935-9539-e5d3fc76ef5d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0e66aec5-6c2e-4935-9539-e5d3fc76ef5d\") " pod="openstack/nova-api-0" Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.777819 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e66aec5-6c2e-4935-9539-e5d3fc76ef5d-public-tls-certs\") pod \"nova-api-0\" (UID: \"0e66aec5-6c2e-4935-9539-e5d3fc76ef5d\") " pod="openstack/nova-api-0" Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.777858 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w85s5\" (UniqueName: \"kubernetes.io/projected/0e66aec5-6c2e-4935-9539-e5d3fc76ef5d-kube-api-access-w85s5\") pod \"nova-api-0\" (UID: \"0e66aec5-6c2e-4935-9539-e5d3fc76ef5d\") " pod="openstack/nova-api-0" Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.777919 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e66aec5-6c2e-4935-9539-e5d3fc76ef5d-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"0e66aec5-6c2e-4935-9539-e5d3fc76ef5d\") " pod="openstack/nova-api-0" Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.777972 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e66aec5-6c2e-4935-9539-e5d3fc76ef5d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0e66aec5-6c2e-4935-9539-e5d3fc76ef5d\") " pod="openstack/nova-api-0" Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.777993 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e66aec5-6c2e-4935-9539-e5d3fc76ef5d-config-data\") pod \"nova-api-0\" (UID: \"0e66aec5-6c2e-4935-9539-e5d3fc76ef5d\") " pod="openstack/nova-api-0" Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.778017 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e66aec5-6c2e-4935-9539-e5d3fc76ef5d-logs\") pod \"nova-api-0\" (UID: \"0e66aec5-6c2e-4935-9539-e5d3fc76ef5d\") " pod="openstack/nova-api-0" Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.778560 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e66aec5-6c2e-4935-9539-e5d3fc76ef5d-logs\") pod \"nova-api-0\" (UID: \"0e66aec5-6c2e-4935-9539-e5d3fc76ef5d\") " pod="openstack/nova-api-0" Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.781926 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e66aec5-6c2e-4935-9539-e5d3fc76ef5d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0e66aec5-6c2e-4935-9539-e5d3fc76ef5d\") " pod="openstack/nova-api-0" Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.781958 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0e66aec5-6c2e-4935-9539-e5d3fc76ef5d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0e66aec5-6c2e-4935-9539-e5d3fc76ef5d\") " pod="openstack/nova-api-0" Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.782376 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e66aec5-6c2e-4935-9539-e5d3fc76ef5d-public-tls-certs\") pod \"nova-api-0\" (UID: \"0e66aec5-6c2e-4935-9539-e5d3fc76ef5d\") " pod="openstack/nova-api-0" Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.783525 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e66aec5-6c2e-4935-9539-e5d3fc76ef5d-config-data\") pod \"nova-api-0\" (UID: \"0e66aec5-6c2e-4935-9539-e5d3fc76ef5d\") " pod="openstack/nova-api-0" Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.795485 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w85s5\" (UniqueName: \"kubernetes.io/projected/0e66aec5-6c2e-4935-9539-e5d3fc76ef5d-kube-api-access-w85s5\") pod \"nova-api-0\" (UID: \"0e66aec5-6c2e-4935-9539-e5d3fc76ef5d\") " pod="openstack/nova-api-0" Mar 20 15:48:06 crc kubenswrapper[4779]: I0320 15:48:06.918375 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 15:48:07 crc kubenswrapper[4779]: I0320 15:48:07.362367 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 15:48:07 crc kubenswrapper[4779]: I0320 15:48:07.366785 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"000eb24b-2a93-42f6-ad70-2738972d9ba0","Type":"ContainerStarted","Data":"1ed86b45db2eded41c86ea3cb4d8f272d6503ab41144758d5b71c74c224c87a2"} Mar 20 15:48:07 crc kubenswrapper[4779]: I0320 15:48:07.821039 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c20508b-1ebb-4a45-b997-6ae1ceb24012" path="/var/lib/kubelet/pods/6c20508b-1ebb-4a45-b997-6ae1ceb24012/volumes" Mar 20 15:48:08 crc kubenswrapper[4779]: I0320 15:48:08.387087 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e66aec5-6c2e-4935-9539-e5d3fc76ef5d","Type":"ContainerStarted","Data":"6e6c26763d47871d7b4f28811a56296ab11704236e40769d374bb5ad230ac1f8"} Mar 20 15:48:08 crc kubenswrapper[4779]: I0320 15:48:08.387432 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e66aec5-6c2e-4935-9539-e5d3fc76ef5d","Type":"ContainerStarted","Data":"3f587f087d6bbfd4b6f4170d8a3bda4f3b119b82832552fd642e56f4974ae8ca"} Mar 20 15:48:08 crc kubenswrapper[4779]: I0320 15:48:08.387447 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e66aec5-6c2e-4935-9539-e5d3fc76ef5d","Type":"ContainerStarted","Data":"18ca909c169f613bbf4e314ef70fe6c332e69754a40739e5914a0d11a9d11c03"} Mar 20 15:48:08 crc kubenswrapper[4779]: I0320 15:48:08.416730 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.416712596 podStartE2EDuration="2.416712596s" podCreationTimestamp="2026-03-20 15:48:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:48:08.413955409 +0000 UTC m=+1505.376471209" watchObservedRunningTime="2026-03-20 15:48:08.416712596 +0000 UTC m=+1505.379228396" Mar 20 15:48:08 crc kubenswrapper[4779]: I0320 15:48:08.807090 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-f7dnw" Mar 20 15:48:08 crc kubenswrapper[4779]: I0320 15:48:08.875010 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-f7dnw" Mar 20 15:48:08 crc kubenswrapper[4779]: I0320 15:48:08.899079 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:48:08 crc kubenswrapper[4779]: I0320 15:48:08.920392 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:48:09 crc kubenswrapper[4779]: I0320 15:48:09.047548 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f7dnw"] Mar 20 15:48:09 crc kubenswrapper[4779]: I0320 15:48:09.397827 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"000eb24b-2a93-42f6-ad70-2738972d9ba0","Type":"ContainerStarted","Data":"399f8fbe146f7599eebc475e689a3d7334e158a68c40033587172e65be1e37fa"} Mar 20 15:48:09 crc kubenswrapper[4779]: I0320 15:48:09.398526 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="000eb24b-2a93-42f6-ad70-2738972d9ba0" containerName="proxy-httpd" containerID="cri-o://399f8fbe146f7599eebc475e689a3d7334e158a68c40033587172e65be1e37fa" gracePeriod=30 Mar 20 15:48:09 crc kubenswrapper[4779]: I0320 15:48:09.398522 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="000eb24b-2a93-42f6-ad70-2738972d9ba0" containerName="sg-core" 
containerID="cri-o://1ed86b45db2eded41c86ea3cb4d8f272d6503ab41144758d5b71c74c224c87a2" gracePeriod=30 Mar 20 15:48:09 crc kubenswrapper[4779]: I0320 15:48:09.398528 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="000eb24b-2a93-42f6-ad70-2738972d9ba0" containerName="ceilometer-notification-agent" containerID="cri-o://c246bcde73bda6bc7b94cf1b319fb0be23c59e2d8f21af60c432e7bbd65e2d35" gracePeriod=30 Mar 20 15:48:09 crc kubenswrapper[4779]: I0320 15:48:09.398691 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="000eb24b-2a93-42f6-ad70-2738972d9ba0" containerName="ceilometer-central-agent" containerID="cri-o://e51bbccd6eee280bead8cd02e1082776920e4ad1d437b9184a1de68d901bd189" gracePeriod=30 Mar 20 15:48:09 crc kubenswrapper[4779]: I0320 15:48:09.424986 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.597374506 podStartE2EDuration="6.424966598s" podCreationTimestamp="2026-03-20 15:48:03 +0000 UTC" firstStartedPulling="2026-03-20 15:48:04.466409559 +0000 UTC m=+1501.428925359" lastFinishedPulling="2026-03-20 15:48:08.294001651 +0000 UTC m=+1505.256517451" observedRunningTime="2026-03-20 15:48:09.418361117 +0000 UTC m=+1506.380876917" watchObservedRunningTime="2026-03-20 15:48:09.424966598 +0000 UTC m=+1506.387482398" Mar 20 15:48:09 crc kubenswrapper[4779]: I0320 15:48:09.436959 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:48:09 crc kubenswrapper[4779]: I0320 15:48:09.648753 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-vjfs8"] Mar 20 15:48:09 crc kubenswrapper[4779]: I0320 15:48:09.650059 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vjfs8" Mar 20 15:48:09 crc kubenswrapper[4779]: I0320 15:48:09.653121 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 20 15:48:09 crc kubenswrapper[4779]: I0320 15:48:09.653782 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 20 15:48:09 crc kubenswrapper[4779]: I0320 15:48:09.664508 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-vjfs8"] Mar 20 15:48:09 crc kubenswrapper[4779]: I0320 15:48:09.842842 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78be39d2-e84b-42b6-a4d4-7abaeaaf1a57-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vjfs8\" (UID: \"78be39d2-e84b-42b6-a4d4-7abaeaaf1a57\") " pod="openstack/nova-cell1-cell-mapping-vjfs8" Mar 20 15:48:09 crc kubenswrapper[4779]: I0320 15:48:09.842963 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78be39d2-e84b-42b6-a4d4-7abaeaaf1a57-scripts\") pod \"nova-cell1-cell-mapping-vjfs8\" (UID: \"78be39d2-e84b-42b6-a4d4-7abaeaaf1a57\") " pod="openstack/nova-cell1-cell-mapping-vjfs8" Mar 20 15:48:09 crc kubenswrapper[4779]: I0320 15:48:09.843032 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78be39d2-e84b-42b6-a4d4-7abaeaaf1a57-config-data\") pod \"nova-cell1-cell-mapping-vjfs8\" (UID: \"78be39d2-e84b-42b6-a4d4-7abaeaaf1a57\") " pod="openstack/nova-cell1-cell-mapping-vjfs8" Mar 20 15:48:09 crc kubenswrapper[4779]: I0320 15:48:09.843068 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnw6l\" (UniqueName: 
\"kubernetes.io/projected/78be39d2-e84b-42b6-a4d4-7abaeaaf1a57-kube-api-access-rnw6l\") pod \"nova-cell1-cell-mapping-vjfs8\" (UID: \"78be39d2-e84b-42b6-a4d4-7abaeaaf1a57\") " pod="openstack/nova-cell1-cell-mapping-vjfs8" Mar 20 15:48:09 crc kubenswrapper[4779]: I0320 15:48:09.868208 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59cf4bdb65-l62ft" Mar 20 15:48:09 crc kubenswrapper[4779]: I0320 15:48:09.932983 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-qrvwb"] Mar 20 15:48:09 crc kubenswrapper[4779]: I0320 15:48:09.933238 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-845d6d6f59-qrvwb" podUID="52109ecb-141a-4d8d-95db-5fbc7275f7f1" containerName="dnsmasq-dns" containerID="cri-o://891d90ac45141853aababc917623362409aa5c4133c041f2fde985e6cb7f332d" gracePeriod=10 Mar 20 15:48:09 crc kubenswrapper[4779]: I0320 15:48:09.945262 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78be39d2-e84b-42b6-a4d4-7abaeaaf1a57-scripts\") pod \"nova-cell1-cell-mapping-vjfs8\" (UID: \"78be39d2-e84b-42b6-a4d4-7abaeaaf1a57\") " pod="openstack/nova-cell1-cell-mapping-vjfs8" Mar 20 15:48:09 crc kubenswrapper[4779]: I0320 15:48:09.945358 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78be39d2-e84b-42b6-a4d4-7abaeaaf1a57-config-data\") pod \"nova-cell1-cell-mapping-vjfs8\" (UID: \"78be39d2-e84b-42b6-a4d4-7abaeaaf1a57\") " pod="openstack/nova-cell1-cell-mapping-vjfs8" Mar 20 15:48:09 crc kubenswrapper[4779]: I0320 15:48:09.945402 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnw6l\" (UniqueName: \"kubernetes.io/projected/78be39d2-e84b-42b6-a4d4-7abaeaaf1a57-kube-api-access-rnw6l\") pod \"nova-cell1-cell-mapping-vjfs8\" (UID: 
\"78be39d2-e84b-42b6-a4d4-7abaeaaf1a57\") " pod="openstack/nova-cell1-cell-mapping-vjfs8" Mar 20 15:48:09 crc kubenswrapper[4779]: I0320 15:48:09.945484 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78be39d2-e84b-42b6-a4d4-7abaeaaf1a57-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vjfs8\" (UID: \"78be39d2-e84b-42b6-a4d4-7abaeaaf1a57\") " pod="openstack/nova-cell1-cell-mapping-vjfs8" Mar 20 15:48:09 crc kubenswrapper[4779]: I0320 15:48:09.955987 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78be39d2-e84b-42b6-a4d4-7abaeaaf1a57-config-data\") pod \"nova-cell1-cell-mapping-vjfs8\" (UID: \"78be39d2-e84b-42b6-a4d4-7abaeaaf1a57\") " pod="openstack/nova-cell1-cell-mapping-vjfs8" Mar 20 15:48:09 crc kubenswrapper[4779]: I0320 15:48:09.956171 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78be39d2-e84b-42b6-a4d4-7abaeaaf1a57-scripts\") pod \"nova-cell1-cell-mapping-vjfs8\" (UID: \"78be39d2-e84b-42b6-a4d4-7abaeaaf1a57\") " pod="openstack/nova-cell1-cell-mapping-vjfs8" Mar 20 15:48:09 crc kubenswrapper[4779]: I0320 15:48:09.961409 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78be39d2-e84b-42b6-a4d4-7abaeaaf1a57-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vjfs8\" (UID: \"78be39d2-e84b-42b6-a4d4-7abaeaaf1a57\") " pod="openstack/nova-cell1-cell-mapping-vjfs8" Mar 20 15:48:09 crc kubenswrapper[4779]: I0320 15:48:09.965400 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnw6l\" (UniqueName: \"kubernetes.io/projected/78be39d2-e84b-42b6-a4d4-7abaeaaf1a57-kube-api-access-rnw6l\") pod \"nova-cell1-cell-mapping-vjfs8\" (UID: \"78be39d2-e84b-42b6-a4d4-7abaeaaf1a57\") " 
pod="openstack/nova-cell1-cell-mapping-vjfs8" Mar 20 15:48:10 crc kubenswrapper[4779]: I0320 15:48:10.274639 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vjfs8" Mar 20 15:48:10 crc kubenswrapper[4779]: I0320 15:48:10.424161 4779 generic.go:334] "Generic (PLEG): container finished" podID="000eb24b-2a93-42f6-ad70-2738972d9ba0" containerID="399f8fbe146f7599eebc475e689a3d7334e158a68c40033587172e65be1e37fa" exitCode=0 Mar 20 15:48:10 crc kubenswrapper[4779]: I0320 15:48:10.424645 4779 generic.go:334] "Generic (PLEG): container finished" podID="000eb24b-2a93-42f6-ad70-2738972d9ba0" containerID="1ed86b45db2eded41c86ea3cb4d8f272d6503ab41144758d5b71c74c224c87a2" exitCode=2 Mar 20 15:48:10 crc kubenswrapper[4779]: I0320 15:48:10.424653 4779 generic.go:334] "Generic (PLEG): container finished" podID="000eb24b-2a93-42f6-ad70-2738972d9ba0" containerID="c246bcde73bda6bc7b94cf1b319fb0be23c59e2d8f21af60c432e7bbd65e2d35" exitCode=0 Mar 20 15:48:10 crc kubenswrapper[4779]: I0320 15:48:10.424701 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"000eb24b-2a93-42f6-ad70-2738972d9ba0","Type":"ContainerDied","Data":"399f8fbe146f7599eebc475e689a3d7334e158a68c40033587172e65be1e37fa"} Mar 20 15:48:10 crc kubenswrapper[4779]: I0320 15:48:10.424727 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"000eb24b-2a93-42f6-ad70-2738972d9ba0","Type":"ContainerDied","Data":"1ed86b45db2eded41c86ea3cb4d8f272d6503ab41144758d5b71c74c224c87a2"} Mar 20 15:48:10 crc kubenswrapper[4779]: I0320 15:48:10.424736 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"000eb24b-2a93-42f6-ad70-2738972d9ba0","Type":"ContainerDied","Data":"c246bcde73bda6bc7b94cf1b319fb0be23c59e2d8f21af60c432e7bbd65e2d35"} Mar 20 15:48:10 crc kubenswrapper[4779]: I0320 15:48:10.436581 4779 generic.go:334] "Generic (PLEG): container 
finished" podID="52109ecb-141a-4d8d-95db-5fbc7275f7f1" containerID="891d90ac45141853aababc917623362409aa5c4133c041f2fde985e6cb7f332d" exitCode=0 Mar 20 15:48:10 crc kubenswrapper[4779]: I0320 15:48:10.436637 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-qrvwb" event={"ID":"52109ecb-141a-4d8d-95db-5fbc7275f7f1","Type":"ContainerDied","Data":"891d90ac45141853aababc917623362409aa5c4133c041f2fde985e6cb7f332d"} Mar 20 15:48:10 crc kubenswrapper[4779]: I0320 15:48:10.437168 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-f7dnw" podUID="cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978" containerName="registry-server" containerID="cri-o://e337b82b0019cb1fd626794a9ce15b5be48dd875a9ce4b59a715e2a8d5791922" gracePeriod=2 Mar 20 15:48:10 crc kubenswrapper[4779]: I0320 15:48:10.491956 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-qrvwb" Mar 20 15:48:10 crc kubenswrapper[4779]: I0320 15:48:10.660444 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/52109ecb-141a-4d8d-95db-5fbc7275f7f1-dns-swift-storage-0\") pod \"52109ecb-141a-4d8d-95db-5fbc7275f7f1\" (UID: \"52109ecb-141a-4d8d-95db-5fbc7275f7f1\") " Mar 20 15:48:10 crc kubenswrapper[4779]: I0320 15:48:10.660560 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/52109ecb-141a-4d8d-95db-5fbc7275f7f1-ovsdbserver-sb\") pod \"52109ecb-141a-4d8d-95db-5fbc7275f7f1\" (UID: \"52109ecb-141a-4d8d-95db-5fbc7275f7f1\") " Mar 20 15:48:10 crc kubenswrapper[4779]: I0320 15:48:10.660614 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/52109ecb-141a-4d8d-95db-5fbc7275f7f1-dns-svc\") pod 
\"52109ecb-141a-4d8d-95db-5fbc7275f7f1\" (UID: \"52109ecb-141a-4d8d-95db-5fbc7275f7f1\") " Mar 20 15:48:10 crc kubenswrapper[4779]: I0320 15:48:10.660665 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjbb8\" (UniqueName: \"kubernetes.io/projected/52109ecb-141a-4d8d-95db-5fbc7275f7f1-kube-api-access-rjbb8\") pod \"52109ecb-141a-4d8d-95db-5fbc7275f7f1\" (UID: \"52109ecb-141a-4d8d-95db-5fbc7275f7f1\") " Mar 20 15:48:10 crc kubenswrapper[4779]: I0320 15:48:10.660705 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52109ecb-141a-4d8d-95db-5fbc7275f7f1-config\") pod \"52109ecb-141a-4d8d-95db-5fbc7275f7f1\" (UID: \"52109ecb-141a-4d8d-95db-5fbc7275f7f1\") " Mar 20 15:48:10 crc kubenswrapper[4779]: I0320 15:48:10.660878 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/52109ecb-141a-4d8d-95db-5fbc7275f7f1-ovsdbserver-nb\") pod \"52109ecb-141a-4d8d-95db-5fbc7275f7f1\" (UID: \"52109ecb-141a-4d8d-95db-5fbc7275f7f1\") " Mar 20 15:48:10 crc kubenswrapper[4779]: I0320 15:48:10.674333 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52109ecb-141a-4d8d-95db-5fbc7275f7f1-kube-api-access-rjbb8" (OuterVolumeSpecName: "kube-api-access-rjbb8") pod "52109ecb-141a-4d8d-95db-5fbc7275f7f1" (UID: "52109ecb-141a-4d8d-95db-5fbc7275f7f1"). InnerVolumeSpecName "kube-api-access-rjbb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:48:10 crc kubenswrapper[4779]: I0320 15:48:10.739489 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52109ecb-141a-4d8d-95db-5fbc7275f7f1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "52109ecb-141a-4d8d-95db-5fbc7275f7f1" (UID: "52109ecb-141a-4d8d-95db-5fbc7275f7f1"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:48:10 crc kubenswrapper[4779]: I0320 15:48:10.749063 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52109ecb-141a-4d8d-95db-5fbc7275f7f1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "52109ecb-141a-4d8d-95db-5fbc7275f7f1" (UID: "52109ecb-141a-4d8d-95db-5fbc7275f7f1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:48:10 crc kubenswrapper[4779]: I0320 15:48:10.758953 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52109ecb-141a-4d8d-95db-5fbc7275f7f1-config" (OuterVolumeSpecName: "config") pod "52109ecb-141a-4d8d-95db-5fbc7275f7f1" (UID: "52109ecb-141a-4d8d-95db-5fbc7275f7f1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:48:10 crc kubenswrapper[4779]: I0320 15:48:10.763010 4779 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/52109ecb-141a-4d8d-95db-5fbc7275f7f1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:10 crc kubenswrapper[4779]: I0320 15:48:10.763052 4779 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/52109ecb-141a-4d8d-95db-5fbc7275f7f1-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:10 crc kubenswrapper[4779]: I0320 15:48:10.763061 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjbb8\" (UniqueName: \"kubernetes.io/projected/52109ecb-141a-4d8d-95db-5fbc7275f7f1-kube-api-access-rjbb8\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:10 crc kubenswrapper[4779]: I0320 15:48:10.763071 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52109ecb-141a-4d8d-95db-5fbc7275f7f1-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:10 crc kubenswrapper[4779]: I0320 
15:48:10.763799 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52109ecb-141a-4d8d-95db-5fbc7275f7f1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "52109ecb-141a-4d8d-95db-5fbc7275f7f1" (UID: "52109ecb-141a-4d8d-95db-5fbc7275f7f1"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:48:10 crc kubenswrapper[4779]: I0320 15:48:10.774579 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52109ecb-141a-4d8d-95db-5fbc7275f7f1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "52109ecb-141a-4d8d-95db-5fbc7275f7f1" (UID: "52109ecb-141a-4d8d-95db-5fbc7275f7f1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:48:10 crc kubenswrapper[4779]: I0320 15:48:10.793209 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-vjfs8"] Mar 20 15:48:10 crc kubenswrapper[4779]: I0320 15:48:10.868627 4779 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/52109ecb-141a-4d8d-95db-5fbc7275f7f1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:10 crc kubenswrapper[4779]: I0320 15:48:10.868668 4779 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/52109ecb-141a-4d8d-95db-5fbc7275f7f1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:10 crc kubenswrapper[4779]: I0320 15:48:10.975429 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f7dnw" Mar 20 15:48:11 crc kubenswrapper[4779]: I0320 15:48:11.071261 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978-catalog-content\") pod \"cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978\" (UID: \"cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978\") " Mar 20 15:48:11 crc kubenswrapper[4779]: I0320 15:48:11.071356 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978-utilities\") pod \"cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978\" (UID: \"cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978\") " Mar 20 15:48:11 crc kubenswrapper[4779]: I0320 15:48:11.071385 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7zjl\" (UniqueName: \"kubernetes.io/projected/cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978-kube-api-access-w7zjl\") pod \"cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978\" (UID: \"cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978\") " Mar 20 15:48:11 crc kubenswrapper[4779]: I0320 15:48:11.072161 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978-utilities" (OuterVolumeSpecName: "utilities") pod "cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978" (UID: "cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:48:11 crc kubenswrapper[4779]: I0320 15:48:11.075340 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978-kube-api-access-w7zjl" (OuterVolumeSpecName: "kube-api-access-w7zjl") pod "cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978" (UID: "cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978"). InnerVolumeSpecName "kube-api-access-w7zjl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:48:11 crc kubenswrapper[4779]: I0320 15:48:11.174237 4779 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:11 crc kubenswrapper[4779]: I0320 15:48:11.174288 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7zjl\" (UniqueName: \"kubernetes.io/projected/cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978-kube-api-access-w7zjl\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:11 crc kubenswrapper[4779]: I0320 15:48:11.198514 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978" (UID: "cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:48:11 crc kubenswrapper[4779]: I0320 15:48:11.276344 4779 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:11 crc kubenswrapper[4779]: I0320 15:48:11.448072 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-qrvwb" event={"ID":"52109ecb-141a-4d8d-95db-5fbc7275f7f1","Type":"ContainerDied","Data":"0986f34f67e867ac6efd7ca3c13f728493d6e31c5fe2e4e62568e324b3ad4671"} Mar 20 15:48:11 crc kubenswrapper[4779]: I0320 15:48:11.448163 4779 scope.go:117] "RemoveContainer" containerID="891d90ac45141853aababc917623362409aa5c4133c041f2fde985e6cb7f332d" Mar 20 15:48:11 crc kubenswrapper[4779]: I0320 15:48:11.448096 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-qrvwb" Mar 20 15:48:11 crc kubenswrapper[4779]: I0320 15:48:11.451870 4779 generic.go:334] "Generic (PLEG): container finished" podID="cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978" containerID="e337b82b0019cb1fd626794a9ce15b5be48dd875a9ce4b59a715e2a8d5791922" exitCode=0 Mar 20 15:48:11 crc kubenswrapper[4779]: I0320 15:48:11.451954 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f7dnw" Mar 20 15:48:11 crc kubenswrapper[4779]: I0320 15:48:11.451943 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7dnw" event={"ID":"cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978","Type":"ContainerDied","Data":"e337b82b0019cb1fd626794a9ce15b5be48dd875a9ce4b59a715e2a8d5791922"} Mar 20 15:48:11 crc kubenswrapper[4779]: I0320 15:48:11.452083 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7dnw" event={"ID":"cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978","Type":"ContainerDied","Data":"bec3f6dd789000765b1f25263b38cc149a0f42564d4d465cec276afced1b83e2"} Mar 20 15:48:11 crc kubenswrapper[4779]: I0320 15:48:11.453447 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vjfs8" event={"ID":"78be39d2-e84b-42b6-a4d4-7abaeaaf1a57","Type":"ContainerStarted","Data":"78b4899d7c894c3bee42d4192002ebcb53d21bb0825a660599d7ffafc76aa535"} Mar 20 15:48:11 crc kubenswrapper[4779]: I0320 15:48:11.453488 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vjfs8" event={"ID":"78be39d2-e84b-42b6-a4d4-7abaeaaf1a57","Type":"ContainerStarted","Data":"96e579de45c391415d51cc29a4feed0f18a5132155d6f6b3857d4703b7d99236"} Mar 20 15:48:11 crc kubenswrapper[4779]: I0320 15:48:11.467346 4779 scope.go:117] "RemoveContainer" containerID="f72088e62a7d2351bbdc2964217e31962afef6df9ccee18b76450df24168acca" Mar 20 15:48:11 crc 
kubenswrapper[4779]: I0320 15:48:11.483317 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-vjfs8" podStartSLOduration=2.48329756 podStartE2EDuration="2.48329756s" podCreationTimestamp="2026-03-20 15:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:48:11.477038338 +0000 UTC m=+1508.439554138" watchObservedRunningTime="2026-03-20 15:48:11.48329756 +0000 UTC m=+1508.445813360" Mar 20 15:48:11 crc kubenswrapper[4779]: I0320 15:48:11.516183 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-qrvwb"] Mar 20 15:48:11 crc kubenswrapper[4779]: I0320 15:48:11.517404 4779 scope.go:117] "RemoveContainer" containerID="e337b82b0019cb1fd626794a9ce15b5be48dd875a9ce4b59a715e2a8d5791922" Mar 20 15:48:11 crc kubenswrapper[4779]: I0320 15:48:11.525224 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-qrvwb"] Mar 20 15:48:11 crc kubenswrapper[4779]: I0320 15:48:11.535177 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f7dnw"] Mar 20 15:48:11 crc kubenswrapper[4779]: I0320 15:48:11.542130 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-f7dnw"] Mar 20 15:48:11 crc kubenswrapper[4779]: I0320 15:48:11.552408 4779 scope.go:117] "RemoveContainer" containerID="6f7654598feaad1eaeaf6380851705e15d6bc41c739dc7208137fa4aaeb6edcd" Mar 20 15:48:11 crc kubenswrapper[4779]: I0320 15:48:11.645019 4779 scope.go:117] "RemoveContainer" containerID="7afdc016e90d8740063a3aa8bc504ec8fd43440f4a2421f6b976e55308d51732" Mar 20 15:48:11 crc kubenswrapper[4779]: I0320 15:48:11.786151 4779 scope.go:117] "RemoveContainer" containerID="e337b82b0019cb1fd626794a9ce15b5be48dd875a9ce4b59a715e2a8d5791922" Mar 20 15:48:11 crc kubenswrapper[4779]: E0320 15:48:11.786572 4779 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e337b82b0019cb1fd626794a9ce15b5be48dd875a9ce4b59a715e2a8d5791922\": container with ID starting with e337b82b0019cb1fd626794a9ce15b5be48dd875a9ce4b59a715e2a8d5791922 not found: ID does not exist" containerID="e337b82b0019cb1fd626794a9ce15b5be48dd875a9ce4b59a715e2a8d5791922" Mar 20 15:48:11 crc kubenswrapper[4779]: I0320 15:48:11.786610 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e337b82b0019cb1fd626794a9ce15b5be48dd875a9ce4b59a715e2a8d5791922"} err="failed to get container status \"e337b82b0019cb1fd626794a9ce15b5be48dd875a9ce4b59a715e2a8d5791922\": rpc error: code = NotFound desc = could not find container \"e337b82b0019cb1fd626794a9ce15b5be48dd875a9ce4b59a715e2a8d5791922\": container with ID starting with e337b82b0019cb1fd626794a9ce15b5be48dd875a9ce4b59a715e2a8d5791922 not found: ID does not exist" Mar 20 15:48:11 crc kubenswrapper[4779]: I0320 15:48:11.786634 4779 scope.go:117] "RemoveContainer" containerID="6f7654598feaad1eaeaf6380851705e15d6bc41c739dc7208137fa4aaeb6edcd" Mar 20 15:48:11 crc kubenswrapper[4779]: E0320 15:48:11.787016 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f7654598feaad1eaeaf6380851705e15d6bc41c739dc7208137fa4aaeb6edcd\": container with ID starting with 6f7654598feaad1eaeaf6380851705e15d6bc41c739dc7208137fa4aaeb6edcd not found: ID does not exist" containerID="6f7654598feaad1eaeaf6380851705e15d6bc41c739dc7208137fa4aaeb6edcd" Mar 20 15:48:11 crc kubenswrapper[4779]: I0320 15:48:11.787041 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f7654598feaad1eaeaf6380851705e15d6bc41c739dc7208137fa4aaeb6edcd"} err="failed to get container status \"6f7654598feaad1eaeaf6380851705e15d6bc41c739dc7208137fa4aaeb6edcd\": rpc error: code = NotFound 
desc = could not find container \"6f7654598feaad1eaeaf6380851705e15d6bc41c739dc7208137fa4aaeb6edcd\": container with ID starting with 6f7654598feaad1eaeaf6380851705e15d6bc41c739dc7208137fa4aaeb6edcd not found: ID does not exist" Mar 20 15:48:11 crc kubenswrapper[4779]: I0320 15:48:11.787062 4779 scope.go:117] "RemoveContainer" containerID="7afdc016e90d8740063a3aa8bc504ec8fd43440f4a2421f6b976e55308d51732" Mar 20 15:48:11 crc kubenswrapper[4779]: E0320 15:48:11.787487 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7afdc016e90d8740063a3aa8bc504ec8fd43440f4a2421f6b976e55308d51732\": container with ID starting with 7afdc016e90d8740063a3aa8bc504ec8fd43440f4a2421f6b976e55308d51732 not found: ID does not exist" containerID="7afdc016e90d8740063a3aa8bc504ec8fd43440f4a2421f6b976e55308d51732" Mar 20 15:48:11 crc kubenswrapper[4779]: I0320 15:48:11.787516 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7afdc016e90d8740063a3aa8bc504ec8fd43440f4a2421f6b976e55308d51732"} err="failed to get container status \"7afdc016e90d8740063a3aa8bc504ec8fd43440f4a2421f6b976e55308d51732\": rpc error: code = NotFound desc = could not find container \"7afdc016e90d8740063a3aa8bc504ec8fd43440f4a2421f6b976e55308d51732\": container with ID starting with 7afdc016e90d8740063a3aa8bc504ec8fd43440f4a2421f6b976e55308d51732 not found: ID does not exist" Mar 20 15:48:11 crc kubenswrapper[4779]: I0320 15:48:11.821848 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52109ecb-141a-4d8d-95db-5fbc7275f7f1" path="/var/lib/kubelet/pods/52109ecb-141a-4d8d-95db-5fbc7275f7f1/volumes" Mar 20 15:48:11 crc kubenswrapper[4779]: I0320 15:48:11.822572 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978" path="/var/lib/kubelet/pods/cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978/volumes" Mar 20 15:48:12 crc 
kubenswrapper[4779]: I0320 15:48:12.055595 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.196731 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/000eb24b-2a93-42f6-ad70-2738972d9ba0-combined-ca-bundle\") pod \"000eb24b-2a93-42f6-ad70-2738972d9ba0\" (UID: \"000eb24b-2a93-42f6-ad70-2738972d9ba0\") " Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.196770 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/000eb24b-2a93-42f6-ad70-2738972d9ba0-run-httpd\") pod \"000eb24b-2a93-42f6-ad70-2738972d9ba0\" (UID: \"000eb24b-2a93-42f6-ad70-2738972d9ba0\") " Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.196803 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/000eb24b-2a93-42f6-ad70-2738972d9ba0-sg-core-conf-yaml\") pod \"000eb24b-2a93-42f6-ad70-2738972d9ba0\" (UID: \"000eb24b-2a93-42f6-ad70-2738972d9ba0\") " Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.196837 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/000eb24b-2a93-42f6-ad70-2738972d9ba0-config-data\") pod \"000eb24b-2a93-42f6-ad70-2738972d9ba0\" (UID: \"000eb24b-2a93-42f6-ad70-2738972d9ba0\") " Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.196868 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/000eb24b-2a93-42f6-ad70-2738972d9ba0-log-httpd\") pod \"000eb24b-2a93-42f6-ad70-2738972d9ba0\" (UID: \"000eb24b-2a93-42f6-ad70-2738972d9ba0\") " Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.197021 4779 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-swppb\" (UniqueName: \"kubernetes.io/projected/000eb24b-2a93-42f6-ad70-2738972d9ba0-kube-api-access-swppb\") pod \"000eb24b-2a93-42f6-ad70-2738972d9ba0\" (UID: \"000eb24b-2a93-42f6-ad70-2738972d9ba0\") " Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.197078 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/000eb24b-2a93-42f6-ad70-2738972d9ba0-scripts\") pod \"000eb24b-2a93-42f6-ad70-2738972d9ba0\" (UID: \"000eb24b-2a93-42f6-ad70-2738972d9ba0\") " Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.197273 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/000eb24b-2a93-42f6-ad70-2738972d9ba0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "000eb24b-2a93-42f6-ad70-2738972d9ba0" (UID: "000eb24b-2a93-42f6-ad70-2738972d9ba0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.197409 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/000eb24b-2a93-42f6-ad70-2738972d9ba0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "000eb24b-2a93-42f6-ad70-2738972d9ba0" (UID: "000eb24b-2a93-42f6-ad70-2738972d9ba0"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.197616 4779 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/000eb24b-2a93-42f6-ad70-2738972d9ba0-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.197634 4779 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/000eb24b-2a93-42f6-ad70-2738972d9ba0-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.202225 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/000eb24b-2a93-42f6-ad70-2738972d9ba0-scripts" (OuterVolumeSpecName: "scripts") pod "000eb24b-2a93-42f6-ad70-2738972d9ba0" (UID: "000eb24b-2a93-42f6-ad70-2738972d9ba0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.202472 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/000eb24b-2a93-42f6-ad70-2738972d9ba0-kube-api-access-swppb" (OuterVolumeSpecName: "kube-api-access-swppb") pod "000eb24b-2a93-42f6-ad70-2738972d9ba0" (UID: "000eb24b-2a93-42f6-ad70-2738972d9ba0"). InnerVolumeSpecName "kube-api-access-swppb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.223361 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/000eb24b-2a93-42f6-ad70-2738972d9ba0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "000eb24b-2a93-42f6-ad70-2738972d9ba0" (UID: "000eb24b-2a93-42f6-ad70-2738972d9ba0"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.287946 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/000eb24b-2a93-42f6-ad70-2738972d9ba0-config-data" (OuterVolumeSpecName: "config-data") pod "000eb24b-2a93-42f6-ad70-2738972d9ba0" (UID: "000eb24b-2a93-42f6-ad70-2738972d9ba0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.294923 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/000eb24b-2a93-42f6-ad70-2738972d9ba0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "000eb24b-2a93-42f6-ad70-2738972d9ba0" (UID: "000eb24b-2a93-42f6-ad70-2738972d9ba0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.299140 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/000eb24b-2a93-42f6-ad70-2738972d9ba0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.299161 4779 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/000eb24b-2a93-42f6-ad70-2738972d9ba0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.299172 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/000eb24b-2a93-42f6-ad70-2738972d9ba0-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.299181 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swppb\" (UniqueName: \"kubernetes.io/projected/000eb24b-2a93-42f6-ad70-2738972d9ba0-kube-api-access-swppb\") on node \"crc\" 
DevicePath \"\"" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.299410 4779 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/000eb24b-2a93-42f6-ad70-2738972d9ba0-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.465643 4779 generic.go:334] "Generic (PLEG): container finished" podID="000eb24b-2a93-42f6-ad70-2738972d9ba0" containerID="e51bbccd6eee280bead8cd02e1082776920e4ad1d437b9184a1de68d901bd189" exitCode=0 Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.465722 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"000eb24b-2a93-42f6-ad70-2738972d9ba0","Type":"ContainerDied","Data":"e51bbccd6eee280bead8cd02e1082776920e4ad1d437b9184a1de68d901bd189"} Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.465753 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"000eb24b-2a93-42f6-ad70-2738972d9ba0","Type":"ContainerDied","Data":"fc398b60acb15a75ad203792866b9ed7f69965116246b927aab8883ba0e03eaf"} Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.465774 4779 scope.go:117] "RemoveContainer" containerID="399f8fbe146f7599eebc475e689a3d7334e158a68c40033587172e65be1e37fa" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.465907 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.497512 4779 scope.go:117] "RemoveContainer" containerID="1ed86b45db2eded41c86ea3cb4d8f272d6503ab41144758d5b71c74c224c87a2" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.512066 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.529577 4779 scope.go:117] "RemoveContainer" containerID="c246bcde73bda6bc7b94cf1b319fb0be23c59e2d8f21af60c432e7bbd65e2d35" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.558999 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.576466 4779 scope.go:117] "RemoveContainer" containerID="e51bbccd6eee280bead8cd02e1082776920e4ad1d437b9184a1de68d901bd189" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.583129 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:48:12 crc kubenswrapper[4779]: E0320 15:48:12.583940 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="000eb24b-2a93-42f6-ad70-2738972d9ba0" containerName="ceilometer-notification-agent" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.583965 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="000eb24b-2a93-42f6-ad70-2738972d9ba0" containerName="ceilometer-notification-agent" Mar 20 15:48:12 crc kubenswrapper[4779]: E0320 15:48:12.583991 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978" containerName="registry-server" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.583999 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978" containerName="registry-server" Mar 20 15:48:12 crc kubenswrapper[4779]: E0320 15:48:12.584020 4779 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978" containerName="extract-utilities" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.584027 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978" containerName="extract-utilities" Mar 20 15:48:12 crc kubenswrapper[4779]: E0320 15:48:12.584045 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52109ecb-141a-4d8d-95db-5fbc7275f7f1" containerName="init" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.584053 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="52109ecb-141a-4d8d-95db-5fbc7275f7f1" containerName="init" Mar 20 15:48:12 crc kubenswrapper[4779]: E0320 15:48:12.584071 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="000eb24b-2a93-42f6-ad70-2738972d9ba0" containerName="proxy-httpd" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.584078 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="000eb24b-2a93-42f6-ad70-2738972d9ba0" containerName="proxy-httpd" Mar 20 15:48:12 crc kubenswrapper[4779]: E0320 15:48:12.584089 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52109ecb-141a-4d8d-95db-5fbc7275f7f1" containerName="dnsmasq-dns" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.584096 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="52109ecb-141a-4d8d-95db-5fbc7275f7f1" containerName="dnsmasq-dns" Mar 20 15:48:12 crc kubenswrapper[4779]: E0320 15:48:12.584120 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978" containerName="extract-content" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.584126 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978" containerName="extract-content" Mar 20 15:48:12 crc kubenswrapper[4779]: E0320 15:48:12.584140 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="000eb24b-2a93-42f6-ad70-2738972d9ba0" 
containerName="sg-core" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.584147 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="000eb24b-2a93-42f6-ad70-2738972d9ba0" containerName="sg-core" Mar 20 15:48:12 crc kubenswrapper[4779]: E0320 15:48:12.584164 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="000eb24b-2a93-42f6-ad70-2738972d9ba0" containerName="ceilometer-central-agent" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.584169 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="000eb24b-2a93-42f6-ad70-2738972d9ba0" containerName="ceilometer-central-agent" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.584541 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="52109ecb-141a-4d8d-95db-5fbc7275f7f1" containerName="dnsmasq-dns" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.584560 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="000eb24b-2a93-42f6-ad70-2738972d9ba0" containerName="ceilometer-central-agent" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.584599 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="000eb24b-2a93-42f6-ad70-2738972d9ba0" containerName="ceilometer-notification-agent" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.584616 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="000eb24b-2a93-42f6-ad70-2738972d9ba0" containerName="sg-core" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.584641 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="000eb24b-2a93-42f6-ad70-2738972d9ba0" containerName="proxy-httpd" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.584648 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb4ad3fa-7f7f-4d05-9a9f-2d95c16e0978" containerName="registry-server" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.588499 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.590898 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.590911 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.596425 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.646888 4779 scope.go:117] "RemoveContainer" containerID="399f8fbe146f7599eebc475e689a3d7334e158a68c40033587172e65be1e37fa" Mar 20 15:48:12 crc kubenswrapper[4779]: E0320 15:48:12.647324 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"399f8fbe146f7599eebc475e689a3d7334e158a68c40033587172e65be1e37fa\": container with ID starting with 399f8fbe146f7599eebc475e689a3d7334e158a68c40033587172e65be1e37fa not found: ID does not exist" containerID="399f8fbe146f7599eebc475e689a3d7334e158a68c40033587172e65be1e37fa" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.647356 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"399f8fbe146f7599eebc475e689a3d7334e158a68c40033587172e65be1e37fa"} err="failed to get container status \"399f8fbe146f7599eebc475e689a3d7334e158a68c40033587172e65be1e37fa\": rpc error: code = NotFound desc = could not find container \"399f8fbe146f7599eebc475e689a3d7334e158a68c40033587172e65be1e37fa\": container with ID starting with 399f8fbe146f7599eebc475e689a3d7334e158a68c40033587172e65be1e37fa not found: ID does not exist" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.647375 4779 scope.go:117] "RemoveContainer" containerID="1ed86b45db2eded41c86ea3cb4d8f272d6503ab41144758d5b71c74c224c87a2" Mar 20 15:48:12 crc kubenswrapper[4779]: E0320 
15:48:12.647732 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ed86b45db2eded41c86ea3cb4d8f272d6503ab41144758d5b71c74c224c87a2\": container with ID starting with 1ed86b45db2eded41c86ea3cb4d8f272d6503ab41144758d5b71c74c224c87a2 not found: ID does not exist" containerID="1ed86b45db2eded41c86ea3cb4d8f272d6503ab41144758d5b71c74c224c87a2" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.647770 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ed86b45db2eded41c86ea3cb4d8f272d6503ab41144758d5b71c74c224c87a2"} err="failed to get container status \"1ed86b45db2eded41c86ea3cb4d8f272d6503ab41144758d5b71c74c224c87a2\": rpc error: code = NotFound desc = could not find container \"1ed86b45db2eded41c86ea3cb4d8f272d6503ab41144758d5b71c74c224c87a2\": container with ID starting with 1ed86b45db2eded41c86ea3cb4d8f272d6503ab41144758d5b71c74c224c87a2 not found: ID does not exist" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.647796 4779 scope.go:117] "RemoveContainer" containerID="c246bcde73bda6bc7b94cf1b319fb0be23c59e2d8f21af60c432e7bbd65e2d35" Mar 20 15:48:12 crc kubenswrapper[4779]: E0320 15:48:12.648021 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c246bcde73bda6bc7b94cf1b319fb0be23c59e2d8f21af60c432e7bbd65e2d35\": container with ID starting with c246bcde73bda6bc7b94cf1b319fb0be23c59e2d8f21af60c432e7bbd65e2d35 not found: ID does not exist" containerID="c246bcde73bda6bc7b94cf1b319fb0be23c59e2d8f21af60c432e7bbd65e2d35" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.648039 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c246bcde73bda6bc7b94cf1b319fb0be23c59e2d8f21af60c432e7bbd65e2d35"} err="failed to get container status \"c246bcde73bda6bc7b94cf1b319fb0be23c59e2d8f21af60c432e7bbd65e2d35\": rpc 
error: code = NotFound desc = could not find container \"c246bcde73bda6bc7b94cf1b319fb0be23c59e2d8f21af60c432e7bbd65e2d35\": container with ID starting with c246bcde73bda6bc7b94cf1b319fb0be23c59e2d8f21af60c432e7bbd65e2d35 not found: ID does not exist" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.648054 4779 scope.go:117] "RemoveContainer" containerID="e51bbccd6eee280bead8cd02e1082776920e4ad1d437b9184a1de68d901bd189" Mar 20 15:48:12 crc kubenswrapper[4779]: E0320 15:48:12.648280 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e51bbccd6eee280bead8cd02e1082776920e4ad1d437b9184a1de68d901bd189\": container with ID starting with e51bbccd6eee280bead8cd02e1082776920e4ad1d437b9184a1de68d901bd189 not found: ID does not exist" containerID="e51bbccd6eee280bead8cd02e1082776920e4ad1d437b9184a1de68d901bd189" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.648299 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e51bbccd6eee280bead8cd02e1082776920e4ad1d437b9184a1de68d901bd189"} err="failed to get container status \"e51bbccd6eee280bead8cd02e1082776920e4ad1d437b9184a1de68d901bd189\": rpc error: code = NotFound desc = could not find container \"e51bbccd6eee280bead8cd02e1082776920e4ad1d437b9184a1de68d901bd189\": container with ID starting with e51bbccd6eee280bead8cd02e1082776920e4ad1d437b9184a1de68d901bd189 not found: ID does not exist" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.706727 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzsf6\" (UniqueName: \"kubernetes.io/projected/87fed41c-5259-4a97-9a57-72cab891f2c6-kube-api-access-tzsf6\") pod \"ceilometer-0\" (UID: \"87fed41c-5259-4a97-9a57-72cab891f2c6\") " pod="openstack/ceilometer-0" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.706981 4779 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87fed41c-5259-4a97-9a57-72cab891f2c6-config-data\") pod \"ceilometer-0\" (UID: \"87fed41c-5259-4a97-9a57-72cab891f2c6\") " pod="openstack/ceilometer-0" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.707000 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87fed41c-5259-4a97-9a57-72cab891f2c6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87fed41c-5259-4a97-9a57-72cab891f2c6\") " pod="openstack/ceilometer-0" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.707033 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87fed41c-5259-4a97-9a57-72cab891f2c6-log-httpd\") pod \"ceilometer-0\" (UID: \"87fed41c-5259-4a97-9a57-72cab891f2c6\") " pod="openstack/ceilometer-0" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.707052 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87fed41c-5259-4a97-9a57-72cab891f2c6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87fed41c-5259-4a97-9a57-72cab891f2c6\") " pod="openstack/ceilometer-0" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.707134 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87fed41c-5259-4a97-9a57-72cab891f2c6-scripts\") pod \"ceilometer-0\" (UID: \"87fed41c-5259-4a97-9a57-72cab891f2c6\") " pod="openstack/ceilometer-0" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.707306 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/87fed41c-5259-4a97-9a57-72cab891f2c6-run-httpd\") pod \"ceilometer-0\" (UID: \"87fed41c-5259-4a97-9a57-72cab891f2c6\") " pod="openstack/ceilometer-0" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.808873 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87fed41c-5259-4a97-9a57-72cab891f2c6-run-httpd\") pod \"ceilometer-0\" (UID: \"87fed41c-5259-4a97-9a57-72cab891f2c6\") " pod="openstack/ceilometer-0" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.808998 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzsf6\" (UniqueName: \"kubernetes.io/projected/87fed41c-5259-4a97-9a57-72cab891f2c6-kube-api-access-tzsf6\") pod \"ceilometer-0\" (UID: \"87fed41c-5259-4a97-9a57-72cab891f2c6\") " pod="openstack/ceilometer-0" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.809025 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87fed41c-5259-4a97-9a57-72cab891f2c6-config-data\") pod \"ceilometer-0\" (UID: \"87fed41c-5259-4a97-9a57-72cab891f2c6\") " pod="openstack/ceilometer-0" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.809047 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87fed41c-5259-4a97-9a57-72cab891f2c6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87fed41c-5259-4a97-9a57-72cab891f2c6\") " pod="openstack/ceilometer-0" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.809085 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87fed41c-5259-4a97-9a57-72cab891f2c6-log-httpd\") pod \"ceilometer-0\" (UID: \"87fed41c-5259-4a97-9a57-72cab891f2c6\") " pod="openstack/ceilometer-0" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 
15:48:12.809132 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87fed41c-5259-4a97-9a57-72cab891f2c6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87fed41c-5259-4a97-9a57-72cab891f2c6\") " pod="openstack/ceilometer-0" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.809208 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87fed41c-5259-4a97-9a57-72cab891f2c6-scripts\") pod \"ceilometer-0\" (UID: \"87fed41c-5259-4a97-9a57-72cab891f2c6\") " pod="openstack/ceilometer-0" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.809749 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87fed41c-5259-4a97-9a57-72cab891f2c6-run-httpd\") pod \"ceilometer-0\" (UID: \"87fed41c-5259-4a97-9a57-72cab891f2c6\") " pod="openstack/ceilometer-0" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.810635 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87fed41c-5259-4a97-9a57-72cab891f2c6-log-httpd\") pod \"ceilometer-0\" (UID: \"87fed41c-5259-4a97-9a57-72cab891f2c6\") " pod="openstack/ceilometer-0" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.813672 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87fed41c-5259-4a97-9a57-72cab891f2c6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87fed41c-5259-4a97-9a57-72cab891f2c6\") " pod="openstack/ceilometer-0" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.815135 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87fed41c-5259-4a97-9a57-72cab891f2c6-config-data\") pod \"ceilometer-0\" (UID: \"87fed41c-5259-4a97-9a57-72cab891f2c6\") " 
pod="openstack/ceilometer-0" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.815697 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87fed41c-5259-4a97-9a57-72cab891f2c6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87fed41c-5259-4a97-9a57-72cab891f2c6\") " pod="openstack/ceilometer-0" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.821758 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87fed41c-5259-4a97-9a57-72cab891f2c6-scripts\") pod \"ceilometer-0\" (UID: \"87fed41c-5259-4a97-9a57-72cab891f2c6\") " pod="openstack/ceilometer-0" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.842368 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzsf6\" (UniqueName: \"kubernetes.io/projected/87fed41c-5259-4a97-9a57-72cab891f2c6-kube-api-access-tzsf6\") pod \"ceilometer-0\" (UID: \"87fed41c-5259-4a97-9a57-72cab891f2c6\") " pod="openstack/ceilometer-0" Mar 20 15:48:12 crc kubenswrapper[4779]: I0320 15:48:12.948063 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:48:13 crc kubenswrapper[4779]: W0320 15:48:13.415008 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87fed41c_5259_4a97_9a57_72cab891f2c6.slice/crio-9a7f3d91c4b5509984cf04a79760b0e26020d4b7dd77471b8dbe1327c2ac8fb8 WatchSource:0}: Error finding container 9a7f3d91c4b5509984cf04a79760b0e26020d4b7dd77471b8dbe1327c2ac8fb8: Status 404 returned error can't find the container with id 9a7f3d91c4b5509984cf04a79760b0e26020d4b7dd77471b8dbe1327c2ac8fb8 Mar 20 15:48:13 crc kubenswrapper[4779]: I0320 15:48:13.419648 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:48:13 crc kubenswrapper[4779]: I0320 15:48:13.484051 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87fed41c-5259-4a97-9a57-72cab891f2c6","Type":"ContainerStarted","Data":"9a7f3d91c4b5509984cf04a79760b0e26020d4b7dd77471b8dbe1327c2ac8fb8"} Mar 20 15:48:13 crc kubenswrapper[4779]: I0320 15:48:13.840618 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="000eb24b-2a93-42f6-ad70-2738972d9ba0" path="/var/lib/kubelet/pods/000eb24b-2a93-42f6-ad70-2738972d9ba0/volumes" Mar 20 15:48:14 crc kubenswrapper[4779]: I0320 15:48:14.499024 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87fed41c-5259-4a97-9a57-72cab891f2c6","Type":"ContainerStarted","Data":"460b3950fe54e1bf7b52711deec65bf831905d913a21ac9016c3781222c4138c"} Mar 20 15:48:15 crc kubenswrapper[4779]: I0320 15:48:15.509967 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87fed41c-5259-4a97-9a57-72cab891f2c6","Type":"ContainerStarted","Data":"c21a7a9c12967adc9cfc407de35cfb107a6a5acf6360931bef29e38c8d8ae78f"} Mar 20 15:48:15 crc kubenswrapper[4779]: I0320 15:48:15.510348 4779 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"87fed41c-5259-4a97-9a57-72cab891f2c6","Type":"ContainerStarted","Data":"045f64d9d747c935724738fb6e73cd70b20734787db2e7f85c3e31c75ff95c45"} Mar 20 15:48:16 crc kubenswrapper[4779]: I0320 15:48:16.523580 4779 generic.go:334] "Generic (PLEG): container finished" podID="78be39d2-e84b-42b6-a4d4-7abaeaaf1a57" containerID="78b4899d7c894c3bee42d4192002ebcb53d21bb0825a660599d7ffafc76aa535" exitCode=0 Mar 20 15:48:16 crc kubenswrapper[4779]: I0320 15:48:16.523695 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vjfs8" event={"ID":"78be39d2-e84b-42b6-a4d4-7abaeaaf1a57","Type":"ContainerDied","Data":"78b4899d7c894c3bee42d4192002ebcb53d21bb0825a660599d7ffafc76aa535"} Mar 20 15:48:16 crc kubenswrapper[4779]: I0320 15:48:16.919398 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 15:48:16 crc kubenswrapper[4779]: I0320 15:48:16.919443 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 15:48:17 crc kubenswrapper[4779]: I0320 15:48:17.933290 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0e66aec5-6c2e-4935-9539-e5d3fc76ef5d" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.226:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 15:48:17 crc kubenswrapper[4779]: I0320 15:48:17.933662 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0e66aec5-6c2e-4935-9539-e5d3fc76ef5d" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.226:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 15:48:18 crc kubenswrapper[4779]: I0320 15:48:18.058270 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vjfs8" Mar 20 15:48:18 crc kubenswrapper[4779]: I0320 15:48:18.094607 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78be39d2-e84b-42b6-a4d4-7abaeaaf1a57-combined-ca-bundle\") pod \"78be39d2-e84b-42b6-a4d4-7abaeaaf1a57\" (UID: \"78be39d2-e84b-42b6-a4d4-7abaeaaf1a57\") " Mar 20 15:48:18 crc kubenswrapper[4779]: I0320 15:48:18.094650 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78be39d2-e84b-42b6-a4d4-7abaeaaf1a57-config-data\") pod \"78be39d2-e84b-42b6-a4d4-7abaeaaf1a57\" (UID: \"78be39d2-e84b-42b6-a4d4-7abaeaaf1a57\") " Mar 20 15:48:18 crc kubenswrapper[4779]: I0320 15:48:18.094686 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78be39d2-e84b-42b6-a4d4-7abaeaaf1a57-scripts\") pod \"78be39d2-e84b-42b6-a4d4-7abaeaaf1a57\" (UID: \"78be39d2-e84b-42b6-a4d4-7abaeaaf1a57\") " Mar 20 15:48:18 crc kubenswrapper[4779]: I0320 15:48:18.094874 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnw6l\" (UniqueName: \"kubernetes.io/projected/78be39d2-e84b-42b6-a4d4-7abaeaaf1a57-kube-api-access-rnw6l\") pod \"78be39d2-e84b-42b6-a4d4-7abaeaaf1a57\" (UID: \"78be39d2-e84b-42b6-a4d4-7abaeaaf1a57\") " Mar 20 15:48:18 crc kubenswrapper[4779]: I0320 15:48:18.101538 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78be39d2-e84b-42b6-a4d4-7abaeaaf1a57-scripts" (OuterVolumeSpecName: "scripts") pod "78be39d2-e84b-42b6-a4d4-7abaeaaf1a57" (UID: "78be39d2-e84b-42b6-a4d4-7abaeaaf1a57"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:48:18 crc kubenswrapper[4779]: I0320 15:48:18.101861 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78be39d2-e84b-42b6-a4d4-7abaeaaf1a57-kube-api-access-rnw6l" (OuterVolumeSpecName: "kube-api-access-rnw6l") pod "78be39d2-e84b-42b6-a4d4-7abaeaaf1a57" (UID: "78be39d2-e84b-42b6-a4d4-7abaeaaf1a57"). InnerVolumeSpecName "kube-api-access-rnw6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:48:18 crc kubenswrapper[4779]: I0320 15:48:18.128435 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78be39d2-e84b-42b6-a4d4-7abaeaaf1a57-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78be39d2-e84b-42b6-a4d4-7abaeaaf1a57" (UID: "78be39d2-e84b-42b6-a4d4-7abaeaaf1a57"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:48:18 crc kubenswrapper[4779]: I0320 15:48:18.131853 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78be39d2-e84b-42b6-a4d4-7abaeaaf1a57-config-data" (OuterVolumeSpecName: "config-data") pod "78be39d2-e84b-42b6-a4d4-7abaeaaf1a57" (UID: "78be39d2-e84b-42b6-a4d4-7abaeaaf1a57"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:48:18 crc kubenswrapper[4779]: I0320 15:48:18.197255 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnw6l\" (UniqueName: \"kubernetes.io/projected/78be39d2-e84b-42b6-a4d4-7abaeaaf1a57-kube-api-access-rnw6l\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:18 crc kubenswrapper[4779]: I0320 15:48:18.197297 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78be39d2-e84b-42b6-a4d4-7abaeaaf1a57-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:18 crc kubenswrapper[4779]: I0320 15:48:18.197308 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78be39d2-e84b-42b6-a4d4-7abaeaaf1a57-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:18 crc kubenswrapper[4779]: I0320 15:48:18.197319 4779 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78be39d2-e84b-42b6-a4d4-7abaeaaf1a57-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:18 crc kubenswrapper[4779]: I0320 15:48:18.551147 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vjfs8" event={"ID":"78be39d2-e84b-42b6-a4d4-7abaeaaf1a57","Type":"ContainerDied","Data":"96e579de45c391415d51cc29a4feed0f18a5132155d6f6b3857d4703b7d99236"} Mar 20 15:48:18 crc kubenswrapper[4779]: I0320 15:48:18.551206 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96e579de45c391415d51cc29a4feed0f18a5132155d6f6b3857d4703b7d99236" Mar 20 15:48:18 crc kubenswrapper[4779]: I0320 15:48:18.551170 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vjfs8" Mar 20 15:48:18 crc kubenswrapper[4779]: I0320 15:48:18.556510 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87fed41c-5259-4a97-9a57-72cab891f2c6","Type":"ContainerStarted","Data":"edc89d1b74ad2b786402beda3335225dcb2994d707bb64bf54fa87bdac037226"} Mar 20 15:48:18 crc kubenswrapper[4779]: I0320 15:48:18.556725 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 15:48:18 crc kubenswrapper[4779]: I0320 15:48:18.589280 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.537560235 podStartE2EDuration="6.589263298s" podCreationTimestamp="2026-03-20 15:48:12 +0000 UTC" firstStartedPulling="2026-03-20 15:48:13.417476633 +0000 UTC m=+1510.379992433" lastFinishedPulling="2026-03-20 15:48:17.469179706 +0000 UTC m=+1514.431695496" observedRunningTime="2026-03-20 15:48:18.58149693 +0000 UTC m=+1515.544012750" watchObservedRunningTime="2026-03-20 15:48:18.589263298 +0000 UTC m=+1515.551779098" Mar 20 15:48:19 crc kubenswrapper[4779]: I0320 15:48:19.131302 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 15:48:19 crc kubenswrapper[4779]: I0320 15:48:19.132687 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0e66aec5-6c2e-4935-9539-e5d3fc76ef5d" containerName="nova-api-log" containerID="cri-o://3f587f087d6bbfd4b6f4170d8a3bda4f3b119b82832552fd642e56f4974ae8ca" gracePeriod=30 Mar 20 15:48:19 crc kubenswrapper[4779]: I0320 15:48:19.133392 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0e66aec5-6c2e-4935-9539-e5d3fc76ef5d" containerName="nova-api-api" containerID="cri-o://6e6c26763d47871d7b4f28811a56296ab11704236e40769d374bb5ad230ac1f8" gracePeriod=30 Mar 20 15:48:19 crc 
kubenswrapper[4779]: I0320 15:48:19.142620 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 15:48:19 crc kubenswrapper[4779]: I0320 15:48:19.142813 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="57169c3f-afba-425a-9c80-a7698c0151ce" containerName="nova-scheduler-scheduler" containerID="cri-o://307555d734d8c31719459a6fab20efd0f6af775f1498d19e9deb098d8b393047" gracePeriod=30 Mar 20 15:48:19 crc kubenswrapper[4779]: I0320 15:48:19.197064 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 15:48:19 crc kubenswrapper[4779]: I0320 15:48:19.197652 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0bc56880-51cd-416e-a08c-1584fd7b5091" containerName="nova-metadata-metadata" containerID="cri-o://c74700b8bd64f8c4618fc6dd8e8c884983ee239ea75c2962ec4b8dec07a50c97" gracePeriod=30 Mar 20 15:48:19 crc kubenswrapper[4779]: I0320 15:48:19.198349 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0bc56880-51cd-416e-a08c-1584fd7b5091" containerName="nova-metadata-log" containerID="cri-o://531d688066602d92287c84d347ac4e65aedfd23f1b0a9b6c9a2fdb395a9e1425" gracePeriod=30 Mar 20 15:48:19 crc kubenswrapper[4779]: I0320 15:48:19.570371 4779 generic.go:334] "Generic (PLEG): container finished" podID="0bc56880-51cd-416e-a08c-1584fd7b5091" containerID="531d688066602d92287c84d347ac4e65aedfd23f1b0a9b6c9a2fdb395a9e1425" exitCode=143 Mar 20 15:48:19 crc kubenswrapper[4779]: I0320 15:48:19.570417 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0bc56880-51cd-416e-a08c-1584fd7b5091","Type":"ContainerDied","Data":"531d688066602d92287c84d347ac4e65aedfd23f1b0a9b6c9a2fdb395a9e1425"} Mar 20 15:48:19 crc kubenswrapper[4779]: I0320 15:48:19.572984 4779 generic.go:334] 
"Generic (PLEG): container finished" podID="0e66aec5-6c2e-4935-9539-e5d3fc76ef5d" containerID="3f587f087d6bbfd4b6f4170d8a3bda4f3b119b82832552fd642e56f4974ae8ca" exitCode=143 Mar 20 15:48:19 crc kubenswrapper[4779]: I0320 15:48:19.573063 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e66aec5-6c2e-4935-9539-e5d3fc76ef5d","Type":"ContainerDied","Data":"3f587f087d6bbfd4b6f4170d8a3bda4f3b119b82832552fd642e56f4974ae8ca"} Mar 20 15:48:19 crc kubenswrapper[4779]: E0320 15:48:19.698648 4779 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="307555d734d8c31719459a6fab20efd0f6af775f1498d19e9deb098d8b393047" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 15:48:19 crc kubenswrapper[4779]: E0320 15:48:19.699986 4779 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="307555d734d8c31719459a6fab20efd0f6af775f1498d19e9deb098d8b393047" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 15:48:19 crc kubenswrapper[4779]: E0320 15:48:19.701023 4779 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="307555d734d8c31719459a6fab20efd0f6af775f1498d19e9deb098d8b393047" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 15:48:19 crc kubenswrapper[4779]: E0320 15:48:19.701052 4779 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="57169c3f-afba-425a-9c80-a7698c0151ce" 
containerName="nova-scheduler-scheduler" Mar 20 15:48:22 crc kubenswrapper[4779]: I0320 15:48:22.601829 4779 generic.go:334] "Generic (PLEG): container finished" podID="0bc56880-51cd-416e-a08c-1584fd7b5091" containerID="c74700b8bd64f8c4618fc6dd8e8c884983ee239ea75c2962ec4b8dec07a50c97" exitCode=0 Mar 20 15:48:22 crc kubenswrapper[4779]: I0320 15:48:22.601929 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0bc56880-51cd-416e-a08c-1584fd7b5091","Type":"ContainerDied","Data":"c74700b8bd64f8c4618fc6dd8e8c884983ee239ea75c2962ec4b8dec07a50c97"} Mar 20 15:48:22 crc kubenswrapper[4779]: I0320 15:48:22.834162 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.034146 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84trj\" (UniqueName: \"kubernetes.io/projected/0bc56880-51cd-416e-a08c-1584fd7b5091-kube-api-access-84trj\") pod \"0bc56880-51cd-416e-a08c-1584fd7b5091\" (UID: \"0bc56880-51cd-416e-a08c-1584fd7b5091\") " Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.034526 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bc56880-51cd-416e-a08c-1584fd7b5091-combined-ca-bundle\") pod \"0bc56880-51cd-416e-a08c-1584fd7b5091\" (UID: \"0bc56880-51cd-416e-a08c-1584fd7b5091\") " Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.034576 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bc56880-51cd-416e-a08c-1584fd7b5091-logs\") pod \"0bc56880-51cd-416e-a08c-1584fd7b5091\" (UID: \"0bc56880-51cd-416e-a08c-1584fd7b5091\") " Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.035203 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/0bc56880-51cd-416e-a08c-1584fd7b5091-logs" (OuterVolumeSpecName: "logs") pod "0bc56880-51cd-416e-a08c-1584fd7b5091" (UID: "0bc56880-51cd-416e-a08c-1584fd7b5091"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.035299 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bc56880-51cd-416e-a08c-1584fd7b5091-nova-metadata-tls-certs\") pod \"0bc56880-51cd-416e-a08c-1584fd7b5091\" (UID: \"0bc56880-51cd-416e-a08c-1584fd7b5091\") " Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.035524 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bc56880-51cd-416e-a08c-1584fd7b5091-config-data\") pod \"0bc56880-51cd-416e-a08c-1584fd7b5091\" (UID: \"0bc56880-51cd-416e-a08c-1584fd7b5091\") " Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.035793 4779 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bc56880-51cd-416e-a08c-1584fd7b5091-logs\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.050365 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bc56880-51cd-416e-a08c-1584fd7b5091-kube-api-access-84trj" (OuterVolumeSpecName: "kube-api-access-84trj") pod "0bc56880-51cd-416e-a08c-1584fd7b5091" (UID: "0bc56880-51cd-416e-a08c-1584fd7b5091"). InnerVolumeSpecName "kube-api-access-84trj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.068347 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bc56880-51cd-416e-a08c-1584fd7b5091-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0bc56880-51cd-416e-a08c-1584fd7b5091" (UID: "0bc56880-51cd-416e-a08c-1584fd7b5091"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.069150 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bc56880-51cd-416e-a08c-1584fd7b5091-config-data" (OuterVolumeSpecName: "config-data") pod "0bc56880-51cd-416e-a08c-1584fd7b5091" (UID: "0bc56880-51cd-416e-a08c-1584fd7b5091"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.096830 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bc56880-51cd-416e-a08c-1584fd7b5091-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "0bc56880-51cd-416e-a08c-1584fd7b5091" (UID: "0bc56880-51cd-416e-a08c-1584fd7b5091"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.139498 4779 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bc56880-51cd-416e-a08c-1584fd7b5091-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.139540 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bc56880-51cd-416e-a08c-1584fd7b5091-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.139550 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84trj\" (UniqueName: \"kubernetes.io/projected/0bc56880-51cd-416e-a08c-1584fd7b5091-kube-api-access-84trj\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.139558 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bc56880-51cd-416e-a08c-1584fd7b5091-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.616417 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0bc56880-51cd-416e-a08c-1584fd7b5091","Type":"ContainerDied","Data":"513663f8832fa736d5238ee5ec7978a330c7b7f2aafd54d9c14aa4303709543a"} Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.616468 4779 scope.go:117] "RemoveContainer" containerID="c74700b8bd64f8c4618fc6dd8e8c884983ee239ea75c2962ec4b8dec07a50c97" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.616463 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.618668 4779 generic.go:334] "Generic (PLEG): container finished" podID="57169c3f-afba-425a-9c80-a7698c0151ce" containerID="307555d734d8c31719459a6fab20efd0f6af775f1498d19e9deb098d8b393047" exitCode=0 Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.618763 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"57169c3f-afba-425a-9c80-a7698c0151ce","Type":"ContainerDied","Data":"307555d734d8c31719459a6fab20efd0f6af775f1498d19e9deb098d8b393047"} Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.618794 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"57169c3f-afba-425a-9c80-a7698c0151ce","Type":"ContainerDied","Data":"cca43740745b6fd0b4b203f5c1aa0d64dcd210b976fbfd52da6f5fa7ac54e9d4"} Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.618805 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cca43740745b6fd0b4b203f5c1aa0d64dcd210b976fbfd52da6f5fa7ac54e9d4" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.621570 4779 generic.go:334] "Generic (PLEG): container finished" podID="0e66aec5-6c2e-4935-9539-e5d3fc76ef5d" containerID="6e6c26763d47871d7b4f28811a56296ab11704236e40769d374bb5ad230ac1f8" exitCode=0 Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.621622 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e66aec5-6c2e-4935-9539-e5d3fc76ef5d","Type":"ContainerDied","Data":"6e6c26763d47871d7b4f28811a56296ab11704236e40769d374bb5ad230ac1f8"} Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.695050 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.695361 4779 scope.go:117] "RemoveContainer" containerID="531d688066602d92287c84d347ac4e65aedfd23f1b0a9b6c9a2fdb395a9e1425" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.725008 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.792300 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.806077 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 15:48:23 crc kubenswrapper[4779]: E0320 15:48:23.806607 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bc56880-51cd-416e-a08c-1584fd7b5091" containerName="nova-metadata-metadata" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.806630 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bc56880-51cd-416e-a08c-1584fd7b5091" containerName="nova-metadata-metadata" Mar 20 15:48:23 crc kubenswrapper[4779]: E0320 15:48:23.806655 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bc56880-51cd-416e-a08c-1584fd7b5091" containerName="nova-metadata-log" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.806663 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bc56880-51cd-416e-a08c-1584fd7b5091" containerName="nova-metadata-log" Mar 20 15:48:23 crc kubenswrapper[4779]: E0320 15:48:23.806676 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57169c3f-afba-425a-9c80-a7698c0151ce" containerName="nova-scheduler-scheduler" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.806684 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="57169c3f-afba-425a-9c80-a7698c0151ce" containerName="nova-scheduler-scheduler" Mar 20 15:48:23 crc kubenswrapper[4779]: E0320 15:48:23.806709 4779 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78be39d2-e84b-42b6-a4d4-7abaeaaf1a57" containerName="nova-manage" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.806716 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="78be39d2-e84b-42b6-a4d4-7abaeaaf1a57" containerName="nova-manage" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.806926 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="57169c3f-afba-425a-9c80-a7698c0151ce" containerName="nova-scheduler-scheduler" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.806954 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="78be39d2-e84b-42b6-a4d4-7abaeaaf1a57" containerName="nova-manage" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.806966 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bc56880-51cd-416e-a08c-1584fd7b5091" containerName="nova-metadata-log" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.806984 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bc56880-51cd-416e-a08c-1584fd7b5091" containerName="nova-metadata-metadata" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.808842 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.823521 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.823815 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.826239 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bc56880-51cd-416e-a08c-1584fd7b5091" path="/var/lib/kubelet/pods/0bc56880-51cd-416e-a08c-1584fd7b5091/volumes" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.835887 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.875372 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57169c3f-afba-425a-9c80-a7698c0151ce-config-data\") pod \"57169c3f-afba-425a-9c80-a7698c0151ce\" (UID: \"57169c3f-afba-425a-9c80-a7698c0151ce\") " Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.875470 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57169c3f-afba-425a-9c80-a7698c0151ce-combined-ca-bundle\") pod \"57169c3f-afba-425a-9c80-a7698c0151ce\" (UID: \"57169c3f-afba-425a-9c80-a7698c0151ce\") " Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.875588 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfndd\" (UniqueName: \"kubernetes.io/projected/57169c3f-afba-425a-9c80-a7698c0151ce-kube-api-access-lfndd\") pod \"57169c3f-afba-425a-9c80-a7698c0151ce\" (UID: \"57169c3f-afba-425a-9c80-a7698c0151ce\") " Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.875996 4779 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x76d4\" (UniqueName: \"kubernetes.io/projected/a790ab0f-1ec3-486a-95bf-932d6c088c08-kube-api-access-x76d4\") pod \"nova-metadata-0\" (UID: \"a790ab0f-1ec3-486a-95bf-932d6c088c08\") " pod="openstack/nova-metadata-0" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.876030 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a790ab0f-1ec3-486a-95bf-932d6c088c08-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a790ab0f-1ec3-486a-95bf-932d6c088c08\") " pod="openstack/nova-metadata-0" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.876074 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a790ab0f-1ec3-486a-95bf-932d6c088c08-logs\") pod \"nova-metadata-0\" (UID: \"a790ab0f-1ec3-486a-95bf-932d6c088c08\") " pod="openstack/nova-metadata-0" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.876096 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a790ab0f-1ec3-486a-95bf-932d6c088c08-config-data\") pod \"nova-metadata-0\" (UID: \"a790ab0f-1ec3-486a-95bf-932d6c088c08\") " pod="openstack/nova-metadata-0" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.876155 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a790ab0f-1ec3-486a-95bf-932d6c088c08-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a790ab0f-1ec3-486a-95bf-932d6c088c08\") " pod="openstack/nova-metadata-0" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.896207 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/57169c3f-afba-425a-9c80-a7698c0151ce-kube-api-access-lfndd" (OuterVolumeSpecName: "kube-api-access-lfndd") pod "57169c3f-afba-425a-9c80-a7698c0151ce" (UID: "57169c3f-afba-425a-9c80-a7698c0151ce"). InnerVolumeSpecName "kube-api-access-lfndd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.913745 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57169c3f-afba-425a-9c80-a7698c0151ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57169c3f-afba-425a-9c80-a7698c0151ce" (UID: "57169c3f-afba-425a-9c80-a7698c0151ce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.915536 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57169c3f-afba-425a-9c80-a7698c0151ce-config-data" (OuterVolumeSpecName: "config-data") pod "57169c3f-afba-425a-9c80-a7698c0151ce" (UID: "57169c3f-afba-425a-9c80-a7698c0151ce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.969864 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.977350 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e66aec5-6c2e-4935-9539-e5d3fc76ef5d-public-tls-certs\") pod \"0e66aec5-6c2e-4935-9539-e5d3fc76ef5d\" (UID: \"0e66aec5-6c2e-4935-9539-e5d3fc76ef5d\") " Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.977466 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e66aec5-6c2e-4935-9539-e5d3fc76ef5d-config-data\") pod \"0e66aec5-6c2e-4935-9539-e5d3fc76ef5d\" (UID: \"0e66aec5-6c2e-4935-9539-e5d3fc76ef5d\") " Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.977547 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e66aec5-6c2e-4935-9539-e5d3fc76ef5d-logs\") pod \"0e66aec5-6c2e-4935-9539-e5d3fc76ef5d\" (UID: \"0e66aec5-6c2e-4935-9539-e5d3fc76ef5d\") " Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.977610 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e66aec5-6c2e-4935-9539-e5d3fc76ef5d-combined-ca-bundle\") pod \"0e66aec5-6c2e-4935-9539-e5d3fc76ef5d\" (UID: \"0e66aec5-6c2e-4935-9539-e5d3fc76ef5d\") " Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.977657 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w85s5\" (UniqueName: \"kubernetes.io/projected/0e66aec5-6c2e-4935-9539-e5d3fc76ef5d-kube-api-access-w85s5\") pod \"0e66aec5-6c2e-4935-9539-e5d3fc76ef5d\" (UID: \"0e66aec5-6c2e-4935-9539-e5d3fc76ef5d\") " Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.977680 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0e66aec5-6c2e-4935-9539-e5d3fc76ef5d-internal-tls-certs\") pod \"0e66aec5-6c2e-4935-9539-e5d3fc76ef5d\" (UID: \"0e66aec5-6c2e-4935-9539-e5d3fc76ef5d\") " Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.977912 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a790ab0f-1ec3-486a-95bf-932d6c088c08-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a790ab0f-1ec3-486a-95bf-932d6c088c08\") " pod="openstack/nova-metadata-0" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.978009 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x76d4\" (UniqueName: \"kubernetes.io/projected/a790ab0f-1ec3-486a-95bf-932d6c088c08-kube-api-access-x76d4\") pod \"nova-metadata-0\" (UID: \"a790ab0f-1ec3-486a-95bf-932d6c088c08\") " pod="openstack/nova-metadata-0" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.978041 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a790ab0f-1ec3-486a-95bf-932d6c088c08-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a790ab0f-1ec3-486a-95bf-932d6c088c08\") " pod="openstack/nova-metadata-0" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.978090 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a790ab0f-1ec3-486a-95bf-932d6c088c08-logs\") pod \"nova-metadata-0\" (UID: \"a790ab0f-1ec3-486a-95bf-932d6c088c08\") " pod="openstack/nova-metadata-0" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.978126 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a790ab0f-1ec3-486a-95bf-932d6c088c08-config-data\") pod \"nova-metadata-0\" (UID: \"a790ab0f-1ec3-486a-95bf-932d6c088c08\") " pod="openstack/nova-metadata-0" 
Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.978206 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfndd\" (UniqueName: \"kubernetes.io/projected/57169c3f-afba-425a-9c80-a7698c0151ce-kube-api-access-lfndd\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.978221 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57169c3f-afba-425a-9c80-a7698c0151ce-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.978233 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57169c3f-afba-425a-9c80-a7698c0151ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.979366 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e66aec5-6c2e-4935-9539-e5d3fc76ef5d-logs" (OuterVolumeSpecName: "logs") pod "0e66aec5-6c2e-4935-9539-e5d3fc76ef5d" (UID: "0e66aec5-6c2e-4935-9539-e5d3fc76ef5d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.979846 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a790ab0f-1ec3-486a-95bf-932d6c088c08-logs\") pod \"nova-metadata-0\" (UID: \"a790ab0f-1ec3-486a-95bf-932d6c088c08\") " pod="openstack/nova-metadata-0" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.982600 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e66aec5-6c2e-4935-9539-e5d3fc76ef5d-kube-api-access-w85s5" (OuterVolumeSpecName: "kube-api-access-w85s5") pod "0e66aec5-6c2e-4935-9539-e5d3fc76ef5d" (UID: "0e66aec5-6c2e-4935-9539-e5d3fc76ef5d"). InnerVolumeSpecName "kube-api-access-w85s5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.982908 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a790ab0f-1ec3-486a-95bf-932d6c088c08-config-data\") pod \"nova-metadata-0\" (UID: \"a790ab0f-1ec3-486a-95bf-932d6c088c08\") " pod="openstack/nova-metadata-0" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.982957 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a790ab0f-1ec3-486a-95bf-932d6c088c08-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a790ab0f-1ec3-486a-95bf-932d6c088c08\") " pod="openstack/nova-metadata-0" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.984827 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a790ab0f-1ec3-486a-95bf-932d6c088c08-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a790ab0f-1ec3-486a-95bf-932d6c088c08\") " pod="openstack/nova-metadata-0" Mar 20 15:48:23 crc kubenswrapper[4779]: I0320 15:48:23.999767 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x76d4\" (UniqueName: \"kubernetes.io/projected/a790ab0f-1ec3-486a-95bf-932d6c088c08-kube-api-access-x76d4\") pod \"nova-metadata-0\" (UID: \"a790ab0f-1ec3-486a-95bf-932d6c088c08\") " pod="openstack/nova-metadata-0" Mar 20 15:48:24 crc kubenswrapper[4779]: I0320 15:48:24.015190 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e66aec5-6c2e-4935-9539-e5d3fc76ef5d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e66aec5-6c2e-4935-9539-e5d3fc76ef5d" (UID: "0e66aec5-6c2e-4935-9539-e5d3fc76ef5d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:48:24 crc kubenswrapper[4779]: I0320 15:48:24.028429 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e66aec5-6c2e-4935-9539-e5d3fc76ef5d-config-data" (OuterVolumeSpecName: "config-data") pod "0e66aec5-6c2e-4935-9539-e5d3fc76ef5d" (UID: "0e66aec5-6c2e-4935-9539-e5d3fc76ef5d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:48:24 crc kubenswrapper[4779]: I0320 15:48:24.052341 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e66aec5-6c2e-4935-9539-e5d3fc76ef5d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0e66aec5-6c2e-4935-9539-e5d3fc76ef5d" (UID: "0e66aec5-6c2e-4935-9539-e5d3fc76ef5d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:48:24 crc kubenswrapper[4779]: I0320 15:48:24.052580 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e66aec5-6c2e-4935-9539-e5d3fc76ef5d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0e66aec5-6c2e-4935-9539-e5d3fc76ef5d" (UID: "0e66aec5-6c2e-4935-9539-e5d3fc76ef5d"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:48:24 crc kubenswrapper[4779]: I0320 15:48:24.080227 4779 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e66aec5-6c2e-4935-9539-e5d3fc76ef5d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:24 crc kubenswrapper[4779]: I0320 15:48:24.080264 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e66aec5-6c2e-4935-9539-e5d3fc76ef5d-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:24 crc kubenswrapper[4779]: I0320 15:48:24.080274 4779 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e66aec5-6c2e-4935-9539-e5d3fc76ef5d-logs\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:24 crc kubenswrapper[4779]: I0320 15:48:24.080282 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e66aec5-6c2e-4935-9539-e5d3fc76ef5d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:24 crc kubenswrapper[4779]: I0320 15:48:24.080290 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w85s5\" (UniqueName: \"kubernetes.io/projected/0e66aec5-6c2e-4935-9539-e5d3fc76ef5d-kube-api-access-w85s5\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:24 crc kubenswrapper[4779]: I0320 15:48:24.080301 4779 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e66aec5-6c2e-4935-9539-e5d3fc76ef5d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:24 crc kubenswrapper[4779]: I0320 15:48:24.145908 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 15:48:24 crc kubenswrapper[4779]: I0320 15:48:24.637136 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e66aec5-6c2e-4935-9539-e5d3fc76ef5d","Type":"ContainerDied","Data":"18ca909c169f613bbf4e314ef70fe6c332e69754a40739e5914a0d11a9d11c03"} Mar 20 15:48:24 crc kubenswrapper[4779]: I0320 15:48:24.637163 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 15:48:24 crc kubenswrapper[4779]: I0320 15:48:24.637198 4779 scope.go:117] "RemoveContainer" containerID="6e6c26763d47871d7b4f28811a56296ab11704236e40769d374bb5ad230ac1f8" Mar 20 15:48:24 crc kubenswrapper[4779]: I0320 15:48:24.642064 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 15:48:24 crc kubenswrapper[4779]: I0320 15:48:24.644539 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 15:48:24 crc kubenswrapper[4779]: W0320 15:48:24.646275 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda790ab0f_1ec3_486a_95bf_932d6c088c08.slice/crio-ae0a81a548b26d13465912384ce21322e7eef8b3c313ac2d39618592ddab565c WatchSource:0}: Error finding container ae0a81a548b26d13465912384ce21322e7eef8b3c313ac2d39618592ddab565c: Status 404 returned error can't find the container with id ae0a81a548b26d13465912384ce21322e7eef8b3c313ac2d39618592ddab565c Mar 20 15:48:24 crc kubenswrapper[4779]: I0320 15:48:24.706426 4779 scope.go:117] "RemoveContainer" containerID="3f587f087d6bbfd4b6f4170d8a3bda4f3b119b82832552fd642e56f4974ae8ca" Mar 20 15:48:24 crc kubenswrapper[4779]: I0320 15:48:24.771716 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 15:48:24 crc kubenswrapper[4779]: I0320 15:48:24.790739 4779 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/nova-api-0"] Mar 20 15:48:24 crc kubenswrapper[4779]: I0320 15:48:24.808262 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 15:48:24 crc kubenswrapper[4779]: I0320 15:48:24.832695 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 15:48:24 crc kubenswrapper[4779]: I0320 15:48:24.845433 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 15:48:24 crc kubenswrapper[4779]: E0320 15:48:24.845883 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e66aec5-6c2e-4935-9539-e5d3fc76ef5d" containerName="nova-api-log" Mar 20 15:48:24 crc kubenswrapper[4779]: I0320 15:48:24.845922 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e66aec5-6c2e-4935-9539-e5d3fc76ef5d" containerName="nova-api-log" Mar 20 15:48:24 crc kubenswrapper[4779]: E0320 15:48:24.845966 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e66aec5-6c2e-4935-9539-e5d3fc76ef5d" containerName="nova-api-api" Mar 20 15:48:24 crc kubenswrapper[4779]: I0320 15:48:24.845972 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e66aec5-6c2e-4935-9539-e5d3fc76ef5d" containerName="nova-api-api" Mar 20 15:48:24 crc kubenswrapper[4779]: I0320 15:48:24.846160 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e66aec5-6c2e-4935-9539-e5d3fc76ef5d" containerName="nova-api-api" Mar 20 15:48:24 crc kubenswrapper[4779]: I0320 15:48:24.846204 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e66aec5-6c2e-4935-9539-e5d3fc76ef5d" containerName="nova-api-log" Mar 20 15:48:24 crc kubenswrapper[4779]: I0320 15:48:24.848155 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 15:48:24 crc kubenswrapper[4779]: I0320 15:48:24.849805 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 20 15:48:24 crc kubenswrapper[4779]: I0320 15:48:24.850586 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 15:48:24 crc kubenswrapper[4779]: I0320 15:48:24.851403 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 20 15:48:24 crc kubenswrapper[4779]: I0320 15:48:24.856070 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 15:48:24 crc kubenswrapper[4779]: I0320 15:48:24.880950 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 15:48:24 crc kubenswrapper[4779]: I0320 15:48:24.893085 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 15:48:24 crc kubenswrapper[4779]: I0320 15:48:24.900639 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 15:48:24 crc kubenswrapper[4779]: I0320 15:48:24.902617 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 15:48:25 crc kubenswrapper[4779]: I0320 15:48:25.005798 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65cc86a8-23ad-428c-a20a-e932e6cfebd1-config-data\") pod \"nova-api-0\" (UID: \"65cc86a8-23ad-428c-a20a-e932e6cfebd1\") " pod="openstack/nova-api-0" Mar 20 15:48:25 crc kubenswrapper[4779]: I0320 15:48:25.005840 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lznmg\" (UniqueName: \"kubernetes.io/projected/65cc86a8-23ad-428c-a20a-e932e6cfebd1-kube-api-access-lznmg\") pod 
\"nova-api-0\" (UID: \"65cc86a8-23ad-428c-a20a-e932e6cfebd1\") " pod="openstack/nova-api-0" Mar 20 15:48:25 crc kubenswrapper[4779]: I0320 15:48:25.005862 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8jzb\" (UniqueName: \"kubernetes.io/projected/77e9c453-fbe2-4ad3-a2eb-c11a21ed70e5-kube-api-access-k8jzb\") pod \"nova-scheduler-0\" (UID: \"77e9c453-fbe2-4ad3-a2eb-c11a21ed70e5\") " pod="openstack/nova-scheduler-0" Mar 20 15:48:25 crc kubenswrapper[4779]: I0320 15:48:25.005939 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65cc86a8-23ad-428c-a20a-e932e6cfebd1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"65cc86a8-23ad-428c-a20a-e932e6cfebd1\") " pod="openstack/nova-api-0" Mar 20 15:48:25 crc kubenswrapper[4779]: I0320 15:48:25.005960 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65cc86a8-23ad-428c-a20a-e932e6cfebd1-logs\") pod \"nova-api-0\" (UID: \"65cc86a8-23ad-428c-a20a-e932e6cfebd1\") " pod="openstack/nova-api-0" Mar 20 15:48:25 crc kubenswrapper[4779]: I0320 15:48:25.005983 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65cc86a8-23ad-428c-a20a-e932e6cfebd1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"65cc86a8-23ad-428c-a20a-e932e6cfebd1\") " pod="openstack/nova-api-0" Mar 20 15:48:25 crc kubenswrapper[4779]: I0320 15:48:25.006007 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77e9c453-fbe2-4ad3-a2eb-c11a21ed70e5-config-data\") pod \"nova-scheduler-0\" (UID: \"77e9c453-fbe2-4ad3-a2eb-c11a21ed70e5\") " pod="openstack/nova-scheduler-0" Mar 20 15:48:25 crc 
kubenswrapper[4779]: I0320 15:48:25.006212 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65cc86a8-23ad-428c-a20a-e932e6cfebd1-public-tls-certs\") pod \"nova-api-0\" (UID: \"65cc86a8-23ad-428c-a20a-e932e6cfebd1\") " pod="openstack/nova-api-0" Mar 20 15:48:25 crc kubenswrapper[4779]: I0320 15:48:25.006304 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77e9c453-fbe2-4ad3-a2eb-c11a21ed70e5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"77e9c453-fbe2-4ad3-a2eb-c11a21ed70e5\") " pod="openstack/nova-scheduler-0" Mar 20 15:48:25 crc kubenswrapper[4779]: I0320 15:48:25.107609 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lznmg\" (UniqueName: \"kubernetes.io/projected/65cc86a8-23ad-428c-a20a-e932e6cfebd1-kube-api-access-lznmg\") pod \"nova-api-0\" (UID: \"65cc86a8-23ad-428c-a20a-e932e6cfebd1\") " pod="openstack/nova-api-0" Mar 20 15:48:25 crc kubenswrapper[4779]: I0320 15:48:25.107669 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8jzb\" (UniqueName: \"kubernetes.io/projected/77e9c453-fbe2-4ad3-a2eb-c11a21ed70e5-kube-api-access-k8jzb\") pod \"nova-scheduler-0\" (UID: \"77e9c453-fbe2-4ad3-a2eb-c11a21ed70e5\") " pod="openstack/nova-scheduler-0" Mar 20 15:48:25 crc kubenswrapper[4779]: I0320 15:48:25.107753 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65cc86a8-23ad-428c-a20a-e932e6cfebd1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"65cc86a8-23ad-428c-a20a-e932e6cfebd1\") " pod="openstack/nova-api-0" Mar 20 15:48:25 crc kubenswrapper[4779]: I0320 15:48:25.107776 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/65cc86a8-23ad-428c-a20a-e932e6cfebd1-logs\") pod \"nova-api-0\" (UID: \"65cc86a8-23ad-428c-a20a-e932e6cfebd1\") " pod="openstack/nova-api-0" Mar 20 15:48:25 crc kubenswrapper[4779]: I0320 15:48:25.107806 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65cc86a8-23ad-428c-a20a-e932e6cfebd1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"65cc86a8-23ad-428c-a20a-e932e6cfebd1\") " pod="openstack/nova-api-0" Mar 20 15:48:25 crc kubenswrapper[4779]: I0320 15:48:25.107830 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77e9c453-fbe2-4ad3-a2eb-c11a21ed70e5-config-data\") pod \"nova-scheduler-0\" (UID: \"77e9c453-fbe2-4ad3-a2eb-c11a21ed70e5\") " pod="openstack/nova-scheduler-0" Mar 20 15:48:25 crc kubenswrapper[4779]: I0320 15:48:25.107871 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65cc86a8-23ad-428c-a20a-e932e6cfebd1-public-tls-certs\") pod \"nova-api-0\" (UID: \"65cc86a8-23ad-428c-a20a-e932e6cfebd1\") " pod="openstack/nova-api-0" Mar 20 15:48:25 crc kubenswrapper[4779]: I0320 15:48:25.107898 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77e9c453-fbe2-4ad3-a2eb-c11a21ed70e5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"77e9c453-fbe2-4ad3-a2eb-c11a21ed70e5\") " pod="openstack/nova-scheduler-0" Mar 20 15:48:25 crc kubenswrapper[4779]: I0320 15:48:25.107982 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65cc86a8-23ad-428c-a20a-e932e6cfebd1-config-data\") pod \"nova-api-0\" (UID: \"65cc86a8-23ad-428c-a20a-e932e6cfebd1\") " pod="openstack/nova-api-0" Mar 20 15:48:25 crc 
kubenswrapper[4779]: I0320 15:48:25.109400 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65cc86a8-23ad-428c-a20a-e932e6cfebd1-logs\") pod \"nova-api-0\" (UID: \"65cc86a8-23ad-428c-a20a-e932e6cfebd1\") " pod="openstack/nova-api-0" Mar 20 15:48:25 crc kubenswrapper[4779]: I0320 15:48:25.113867 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77e9c453-fbe2-4ad3-a2eb-c11a21ed70e5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"77e9c453-fbe2-4ad3-a2eb-c11a21ed70e5\") " pod="openstack/nova-scheduler-0" Mar 20 15:48:25 crc kubenswrapper[4779]: I0320 15:48:25.115375 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65cc86a8-23ad-428c-a20a-e932e6cfebd1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"65cc86a8-23ad-428c-a20a-e932e6cfebd1\") " pod="openstack/nova-api-0" Mar 20 15:48:25 crc kubenswrapper[4779]: I0320 15:48:25.115484 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65cc86a8-23ad-428c-a20a-e932e6cfebd1-public-tls-certs\") pod \"nova-api-0\" (UID: \"65cc86a8-23ad-428c-a20a-e932e6cfebd1\") " pod="openstack/nova-api-0" Mar 20 15:48:25 crc kubenswrapper[4779]: I0320 15:48:25.121063 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65cc86a8-23ad-428c-a20a-e932e6cfebd1-config-data\") pod \"nova-api-0\" (UID: \"65cc86a8-23ad-428c-a20a-e932e6cfebd1\") " pod="openstack/nova-api-0" Mar 20 15:48:25 crc kubenswrapper[4779]: I0320 15:48:25.124072 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65cc86a8-23ad-428c-a20a-e932e6cfebd1-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"65cc86a8-23ad-428c-a20a-e932e6cfebd1\") " pod="openstack/nova-api-0" Mar 20 15:48:25 crc kubenswrapper[4779]: I0320 15:48:25.124800 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77e9c453-fbe2-4ad3-a2eb-c11a21ed70e5-config-data\") pod \"nova-scheduler-0\" (UID: \"77e9c453-fbe2-4ad3-a2eb-c11a21ed70e5\") " pod="openstack/nova-scheduler-0" Mar 20 15:48:25 crc kubenswrapper[4779]: I0320 15:48:25.128191 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lznmg\" (UniqueName: \"kubernetes.io/projected/65cc86a8-23ad-428c-a20a-e932e6cfebd1-kube-api-access-lznmg\") pod \"nova-api-0\" (UID: \"65cc86a8-23ad-428c-a20a-e932e6cfebd1\") " pod="openstack/nova-api-0" Mar 20 15:48:25 crc kubenswrapper[4779]: I0320 15:48:25.129982 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8jzb\" (UniqueName: \"kubernetes.io/projected/77e9c453-fbe2-4ad3-a2eb-c11a21ed70e5-kube-api-access-k8jzb\") pod \"nova-scheduler-0\" (UID: \"77e9c453-fbe2-4ad3-a2eb-c11a21ed70e5\") " pod="openstack/nova-scheduler-0" Mar 20 15:48:25 crc kubenswrapper[4779]: I0320 15:48:25.167717 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 15:48:25 crc kubenswrapper[4779]: I0320 15:48:25.308503 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 15:48:25 crc kubenswrapper[4779]: I0320 15:48:25.654626 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a790ab0f-1ec3-486a-95bf-932d6c088c08","Type":"ContainerStarted","Data":"9a21bc3458ebf6f5d2583c3ec3d5f0a6ad390b01ab373b65faa0ee8a093b361c"} Mar 20 15:48:25 crc kubenswrapper[4779]: I0320 15:48:25.655086 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a790ab0f-1ec3-486a-95bf-932d6c088c08","Type":"ContainerStarted","Data":"8eacfdcc4811005ccc3d85b9cde3ae9503157ffe1b093a662063892010bc2ea3"} Mar 20 15:48:25 crc kubenswrapper[4779]: I0320 15:48:25.655101 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a790ab0f-1ec3-486a-95bf-932d6c088c08","Type":"ContainerStarted","Data":"ae0a81a548b26d13465912384ce21322e7eef8b3c313ac2d39618592ddab565c"} Mar 20 15:48:25 crc kubenswrapper[4779]: I0320 15:48:25.698701 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.698679621 podStartE2EDuration="2.698679621s" podCreationTimestamp="2026-03-20 15:48:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:48:25.672132155 +0000 UTC m=+1522.634647975" watchObservedRunningTime="2026-03-20 15:48:25.698679621 +0000 UTC m=+1522.661195421" Mar 20 15:48:25 crc kubenswrapper[4779]: I0320 15:48:25.774809 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 15:48:25 crc kubenswrapper[4779]: I0320 15:48:25.823246 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e66aec5-6c2e-4935-9539-e5d3fc76ef5d" path="/var/lib/kubelet/pods/0e66aec5-6c2e-4935-9539-e5d3fc76ef5d/volumes" Mar 20 15:48:25 crc kubenswrapper[4779]: I0320 15:48:25.824610 4779 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57169c3f-afba-425a-9c80-a7698c0151ce" path="/var/lib/kubelet/pods/57169c3f-afba-425a-9c80-a7698c0151ce/volumes" Mar 20 15:48:25 crc kubenswrapper[4779]: I0320 15:48:25.870215 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 15:48:25 crc kubenswrapper[4779]: W0320 15:48:25.873347 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77e9c453_fbe2_4ad3_a2eb_c11a21ed70e5.slice/crio-b223c4c6d750c8261e6ba9a3fc69ec5dfab1f928f3f466364cf9212919853cd2 WatchSource:0}: Error finding container b223c4c6d750c8261e6ba9a3fc69ec5dfab1f928f3f466364cf9212919853cd2: Status 404 returned error can't find the container with id b223c4c6d750c8261e6ba9a3fc69ec5dfab1f928f3f466364cf9212919853cd2 Mar 20 15:48:26 crc kubenswrapper[4779]: I0320 15:48:26.668613 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"65cc86a8-23ad-428c-a20a-e932e6cfebd1","Type":"ContainerStarted","Data":"306b019b0197f93ca337cfe0f3a81d3cd7b13a6d2a4662fab224baef41355cac"} Mar 20 15:48:26 crc kubenswrapper[4779]: I0320 15:48:26.668653 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"65cc86a8-23ad-428c-a20a-e932e6cfebd1","Type":"ContainerStarted","Data":"460c01bffd06e319485ca1326c601374b2a8ae2cda7ad1506d6fde2b06f3268d"} Mar 20 15:48:26 crc kubenswrapper[4779]: I0320 15:48:26.668664 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"65cc86a8-23ad-428c-a20a-e932e6cfebd1","Type":"ContainerStarted","Data":"ebe3425ee41a07f16307d836626ccbef5b81b3c84afbb121d8f5565ab530b358"} Mar 20 15:48:26 crc kubenswrapper[4779]: I0320 15:48:26.680791 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"77e9c453-fbe2-4ad3-a2eb-c11a21ed70e5","Type":"ContainerStarted","Data":"6b081f57ab1b3d6c657e1223849d9774c4011df1659d8deaa5514a1bcaafb316"} Mar 20 15:48:26 crc kubenswrapper[4779]: I0320 15:48:26.681152 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"77e9c453-fbe2-4ad3-a2eb-c11a21ed70e5","Type":"ContainerStarted","Data":"b223c4c6d750c8261e6ba9a3fc69ec5dfab1f928f3f466364cf9212919853cd2"} Mar 20 15:48:26 crc kubenswrapper[4779]: I0320 15:48:26.691716 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.691696242 podStartE2EDuration="2.691696242s" podCreationTimestamp="2026-03-20 15:48:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:48:26.686752282 +0000 UTC m=+1523.649268102" watchObservedRunningTime="2026-03-20 15:48:26.691696242 +0000 UTC m=+1523.654212042" Mar 20 15:48:26 crc kubenswrapper[4779]: I0320 15:48:26.708504 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.70848376 podStartE2EDuration="2.70848376s" podCreationTimestamp="2026-03-20 15:48:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:48:26.700695652 +0000 UTC m=+1523.663211462" watchObservedRunningTime="2026-03-20 15:48:26.70848376 +0000 UTC m=+1523.670999570" Mar 20 15:48:30 crc kubenswrapper[4779]: I0320 15:48:30.309757 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 15:48:34 crc kubenswrapper[4779]: I0320 15:48:34.146477 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 15:48:34 crc kubenswrapper[4779]: I0320 15:48:34.147482 4779 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 15:48:35 crc kubenswrapper[4779]: I0320 15:48:35.160343 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a790ab0f-1ec3-486a-95bf-932d6c088c08" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.229:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 15:48:35 crc kubenswrapper[4779]: I0320 15:48:35.160344 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a790ab0f-1ec3-486a-95bf-932d6c088c08" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.229:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 15:48:35 crc kubenswrapper[4779]: I0320 15:48:35.168024 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 15:48:35 crc kubenswrapper[4779]: I0320 15:48:35.168095 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 15:48:35 crc kubenswrapper[4779]: I0320 15:48:35.309081 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 15:48:35 crc kubenswrapper[4779]: I0320 15:48:35.335692 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 15:48:35 crc kubenswrapper[4779]: I0320 15:48:35.790428 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 15:48:36 crc kubenswrapper[4779]: I0320 15:48:36.186370 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="65cc86a8-23ad-428c-a20a-e932e6cfebd1" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.230:8774/\": net/http: request canceled (Client.Timeout 
exceeded while awaiting headers)" Mar 20 15:48:36 crc kubenswrapper[4779]: I0320 15:48:36.186462 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="65cc86a8-23ad-428c-a20a-e932e6cfebd1" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.230:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 15:48:40 crc kubenswrapper[4779]: I0320 15:48:40.019591 4779 scope.go:117] "RemoveContainer" containerID="398f8029790e97bdb52d60a3e2fa8e76826115ffbdf452916afd083a637188be" Mar 20 15:48:42 crc kubenswrapper[4779]: I0320 15:48:42.146750 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 15:48:42 crc kubenswrapper[4779]: I0320 15:48:42.147063 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 15:48:42 crc kubenswrapper[4779]: I0320 15:48:42.958645 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 15:48:43 crc kubenswrapper[4779]: I0320 15:48:43.168200 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 15:48:43 crc kubenswrapper[4779]: I0320 15:48:43.168253 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 15:48:44 crc kubenswrapper[4779]: I0320 15:48:44.151308 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 15:48:44 crc kubenswrapper[4779]: I0320 15:48:44.152675 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 15:48:44 crc kubenswrapper[4779]: I0320 15:48:44.154653 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 15:48:44 crc kubenswrapper[4779]: I0320 15:48:44.891767 4779 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 15:48:45 crc kubenswrapper[4779]: I0320 15:48:45.173664 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 15:48:45 crc kubenswrapper[4779]: I0320 15:48:45.174244 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 15:48:45 crc kubenswrapper[4779]: I0320 15:48:45.184564 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 15:48:45 crc kubenswrapper[4779]: I0320 15:48:45.910327 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 15:48:46 crc kubenswrapper[4779]: I0320 15:48:46.803382 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 15:48:46 crc kubenswrapper[4779]: I0320 15:48:46.804100 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="49a61cbd-2e62-4e6a-8201-e4d0761885a7" containerName="kube-state-metrics" containerID="cri-o://49ae739bd70117adbae916a5cab791713c3f3b366468b63a172e726189c84e44" gracePeriod=30 Mar 20 15:48:47 crc kubenswrapper[4779]: E0320 15:48:47.185372 4779 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49a61cbd_2e62_4e6a_8201_e4d0761885a7.slice/crio-conmon-49ae739bd70117adbae916a5cab791713c3f3b366468b63a172e726189c84e44.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49a61cbd_2e62_4e6a_8201_e4d0761885a7.slice/crio-49ae739bd70117adbae916a5cab791713c3f3b366468b63a172e726189c84e44.scope\": RecentStats: unable to find data in memory cache]" Mar 20 15:48:47 crc kubenswrapper[4779]: I0320 15:48:47.918359 4779 generic.go:334] "Generic (PLEG): 
container finished" podID="49a61cbd-2e62-4e6a-8201-e4d0761885a7" containerID="49ae739bd70117adbae916a5cab791713c3f3b366468b63a172e726189c84e44" exitCode=2 Mar 20 15:48:47 crc kubenswrapper[4779]: I0320 15:48:47.920299 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"49a61cbd-2e62-4e6a-8201-e4d0761885a7","Type":"ContainerDied","Data":"49ae739bd70117adbae916a5cab791713c3f3b366468b63a172e726189c84e44"} Mar 20 15:48:47 crc kubenswrapper[4779]: I0320 15:48:47.920340 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"49a61cbd-2e62-4e6a-8201-e4d0761885a7","Type":"ContainerDied","Data":"57cfcf5ea6ef366b8211f4a7daf6bbffa9cffd716637322653e9459fed943e02"} Mar 20 15:48:47 crc kubenswrapper[4779]: I0320 15:48:47.920359 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57cfcf5ea6ef366b8211f4a7daf6bbffa9cffd716637322653e9459fed943e02" Mar 20 15:48:47 crc kubenswrapper[4779]: I0320 15:48:47.962924 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 15:48:48 crc kubenswrapper[4779]: I0320 15:48:48.066262 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdzn2\" (UniqueName: \"kubernetes.io/projected/49a61cbd-2e62-4e6a-8201-e4d0761885a7-kube-api-access-rdzn2\") pod \"49a61cbd-2e62-4e6a-8201-e4d0761885a7\" (UID: \"49a61cbd-2e62-4e6a-8201-e4d0761885a7\") " Mar 20 15:48:48 crc kubenswrapper[4779]: I0320 15:48:48.073857 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49a61cbd-2e62-4e6a-8201-e4d0761885a7-kube-api-access-rdzn2" (OuterVolumeSpecName: "kube-api-access-rdzn2") pod "49a61cbd-2e62-4e6a-8201-e4d0761885a7" (UID: "49a61cbd-2e62-4e6a-8201-e4d0761885a7"). InnerVolumeSpecName "kube-api-access-rdzn2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:48:48 crc kubenswrapper[4779]: I0320 15:48:48.169184 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdzn2\" (UniqueName: \"kubernetes.io/projected/49a61cbd-2e62-4e6a-8201-e4d0761885a7-kube-api-access-rdzn2\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:48 crc kubenswrapper[4779]: I0320 15:48:48.859650 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:48:48 crc kubenswrapper[4779]: I0320 15:48:48.860345 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="87fed41c-5259-4a97-9a57-72cab891f2c6" containerName="proxy-httpd" containerID="cri-o://edc89d1b74ad2b786402beda3335225dcb2994d707bb64bf54fa87bdac037226" gracePeriod=30 Mar 20 15:48:48 crc kubenswrapper[4779]: I0320 15:48:48.860331 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="87fed41c-5259-4a97-9a57-72cab891f2c6" containerName="ceilometer-central-agent" containerID="cri-o://460b3950fe54e1bf7b52711deec65bf831905d913a21ac9016c3781222c4138c" gracePeriod=30 Mar 20 15:48:48 crc kubenswrapper[4779]: I0320 15:48:48.860386 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="87fed41c-5259-4a97-9a57-72cab891f2c6" containerName="sg-core" containerID="cri-o://c21a7a9c12967adc9cfc407de35cfb107a6a5acf6360931bef29e38c8d8ae78f" gracePeriod=30 Mar 20 15:48:48 crc kubenswrapper[4779]: I0320 15:48:48.860477 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="87fed41c-5259-4a97-9a57-72cab891f2c6" containerName="ceilometer-notification-agent" containerID="cri-o://045f64d9d747c935724738fb6e73cd70b20734787db2e7f85c3e31c75ff95c45" gracePeriod=30 Mar 20 15:48:48 crc kubenswrapper[4779]: I0320 15:48:48.926888 4779 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 15:48:48 crc kubenswrapper[4779]: I0320 15:48:48.960162 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 15:48:48 crc kubenswrapper[4779]: I0320 15:48:48.988451 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 15:48:49 crc kubenswrapper[4779]: I0320 15:48:49.001309 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 15:48:49 crc kubenswrapper[4779]: E0320 15:48:49.001872 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49a61cbd-2e62-4e6a-8201-e4d0761885a7" containerName="kube-state-metrics" Mar 20 15:48:49 crc kubenswrapper[4779]: I0320 15:48:49.001890 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="49a61cbd-2e62-4e6a-8201-e4d0761885a7" containerName="kube-state-metrics" Mar 20 15:48:49 crc kubenswrapper[4779]: I0320 15:48:49.002132 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="49a61cbd-2e62-4e6a-8201-e4d0761885a7" containerName="kube-state-metrics" Mar 20 15:48:49 crc kubenswrapper[4779]: I0320 15:48:49.002978 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 15:48:49 crc kubenswrapper[4779]: I0320 15:48:49.005640 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 20 15:48:49 crc kubenswrapper[4779]: I0320 15:48:49.006374 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 20 15:48:49 crc kubenswrapper[4779]: I0320 15:48:49.009006 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 15:48:49 crc kubenswrapper[4779]: I0320 15:48:49.087791 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6a048aaf-5953-4a6d-8aa3-631fe9ea027b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6a048aaf-5953-4a6d-8aa3-631fe9ea027b\") " pod="openstack/kube-state-metrics-0" Mar 20 15:48:49 crc kubenswrapper[4779]: I0320 15:48:49.088032 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxnl6\" (UniqueName: \"kubernetes.io/projected/6a048aaf-5953-4a6d-8aa3-631fe9ea027b-kube-api-access-rxnl6\") pod \"kube-state-metrics-0\" (UID: \"6a048aaf-5953-4a6d-8aa3-631fe9ea027b\") " pod="openstack/kube-state-metrics-0" Mar 20 15:48:49 crc kubenswrapper[4779]: I0320 15:48:49.088231 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a048aaf-5953-4a6d-8aa3-631fe9ea027b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6a048aaf-5953-4a6d-8aa3-631fe9ea027b\") " pod="openstack/kube-state-metrics-0" Mar 20 15:48:49 crc kubenswrapper[4779]: I0320 15:48:49.088281 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6a048aaf-5953-4a6d-8aa3-631fe9ea027b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6a048aaf-5953-4a6d-8aa3-631fe9ea027b\") " pod="openstack/kube-state-metrics-0" Mar 20 15:48:49 crc kubenswrapper[4779]: I0320 15:48:49.190374 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6a048aaf-5953-4a6d-8aa3-631fe9ea027b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6a048aaf-5953-4a6d-8aa3-631fe9ea027b\") " pod="openstack/kube-state-metrics-0" Mar 20 15:48:49 crc kubenswrapper[4779]: I0320 15:48:49.190584 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxnl6\" (UniqueName: \"kubernetes.io/projected/6a048aaf-5953-4a6d-8aa3-631fe9ea027b-kube-api-access-rxnl6\") pod \"kube-state-metrics-0\" (UID: \"6a048aaf-5953-4a6d-8aa3-631fe9ea027b\") " pod="openstack/kube-state-metrics-0" Mar 20 15:48:49 crc kubenswrapper[4779]: I0320 15:48:49.190729 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a048aaf-5953-4a6d-8aa3-631fe9ea027b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6a048aaf-5953-4a6d-8aa3-631fe9ea027b\") " pod="openstack/kube-state-metrics-0" Mar 20 15:48:49 crc kubenswrapper[4779]: I0320 15:48:49.190759 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a048aaf-5953-4a6d-8aa3-631fe9ea027b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6a048aaf-5953-4a6d-8aa3-631fe9ea027b\") " pod="openstack/kube-state-metrics-0" Mar 20 15:48:49 crc kubenswrapper[4779]: I0320 15:48:49.196997 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6a048aaf-5953-4a6d-8aa3-631fe9ea027b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6a048aaf-5953-4a6d-8aa3-631fe9ea027b\") " pod="openstack/kube-state-metrics-0" Mar 20 15:48:49 crc kubenswrapper[4779]: I0320 15:48:49.197160 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a048aaf-5953-4a6d-8aa3-631fe9ea027b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6a048aaf-5953-4a6d-8aa3-631fe9ea027b\") " pod="openstack/kube-state-metrics-0" Mar 20 15:48:49 crc kubenswrapper[4779]: I0320 15:48:49.197580 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6a048aaf-5953-4a6d-8aa3-631fe9ea027b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6a048aaf-5953-4a6d-8aa3-631fe9ea027b\") " pod="openstack/kube-state-metrics-0" Mar 20 15:48:49 crc kubenswrapper[4779]: I0320 15:48:49.205529 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxnl6\" (UniqueName: \"kubernetes.io/projected/6a048aaf-5953-4a6d-8aa3-631fe9ea027b-kube-api-access-rxnl6\") pod \"kube-state-metrics-0\" (UID: \"6a048aaf-5953-4a6d-8aa3-631fe9ea027b\") " pod="openstack/kube-state-metrics-0" Mar 20 15:48:49 crc kubenswrapper[4779]: I0320 15:48:49.373290 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 15:48:49 crc kubenswrapper[4779]: I0320 15:48:49.820993 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49a61cbd-2e62-4e6a-8201-e4d0761885a7" path="/var/lib/kubelet/pods/49a61cbd-2e62-4e6a-8201-e4d0761885a7/volumes" Mar 20 15:48:49 crc kubenswrapper[4779]: I0320 15:48:49.939461 4779 generic.go:334] "Generic (PLEG): container finished" podID="87fed41c-5259-4a97-9a57-72cab891f2c6" containerID="edc89d1b74ad2b786402beda3335225dcb2994d707bb64bf54fa87bdac037226" exitCode=0 Mar 20 15:48:49 crc kubenswrapper[4779]: I0320 15:48:49.939491 4779 generic.go:334] "Generic (PLEG): container finished" podID="87fed41c-5259-4a97-9a57-72cab891f2c6" containerID="c21a7a9c12967adc9cfc407de35cfb107a6a5acf6360931bef29e38c8d8ae78f" exitCode=2 Mar 20 15:48:49 crc kubenswrapper[4779]: I0320 15:48:49.939502 4779 generic.go:334] "Generic (PLEG): container finished" podID="87fed41c-5259-4a97-9a57-72cab891f2c6" containerID="460b3950fe54e1bf7b52711deec65bf831905d913a21ac9016c3781222c4138c" exitCode=0 Mar 20 15:48:49 crc kubenswrapper[4779]: I0320 15:48:49.939522 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87fed41c-5259-4a97-9a57-72cab891f2c6","Type":"ContainerDied","Data":"edc89d1b74ad2b786402beda3335225dcb2994d707bb64bf54fa87bdac037226"} Mar 20 15:48:49 crc kubenswrapper[4779]: I0320 15:48:49.939548 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87fed41c-5259-4a97-9a57-72cab891f2c6","Type":"ContainerDied","Data":"c21a7a9c12967adc9cfc407de35cfb107a6a5acf6360931bef29e38c8d8ae78f"} Mar 20 15:48:49 crc kubenswrapper[4779]: I0320 15:48:49.939559 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87fed41c-5259-4a97-9a57-72cab891f2c6","Type":"ContainerDied","Data":"460b3950fe54e1bf7b52711deec65bf831905d913a21ac9016c3781222c4138c"} Mar 20 15:48:49 crc 
kubenswrapper[4779]: I0320 15:48:49.943806 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 15:48:50 crc kubenswrapper[4779]: I0320 15:48:50.958687 4779 generic.go:334] "Generic (PLEG): container finished" podID="87fed41c-5259-4a97-9a57-72cab891f2c6" containerID="045f64d9d747c935724738fb6e73cd70b20734787db2e7f85c3e31c75ff95c45" exitCode=0 Mar 20 15:48:50 crc kubenswrapper[4779]: I0320 15:48:50.958786 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87fed41c-5259-4a97-9a57-72cab891f2c6","Type":"ContainerDied","Data":"045f64d9d747c935724738fb6e73cd70b20734787db2e7f85c3e31c75ff95c45"} Mar 20 15:48:50 crc kubenswrapper[4779]: I0320 15:48:50.962906 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6a048aaf-5953-4a6d-8aa3-631fe9ea027b","Type":"ContainerStarted","Data":"213b2b111be57c39297703e4185046edc8d9dde80f42645abc8578f1887fa978"} Mar 20 15:48:50 crc kubenswrapper[4779]: I0320 15:48:50.962971 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6a048aaf-5953-4a6d-8aa3-631fe9ea027b","Type":"ContainerStarted","Data":"35f6930ef1486a8b28031e277cf38995310db529e7da36ebf190dffd04962ab3"} Mar 20 15:48:50 crc kubenswrapper[4779]: I0320 15:48:50.963925 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 20 15:48:50 crc kubenswrapper[4779]: I0320 15:48:50.990007 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.619049033 podStartE2EDuration="2.989989715s" podCreationTimestamp="2026-03-20 15:48:48 +0000 UTC" firstStartedPulling="2026-03-20 15:48:49.951337294 +0000 UTC m=+1546.913853094" lastFinishedPulling="2026-03-20 15:48:50.322277976 +0000 UTC m=+1547.284793776" observedRunningTime="2026-03-20 15:48:50.988355585 +0000 UTC 
m=+1547.950871395" watchObservedRunningTime="2026-03-20 15:48:50.989989715 +0000 UTC m=+1547.952505515" Mar 20 15:48:51 crc kubenswrapper[4779]: I0320 15:48:51.004350 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:48:51 crc kubenswrapper[4779]: I0320 15:48:51.025820 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87fed41c-5259-4a97-9a57-72cab891f2c6-run-httpd\") pod \"87fed41c-5259-4a97-9a57-72cab891f2c6\" (UID: \"87fed41c-5259-4a97-9a57-72cab891f2c6\") " Mar 20 15:48:51 crc kubenswrapper[4779]: I0320 15:48:51.025864 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87fed41c-5259-4a97-9a57-72cab891f2c6-config-data\") pod \"87fed41c-5259-4a97-9a57-72cab891f2c6\" (UID: \"87fed41c-5259-4a97-9a57-72cab891f2c6\") " Mar 20 15:48:51 crc kubenswrapper[4779]: I0320 15:48:51.025894 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87fed41c-5259-4a97-9a57-72cab891f2c6-combined-ca-bundle\") pod \"87fed41c-5259-4a97-9a57-72cab891f2c6\" (UID: \"87fed41c-5259-4a97-9a57-72cab891f2c6\") " Mar 20 15:48:51 crc kubenswrapper[4779]: I0320 15:48:51.025985 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87fed41c-5259-4a97-9a57-72cab891f2c6-scripts\") pod \"87fed41c-5259-4a97-9a57-72cab891f2c6\" (UID: \"87fed41c-5259-4a97-9a57-72cab891f2c6\") " Mar 20 15:48:51 crc kubenswrapper[4779]: I0320 15:48:51.026012 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzsf6\" (UniqueName: \"kubernetes.io/projected/87fed41c-5259-4a97-9a57-72cab891f2c6-kube-api-access-tzsf6\") pod \"87fed41c-5259-4a97-9a57-72cab891f2c6\" (UID: 
\"87fed41c-5259-4a97-9a57-72cab891f2c6\") " Mar 20 15:48:51 crc kubenswrapper[4779]: I0320 15:48:51.026179 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87fed41c-5259-4a97-9a57-72cab891f2c6-log-httpd\") pod \"87fed41c-5259-4a97-9a57-72cab891f2c6\" (UID: \"87fed41c-5259-4a97-9a57-72cab891f2c6\") " Mar 20 15:48:51 crc kubenswrapper[4779]: I0320 15:48:51.026214 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87fed41c-5259-4a97-9a57-72cab891f2c6-sg-core-conf-yaml\") pod \"87fed41c-5259-4a97-9a57-72cab891f2c6\" (UID: \"87fed41c-5259-4a97-9a57-72cab891f2c6\") " Mar 20 15:48:51 crc kubenswrapper[4779]: I0320 15:48:51.029795 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87fed41c-5259-4a97-9a57-72cab891f2c6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "87fed41c-5259-4a97-9a57-72cab891f2c6" (UID: "87fed41c-5259-4a97-9a57-72cab891f2c6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:48:51 crc kubenswrapper[4779]: I0320 15:48:51.031541 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87fed41c-5259-4a97-9a57-72cab891f2c6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "87fed41c-5259-4a97-9a57-72cab891f2c6" (UID: "87fed41c-5259-4a97-9a57-72cab891f2c6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:48:51 crc kubenswrapper[4779]: I0320 15:48:51.037085 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87fed41c-5259-4a97-9a57-72cab891f2c6-kube-api-access-tzsf6" (OuterVolumeSpecName: "kube-api-access-tzsf6") pod "87fed41c-5259-4a97-9a57-72cab891f2c6" (UID: "87fed41c-5259-4a97-9a57-72cab891f2c6"). 
InnerVolumeSpecName "kube-api-access-tzsf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:48:51 crc kubenswrapper[4779]: I0320 15:48:51.038238 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87fed41c-5259-4a97-9a57-72cab891f2c6-scripts" (OuterVolumeSpecName: "scripts") pod "87fed41c-5259-4a97-9a57-72cab891f2c6" (UID: "87fed41c-5259-4a97-9a57-72cab891f2c6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:48:51 crc kubenswrapper[4779]: I0320 15:48:51.073381 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87fed41c-5259-4a97-9a57-72cab891f2c6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "87fed41c-5259-4a97-9a57-72cab891f2c6" (UID: "87fed41c-5259-4a97-9a57-72cab891f2c6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:48:51 crc kubenswrapper[4779]: I0320 15:48:51.123147 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87fed41c-5259-4a97-9a57-72cab891f2c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "87fed41c-5259-4a97-9a57-72cab891f2c6" (UID: "87fed41c-5259-4a97-9a57-72cab891f2c6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:48:51 crc kubenswrapper[4779]: I0320 15:48:51.128978 4779 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87fed41c-5259-4a97-9a57-72cab891f2c6-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:51 crc kubenswrapper[4779]: I0320 15:48:51.129009 4779 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87fed41c-5259-4a97-9a57-72cab891f2c6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:51 crc kubenswrapper[4779]: I0320 15:48:51.129020 4779 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87fed41c-5259-4a97-9a57-72cab891f2c6-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:51 crc kubenswrapper[4779]: I0320 15:48:51.129042 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87fed41c-5259-4a97-9a57-72cab891f2c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:51 crc kubenswrapper[4779]: I0320 15:48:51.129051 4779 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87fed41c-5259-4a97-9a57-72cab891f2c6-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:51 crc kubenswrapper[4779]: I0320 15:48:51.129095 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzsf6\" (UniqueName: \"kubernetes.io/projected/87fed41c-5259-4a97-9a57-72cab891f2c6-kube-api-access-tzsf6\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:51 crc kubenswrapper[4779]: I0320 15:48:51.171419 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87fed41c-5259-4a97-9a57-72cab891f2c6-config-data" (OuterVolumeSpecName: "config-data") pod "87fed41c-5259-4a97-9a57-72cab891f2c6" (UID: "87fed41c-5259-4a97-9a57-72cab891f2c6"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:48:51 crc kubenswrapper[4779]: I0320 15:48:51.230887 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87fed41c-5259-4a97-9a57-72cab891f2c6-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:48:51 crc kubenswrapper[4779]: I0320 15:48:51.976197 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87fed41c-5259-4a97-9a57-72cab891f2c6","Type":"ContainerDied","Data":"9a7f3d91c4b5509984cf04a79760b0e26020d4b7dd77471b8dbe1327c2ac8fb8"} Mar 20 15:48:51 crc kubenswrapper[4779]: I0320 15:48:51.976242 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:48:51 crc kubenswrapper[4779]: I0320 15:48:51.976274 4779 scope.go:117] "RemoveContainer" containerID="edc89d1b74ad2b786402beda3335225dcb2994d707bb64bf54fa87bdac037226" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.002360 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.020984 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.026674 4779 scope.go:117] "RemoveContainer" containerID="c21a7a9c12967adc9cfc407de35cfb107a6a5acf6360931bef29e38c8d8ae78f" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.036244 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:48:52 crc kubenswrapper[4779]: E0320 15:48:52.036687 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87fed41c-5259-4a97-9a57-72cab891f2c6" containerName="ceilometer-central-agent" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.036710 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="87fed41c-5259-4a97-9a57-72cab891f2c6" 
containerName="ceilometer-central-agent" Mar 20 15:48:52 crc kubenswrapper[4779]: E0320 15:48:52.036728 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87fed41c-5259-4a97-9a57-72cab891f2c6" containerName="sg-core" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.036738 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="87fed41c-5259-4a97-9a57-72cab891f2c6" containerName="sg-core" Mar 20 15:48:52 crc kubenswrapper[4779]: E0320 15:48:52.036770 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87fed41c-5259-4a97-9a57-72cab891f2c6" containerName="ceilometer-notification-agent" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.036779 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="87fed41c-5259-4a97-9a57-72cab891f2c6" containerName="ceilometer-notification-agent" Mar 20 15:48:52 crc kubenswrapper[4779]: E0320 15:48:52.036805 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87fed41c-5259-4a97-9a57-72cab891f2c6" containerName="proxy-httpd" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.036813 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="87fed41c-5259-4a97-9a57-72cab891f2c6" containerName="proxy-httpd" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.037056 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="87fed41c-5259-4a97-9a57-72cab891f2c6" containerName="ceilometer-notification-agent" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.037087 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="87fed41c-5259-4a97-9a57-72cab891f2c6" containerName="ceilometer-central-agent" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.037124 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="87fed41c-5259-4a97-9a57-72cab891f2c6" containerName="proxy-httpd" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.037141 4779 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="87fed41c-5259-4a97-9a57-72cab891f2c6" containerName="sg-core" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.039337 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.042672 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.042911 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.042949 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.057232 4779 scope.go:117] "RemoveContainer" containerID="045f64d9d747c935724738fb6e73cd70b20734787db2e7f85c3e31c75ff95c45" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.061852 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.100121 4779 scope.go:117] "RemoveContainer" containerID="460b3950fe54e1bf7b52711deec65bf831905d913a21ac9016c3781222c4138c" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.151350 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6d5528f-e060-40ac-91c7-ac53aef84cb5-log-httpd\") pod \"ceilometer-0\" (UID: \"f6d5528f-e060-40ac-91c7-ac53aef84cb5\") " pod="openstack/ceilometer-0" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.151691 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6d5528f-e060-40ac-91c7-ac53aef84cb5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f6d5528f-e060-40ac-91c7-ac53aef84cb5\") " pod="openstack/ceilometer-0" Mar 
20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.151850 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6d5528f-e060-40ac-91c7-ac53aef84cb5-run-httpd\") pod \"ceilometer-0\" (UID: \"f6d5528f-e060-40ac-91c7-ac53aef84cb5\") " pod="openstack/ceilometer-0" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.151987 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6d5528f-e060-40ac-91c7-ac53aef84cb5-scripts\") pod \"ceilometer-0\" (UID: \"f6d5528f-e060-40ac-91c7-ac53aef84cb5\") " pod="openstack/ceilometer-0" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.152146 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6d5528f-e060-40ac-91c7-ac53aef84cb5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f6d5528f-e060-40ac-91c7-ac53aef84cb5\") " pod="openstack/ceilometer-0" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.152342 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5pfp\" (UniqueName: \"kubernetes.io/projected/f6d5528f-e060-40ac-91c7-ac53aef84cb5-kube-api-access-j5pfp\") pod \"ceilometer-0\" (UID: \"f6d5528f-e060-40ac-91c7-ac53aef84cb5\") " pod="openstack/ceilometer-0" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.152486 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6d5528f-e060-40ac-91c7-ac53aef84cb5-config-data\") pod \"ceilometer-0\" (UID: \"f6d5528f-e060-40ac-91c7-ac53aef84cb5\") " pod="openstack/ceilometer-0" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.152614 4779 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6d5528f-e060-40ac-91c7-ac53aef84cb5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f6d5528f-e060-40ac-91c7-ac53aef84cb5\") " pod="openstack/ceilometer-0" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.254471 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6d5528f-e060-40ac-91c7-ac53aef84cb5-run-httpd\") pod \"ceilometer-0\" (UID: \"f6d5528f-e060-40ac-91c7-ac53aef84cb5\") " pod="openstack/ceilometer-0" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.254530 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6d5528f-e060-40ac-91c7-ac53aef84cb5-scripts\") pod \"ceilometer-0\" (UID: \"f6d5528f-e060-40ac-91c7-ac53aef84cb5\") " pod="openstack/ceilometer-0" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.254569 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6d5528f-e060-40ac-91c7-ac53aef84cb5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f6d5528f-e060-40ac-91c7-ac53aef84cb5\") " pod="openstack/ceilometer-0" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.254587 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5pfp\" (UniqueName: \"kubernetes.io/projected/f6d5528f-e060-40ac-91c7-ac53aef84cb5-kube-api-access-j5pfp\") pod \"ceilometer-0\" (UID: \"f6d5528f-e060-40ac-91c7-ac53aef84cb5\") " pod="openstack/ceilometer-0" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.254620 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6d5528f-e060-40ac-91c7-ac53aef84cb5-config-data\") pod \"ceilometer-0\" (UID: 
\"f6d5528f-e060-40ac-91c7-ac53aef84cb5\") " pod="openstack/ceilometer-0" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.254638 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6d5528f-e060-40ac-91c7-ac53aef84cb5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f6d5528f-e060-40ac-91c7-ac53aef84cb5\") " pod="openstack/ceilometer-0" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.254672 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6d5528f-e060-40ac-91c7-ac53aef84cb5-log-httpd\") pod \"ceilometer-0\" (UID: \"f6d5528f-e060-40ac-91c7-ac53aef84cb5\") " pod="openstack/ceilometer-0" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.254729 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6d5528f-e060-40ac-91c7-ac53aef84cb5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f6d5528f-e060-40ac-91c7-ac53aef84cb5\") " pod="openstack/ceilometer-0" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.255847 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6d5528f-e060-40ac-91c7-ac53aef84cb5-log-httpd\") pod \"ceilometer-0\" (UID: \"f6d5528f-e060-40ac-91c7-ac53aef84cb5\") " pod="openstack/ceilometer-0" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.256300 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6d5528f-e060-40ac-91c7-ac53aef84cb5-run-httpd\") pod \"ceilometer-0\" (UID: \"f6d5528f-e060-40ac-91c7-ac53aef84cb5\") " pod="openstack/ceilometer-0" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.259375 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f6d5528f-e060-40ac-91c7-ac53aef84cb5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f6d5528f-e060-40ac-91c7-ac53aef84cb5\") " pod="openstack/ceilometer-0" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.259989 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6d5528f-e060-40ac-91c7-ac53aef84cb5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f6d5528f-e060-40ac-91c7-ac53aef84cb5\") " pod="openstack/ceilometer-0" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.260081 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6d5528f-e060-40ac-91c7-ac53aef84cb5-config-data\") pod \"ceilometer-0\" (UID: \"f6d5528f-e060-40ac-91c7-ac53aef84cb5\") " pod="openstack/ceilometer-0" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.260883 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6d5528f-e060-40ac-91c7-ac53aef84cb5-scripts\") pod \"ceilometer-0\" (UID: \"f6d5528f-e060-40ac-91c7-ac53aef84cb5\") " pod="openstack/ceilometer-0" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.262708 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6d5528f-e060-40ac-91c7-ac53aef84cb5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f6d5528f-e060-40ac-91c7-ac53aef84cb5\") " pod="openstack/ceilometer-0" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.288710 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5pfp\" (UniqueName: \"kubernetes.io/projected/f6d5528f-e060-40ac-91c7-ac53aef84cb5-kube-api-access-j5pfp\") pod \"ceilometer-0\" (UID: \"f6d5528f-e060-40ac-91c7-ac53aef84cb5\") " pod="openstack/ceilometer-0" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.371156 4779 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.896542 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:48:52 crc kubenswrapper[4779]: W0320 15:48:52.908079 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6d5528f_e060_40ac_91c7_ac53aef84cb5.slice/crio-7ff1c6d5829368ed31b8312a695de0a8cdaf4fb95a95ac565c674ca6ea2113c6 WatchSource:0}: Error finding container 7ff1c6d5829368ed31b8312a695de0a8cdaf4fb95a95ac565c674ca6ea2113c6: Status 404 returned error can't find the container with id 7ff1c6d5829368ed31b8312a695de0a8cdaf4fb95a95ac565c674ca6ea2113c6 Mar 20 15:48:52 crc kubenswrapper[4779]: I0320 15:48:52.987250 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6d5528f-e060-40ac-91c7-ac53aef84cb5","Type":"ContainerStarted","Data":"7ff1c6d5829368ed31b8312a695de0a8cdaf4fb95a95ac565c674ca6ea2113c6"} Mar 20 15:48:53 crc kubenswrapper[4779]: I0320 15:48:53.819913 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87fed41c-5259-4a97-9a57-72cab891f2c6" path="/var/lib/kubelet/pods/87fed41c-5259-4a97-9a57-72cab891f2c6/volumes" Mar 20 15:48:53 crc kubenswrapper[4779]: I0320 15:48:53.997527 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6d5528f-e060-40ac-91c7-ac53aef84cb5","Type":"ContainerStarted","Data":"279f27a789b6b5346fcc818427732fd9cd5ae618e7f9f2aa49e5633946f9448f"} Mar 20 15:48:54 crc kubenswrapper[4779]: I0320 15:48:54.400755 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 15:48:54 crc kubenswrapper[4779]: I0320 15:48:54.525243 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 15:48:55 crc kubenswrapper[4779]: I0320 
15:48:55.010018 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6d5528f-e060-40ac-91c7-ac53aef84cb5","Type":"ContainerStarted","Data":"6d73306b47a4d1e88dbe7293b2780751648f5078d0e1c13814fa87294032d1ee"} Mar 20 15:48:56 crc kubenswrapper[4779]: I0320 15:48:56.024490 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6d5528f-e060-40ac-91c7-ac53aef84cb5","Type":"ContainerStarted","Data":"673dea264c94c989735943662ca1272138b88f6c5d07d497a806d81b9d76d7d2"} Mar 20 15:48:58 crc kubenswrapper[4779]: I0320 15:48:58.045875 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6d5528f-e060-40ac-91c7-ac53aef84cb5","Type":"ContainerStarted","Data":"2e92fda4b9cb0d079c0522eae0a1d214040e0bb4e88cfcd628aba1d125511faa"} Mar 20 15:48:58 crc kubenswrapper[4779]: I0320 15:48:58.046203 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 15:48:58 crc kubenswrapper[4779]: I0320 15:48:58.071705 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.499757376 podStartE2EDuration="6.071687353s" podCreationTimestamp="2026-03-20 15:48:52 +0000 UTC" firstStartedPulling="2026-03-20 15:48:52.913794175 +0000 UTC m=+1549.876309975" lastFinishedPulling="2026-03-20 15:48:57.485724152 +0000 UTC m=+1554.448239952" observedRunningTime="2026-03-20 15:48:58.063883544 +0000 UTC m=+1555.026399344" watchObservedRunningTime="2026-03-20 15:48:58.071687353 +0000 UTC m=+1555.034203153" Mar 20 15:48:59 crc kubenswrapper[4779]: I0320 15:48:59.069180 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="f74af0a5-e3c3-4569-bece-db2e25e9b79d" containerName="rabbitmq" containerID="cri-o://b374aae4b5a56f491bf3eec7588561fcc2601539dc63aefc7fa278f84ba92465" gracePeriod=604796 Mar 20 15:48:59 crc 
kubenswrapper[4779]: I0320 15:48:59.132022 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="3016debc-9603-417f-8ff1-6fd3934cd17e" containerName="rabbitmq" containerID="cri-o://8eb30a1fc4925729d4962c90b33f01b59c8d4f9bf62ea53ba76d5ee745ddc817" gracePeriod=604796 Mar 20 15:48:59 crc kubenswrapper[4779]: I0320 15:48:59.390530 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 20 15:49:05 crc kubenswrapper[4779]: E0320 15:49:05.582497 4779 kubelet_node_status.go:756] "Failed to set some node status fields" err="failed to validate nodeIP: route ip+net: no such network interface" node="crc" Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.771008 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.776274 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.931798 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3016debc-9603-417f-8ff1-6fd3934cd17e-erlang-cookie-secret\") pod \"3016debc-9603-417f-8ff1-6fd3934cd17e\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.932244 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3016debc-9603-417f-8ff1-6fd3934cd17e-rabbitmq-erlang-cookie\") pod \"3016debc-9603-417f-8ff1-6fd3934cd17e\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.932299 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3016debc-9603-417f-8ff1-6fd3934cd17e-pod-info\") pod \"3016debc-9603-417f-8ff1-6fd3934cd17e\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.932330 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f74af0a5-e3c3-4569-bece-db2e25e9b79d-rabbitmq-confd\") pod \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.932360 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3016debc-9603-417f-8ff1-6fd3934cd17e-plugins-conf\") pod \"3016debc-9603-417f-8ff1-6fd3934cd17e\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.932382 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f74af0a5-e3c3-4569-bece-db2e25e9b79d-erlang-cookie-secret\") pod \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.932431 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.932462 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f74af0a5-e3c3-4569-bece-db2e25e9b79d-config-data\") pod \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.932524 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f74af0a5-e3c3-4569-bece-db2e25e9b79d-server-conf\") pod \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.932566 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f74af0a5-e3c3-4569-bece-db2e25e9b79d-pod-info\") pod \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.932596 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3016debc-9603-417f-8ff1-6fd3934cd17e-server-conf\") pod \"3016debc-9603-417f-8ff1-6fd3934cd17e\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.932620 4779 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3016debc-9603-417f-8ff1-6fd3934cd17e-rabbitmq-confd\") pod \"3016debc-9603-417f-8ff1-6fd3934cd17e\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.932640 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f74af0a5-e3c3-4569-bece-db2e25e9b79d-rabbitmq-tls\") pod \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.932665 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f74af0a5-e3c3-4569-bece-db2e25e9b79d-rabbitmq-erlang-cookie\") pod \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.932699 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"3016debc-9603-417f-8ff1-6fd3934cd17e\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.932739 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flbbj\" (UniqueName: \"kubernetes.io/projected/3016debc-9603-417f-8ff1-6fd3934cd17e-kube-api-access-flbbj\") pod \"3016debc-9603-417f-8ff1-6fd3934cd17e\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.932764 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f74af0a5-e3c3-4569-bece-db2e25e9b79d-rabbitmq-plugins\") pod 
\"f74af0a5-e3c3-4569-bece-db2e25e9b79d\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.932789 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3016debc-9603-417f-8ff1-6fd3934cd17e-config-data\") pod \"3016debc-9603-417f-8ff1-6fd3934cd17e\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.932845 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3016debc-9603-417f-8ff1-6fd3934cd17e-rabbitmq-tls\") pod \"3016debc-9603-417f-8ff1-6fd3934cd17e\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.932885 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nx7n5\" (UniqueName: \"kubernetes.io/projected/f74af0a5-e3c3-4569-bece-db2e25e9b79d-kube-api-access-nx7n5\") pod \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.932930 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3016debc-9603-417f-8ff1-6fd3934cd17e-rabbitmq-plugins\") pod \"3016debc-9603-417f-8ff1-6fd3934cd17e\" (UID: \"3016debc-9603-417f-8ff1-6fd3934cd17e\") " Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.932965 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f74af0a5-e3c3-4569-bece-db2e25e9b79d-plugins-conf\") pod \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\" (UID: \"f74af0a5-e3c3-4569-bece-db2e25e9b79d\") " Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.935532 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/f74af0a5-e3c3-4569-bece-db2e25e9b79d-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "f74af0a5-e3c3-4569-bece-db2e25e9b79d" (UID: "f74af0a5-e3c3-4569-bece-db2e25e9b79d"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.937865 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3016debc-9603-417f-8ff1-6fd3934cd17e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "3016debc-9603-417f-8ff1-6fd3934cd17e" (UID: "3016debc-9603-417f-8ff1-6fd3934cd17e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.939052 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f74af0a5-e3c3-4569-bece-db2e25e9b79d-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "f74af0a5-e3c3-4569-bece-db2e25e9b79d" (UID: "f74af0a5-e3c3-4569-bece-db2e25e9b79d"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.941417 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3016debc-9603-417f-8ff1-6fd3934cd17e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "3016debc-9603-417f-8ff1-6fd3934cd17e" (UID: "3016debc-9603-417f-8ff1-6fd3934cd17e"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.942011 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3016debc-9603-417f-8ff1-6fd3934cd17e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "3016debc-9603-417f-8ff1-6fd3934cd17e" (UID: "3016debc-9603-417f-8ff1-6fd3934cd17e"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.942774 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f74af0a5-e3c3-4569-bece-db2e25e9b79d-kube-api-access-nx7n5" (OuterVolumeSpecName: "kube-api-access-nx7n5") pod "f74af0a5-e3c3-4569-bece-db2e25e9b79d" (UID: "f74af0a5-e3c3-4569-bece-db2e25e9b79d"). InnerVolumeSpecName "kube-api-access-nx7n5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.943357 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3016debc-9603-417f-8ff1-6fd3934cd17e-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "3016debc-9603-417f-8ff1-6fd3934cd17e" (UID: "3016debc-9603-417f-8ff1-6fd3934cd17e"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.944206 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f74af0a5-e3c3-4569-bece-db2e25e9b79d-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "f74af0a5-e3c3-4569-bece-db2e25e9b79d" (UID: "f74af0a5-e3c3-4569-bece-db2e25e9b79d"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.944609 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f74af0a5-e3c3-4569-bece-db2e25e9b79d-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "f74af0a5-e3c3-4569-bece-db2e25e9b79d" (UID: "f74af0a5-e3c3-4569-bece-db2e25e9b79d"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.946685 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3016debc-9603-417f-8ff1-6fd3934cd17e-kube-api-access-flbbj" (OuterVolumeSpecName: "kube-api-access-flbbj") pod "3016debc-9603-417f-8ff1-6fd3934cd17e" (UID: "3016debc-9603-417f-8ff1-6fd3934cd17e"). InnerVolumeSpecName "kube-api-access-flbbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.946717 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "3016debc-9603-417f-8ff1-6fd3934cd17e" (UID: "3016debc-9603-417f-8ff1-6fd3934cd17e"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.950416 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f74af0a5-e3c3-4569-bece-db2e25e9b79d-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "f74af0a5-e3c3-4569-bece-db2e25e9b79d" (UID: "f74af0a5-e3c3-4569-bece-db2e25e9b79d"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.959275 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3016debc-9603-417f-8ff1-6fd3934cd17e-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "3016debc-9603-417f-8ff1-6fd3934cd17e" (UID: "3016debc-9603-417f-8ff1-6fd3934cd17e"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.960605 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/f74af0a5-e3c3-4569-bece-db2e25e9b79d-pod-info" (OuterVolumeSpecName: "pod-info") pod "f74af0a5-e3c3-4569-bece-db2e25e9b79d" (UID: "f74af0a5-e3c3-4569-bece-db2e25e9b79d"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.961669 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/3016debc-9603-417f-8ff1-6fd3934cd17e-pod-info" (OuterVolumeSpecName: "pod-info") pod "3016debc-9603-417f-8ff1-6fd3934cd17e" (UID: "3016debc-9603-417f-8ff1-6fd3934cd17e"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.966294 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "f74af0a5-e3c3-4569-bece-db2e25e9b79d" (UID: "f74af0a5-e3c3-4569-bece-db2e25e9b79d"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.982213 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f74af0a5-e3c3-4569-bece-db2e25e9b79d-config-data" (OuterVolumeSpecName: "config-data") pod "f74af0a5-e3c3-4569-bece-db2e25e9b79d" (UID: "f74af0a5-e3c3-4569-bece-db2e25e9b79d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:49:05 crc kubenswrapper[4779]: I0320 15:49:05.995012 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3016debc-9603-417f-8ff1-6fd3934cd17e-config-data" (OuterVolumeSpecName: "config-data") pod "3016debc-9603-417f-8ff1-6fd3934cd17e" (UID: "3016debc-9603-417f-8ff1-6fd3934cd17e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.027971 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f74af0a5-e3c3-4569-bece-db2e25e9b79d-server-conf" (OuterVolumeSpecName: "server-conf") pod "f74af0a5-e3c3-4569-bece-db2e25e9b79d" (UID: "f74af0a5-e3c3-4569-bece-db2e25e9b79d"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.036077 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3016debc-9603-417f-8ff1-6fd3934cd17e-server-conf" (OuterVolumeSpecName: "server-conf") pod "3016debc-9603-417f-8ff1-6fd3934cd17e" (UID: "3016debc-9603-417f-8ff1-6fd3934cd17e"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.037472 4779 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3016debc-9603-417f-8ff1-6fd3934cd17e-pod-info\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.037492 4779 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3016debc-9603-417f-8ff1-6fd3934cd17e-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.037501 4779 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f74af0a5-e3c3-4569-bece-db2e25e9b79d-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.037527 4779 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.037536 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f74af0a5-e3c3-4569-bece-db2e25e9b79d-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.037546 4779 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f74af0a5-e3c3-4569-bece-db2e25e9b79d-server-conf\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.037554 4779 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f74af0a5-e3c3-4569-bece-db2e25e9b79d-pod-info\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.037561 4779 reconciler_common.go:293] "Volume 
detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3016debc-9603-417f-8ff1-6fd3934cd17e-server-conf\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.037570 4779 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f74af0a5-e3c3-4569-bece-db2e25e9b79d-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.037578 4779 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f74af0a5-e3c3-4569-bece-db2e25e9b79d-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.037590 4779 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.037599 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flbbj\" (UniqueName: \"kubernetes.io/projected/3016debc-9603-417f-8ff1-6fd3934cd17e-kube-api-access-flbbj\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.037608 4779 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f74af0a5-e3c3-4569-bece-db2e25e9b79d-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.037616 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3016debc-9603-417f-8ff1-6fd3934cd17e-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.037623 4779 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/3016debc-9603-417f-8ff1-6fd3934cd17e-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.037631 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nx7n5\" (UniqueName: \"kubernetes.io/projected/f74af0a5-e3c3-4569-bece-db2e25e9b79d-kube-api-access-nx7n5\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.037639 4779 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3016debc-9603-417f-8ff1-6fd3934cd17e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.037646 4779 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f74af0a5-e3c3-4569-bece-db2e25e9b79d-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.037656 4779 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3016debc-9603-417f-8ff1-6fd3934cd17e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.037666 4779 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3016debc-9603-417f-8ff1-6fd3934cd17e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.070639 4779 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.078506 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f74af0a5-e3c3-4569-bece-db2e25e9b79d-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod 
"f74af0a5-e3c3-4569-bece-db2e25e9b79d" (UID: "f74af0a5-e3c3-4569-bece-db2e25e9b79d"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.097696 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3016debc-9603-417f-8ff1-6fd3934cd17e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "3016debc-9603-417f-8ff1-6fd3934cd17e" (UID: "3016debc-9603-417f-8ff1-6fd3934cd17e"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.101133 4779 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.131919 4779 generic.go:334] "Generic (PLEG): container finished" podID="3016debc-9603-417f-8ff1-6fd3934cd17e" containerID="8eb30a1fc4925729d4962c90b33f01b59c8d4f9bf62ea53ba76d5ee745ddc817" exitCode=0 Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.131960 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.131990 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3016debc-9603-417f-8ff1-6fd3934cd17e","Type":"ContainerDied","Data":"8eb30a1fc4925729d4962c90b33f01b59c8d4f9bf62ea53ba76d5ee745ddc817"} Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.132064 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3016debc-9603-417f-8ff1-6fd3934cd17e","Type":"ContainerDied","Data":"4e62fbafb643ea8b3a0486879797d57107a30191f6fbbcbcf6812570aaf83e42"} Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.132120 4779 scope.go:117] "RemoveContainer" containerID="8eb30a1fc4925729d4962c90b33f01b59c8d4f9bf62ea53ba76d5ee745ddc817" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.135304 4779 generic.go:334] "Generic (PLEG): container finished" podID="f74af0a5-e3c3-4569-bece-db2e25e9b79d" containerID="b374aae4b5a56f491bf3eec7588561fcc2601539dc63aefc7fa278f84ba92465" exitCode=0 Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.135346 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f74af0a5-e3c3-4569-bece-db2e25e9b79d","Type":"ContainerDied","Data":"b374aae4b5a56f491bf3eec7588561fcc2601539dc63aefc7fa278f84ba92465"} Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.135372 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f74af0a5-e3c3-4569-bece-db2e25e9b79d","Type":"ContainerDied","Data":"27cd622489a45c7dd259e27831a13dbd7bf4ffa9c19ccda9387d040940bd939c"} Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.135376 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.145697 4779 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f74af0a5-e3c3-4569-bece-db2e25e9b79d-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.145730 4779 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.145768 4779 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3016debc-9603-417f-8ff1-6fd3934cd17e-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.145782 4779 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.171786 4779 scope.go:117] "RemoveContainer" containerID="bc07a7145cf026d7ad7cfc471274cba8f63e2f29f37bd01537575ed3f1d605c5" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.215330 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.268343 4779 scope.go:117] "RemoveContainer" containerID="8eb30a1fc4925729d4962c90b33f01b59c8d4f9bf62ea53ba76d5ee745ddc817" Mar 20 15:49:06 crc kubenswrapper[4779]: E0320 15:49:06.275765 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8eb30a1fc4925729d4962c90b33f01b59c8d4f9bf62ea53ba76d5ee745ddc817\": container with ID starting with 8eb30a1fc4925729d4962c90b33f01b59c8d4f9bf62ea53ba76d5ee745ddc817 not found: ID does not exist" 
containerID="8eb30a1fc4925729d4962c90b33f01b59c8d4f9bf62ea53ba76d5ee745ddc817" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.275823 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eb30a1fc4925729d4962c90b33f01b59c8d4f9bf62ea53ba76d5ee745ddc817"} err="failed to get container status \"8eb30a1fc4925729d4962c90b33f01b59c8d4f9bf62ea53ba76d5ee745ddc817\": rpc error: code = NotFound desc = could not find container \"8eb30a1fc4925729d4962c90b33f01b59c8d4f9bf62ea53ba76d5ee745ddc817\": container with ID starting with 8eb30a1fc4925729d4962c90b33f01b59c8d4f9bf62ea53ba76d5ee745ddc817 not found: ID does not exist" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.275863 4779 scope.go:117] "RemoveContainer" containerID="bc07a7145cf026d7ad7cfc471274cba8f63e2f29f37bd01537575ed3f1d605c5" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.275954 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 15:49:06 crc kubenswrapper[4779]: E0320 15:49:06.277233 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc07a7145cf026d7ad7cfc471274cba8f63e2f29f37bd01537575ed3f1d605c5\": container with ID starting with bc07a7145cf026d7ad7cfc471274cba8f63e2f29f37bd01537575ed3f1d605c5 not found: ID does not exist" containerID="bc07a7145cf026d7ad7cfc471274cba8f63e2f29f37bd01537575ed3f1d605c5" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.277276 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc07a7145cf026d7ad7cfc471274cba8f63e2f29f37bd01537575ed3f1d605c5"} err="failed to get container status \"bc07a7145cf026d7ad7cfc471274cba8f63e2f29f37bd01537575ed3f1d605c5\": rpc error: code = NotFound desc = could not find container \"bc07a7145cf026d7ad7cfc471274cba8f63e2f29f37bd01537575ed3f1d605c5\": container with ID starting with 
bc07a7145cf026d7ad7cfc471274cba8f63e2f29f37bd01537575ed3f1d605c5 not found: ID does not exist" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.277302 4779 scope.go:117] "RemoveContainer" containerID="b374aae4b5a56f491bf3eec7588561fcc2601539dc63aefc7fa278f84ba92465" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.314322 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 15:49:06 crc kubenswrapper[4779]: E0320 15:49:06.314721 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f74af0a5-e3c3-4569-bece-db2e25e9b79d" containerName="setup-container" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.314737 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="f74af0a5-e3c3-4569-bece-db2e25e9b79d" containerName="setup-container" Mar 20 15:49:06 crc kubenswrapper[4779]: E0320 15:49:06.314758 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3016debc-9603-417f-8ff1-6fd3934cd17e" containerName="rabbitmq" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.314765 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="3016debc-9603-417f-8ff1-6fd3934cd17e" containerName="rabbitmq" Mar 20 15:49:06 crc kubenswrapper[4779]: E0320 15:49:06.314795 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f74af0a5-e3c3-4569-bece-db2e25e9b79d" containerName="rabbitmq" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.314801 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="f74af0a5-e3c3-4569-bece-db2e25e9b79d" containerName="rabbitmq" Mar 20 15:49:06 crc kubenswrapper[4779]: E0320 15:49:06.314812 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3016debc-9603-417f-8ff1-6fd3934cd17e" containerName="setup-container" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.314819 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="3016debc-9603-417f-8ff1-6fd3934cd17e" containerName="setup-container" Mar 20 
15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.314990 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="3016debc-9603-417f-8ff1-6fd3934cd17e" containerName="rabbitmq" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.315000 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="f74af0a5-e3c3-4569-bece-db2e25e9b79d" containerName="rabbitmq" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.318404 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.322619 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.322760 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.322932 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.323042 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.323381 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.323593 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-5wld4" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.323713 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.340551 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 
15:49:06.353925 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.365948 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.378325 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.380169 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.386317 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.386519 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.386632 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.386895 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.387046 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.387189 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-7p9wv" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.387730 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.387834 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.409283 4779 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-rgbp2"] Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.410881 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-rgbp2" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.416213 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-rgbp2"] Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.420839 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.441302 4779 scope.go:117] "RemoveContainer" containerID="3c579d0a88bc2e1597d50784bc3a129d5a5fa3b225de05abb1902d00d8e6ecfc" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.461954 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a91eab41-69fe-44e7-a239-6956a6b18dd8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a91eab41-69fe-44e7-a239-6956a6b18dd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.462004 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a91eab41-69fe-44e7-a239-6956a6b18dd8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a91eab41-69fe-44e7-a239-6956a6b18dd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.462038 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a91eab41-69fe-44e7-a239-6956a6b18dd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.462130 
4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a91eab41-69fe-44e7-a239-6956a6b18dd8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a91eab41-69fe-44e7-a239-6956a6b18dd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.462166 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a91eab41-69fe-44e7-a239-6956a6b18dd8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a91eab41-69fe-44e7-a239-6956a6b18dd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.462188 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a91eab41-69fe-44e7-a239-6956a6b18dd8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a91eab41-69fe-44e7-a239-6956a6b18dd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.462213 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a91eab41-69fe-44e7-a239-6956a6b18dd8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a91eab41-69fe-44e7-a239-6956a6b18dd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.462233 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94ld8\" (UniqueName: \"kubernetes.io/projected/a91eab41-69fe-44e7-a239-6956a6b18dd8-kube-api-access-94ld8\") pod \"rabbitmq-cell1-server-0\" (UID: \"a91eab41-69fe-44e7-a239-6956a6b18dd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:06 crc 
kubenswrapper[4779]: I0320 15:49:06.462282 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a91eab41-69fe-44e7-a239-6956a6b18dd8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a91eab41-69fe-44e7-a239-6956a6b18dd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.462306 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a91eab41-69fe-44e7-a239-6956a6b18dd8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a91eab41-69fe-44e7-a239-6956a6b18dd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.462367 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a91eab41-69fe-44e7-a239-6956a6b18dd8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a91eab41-69fe-44e7-a239-6956a6b18dd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.516363 4779 scope.go:117] "RemoveContainer" containerID="b374aae4b5a56f491bf3eec7588561fcc2601539dc63aefc7fa278f84ba92465" Mar 20 15:49:06 crc kubenswrapper[4779]: E0320 15:49:06.517812 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b374aae4b5a56f491bf3eec7588561fcc2601539dc63aefc7fa278f84ba92465\": container with ID starting with b374aae4b5a56f491bf3eec7588561fcc2601539dc63aefc7fa278f84ba92465 not found: ID does not exist" containerID="b374aae4b5a56f491bf3eec7588561fcc2601539dc63aefc7fa278f84ba92465" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.517854 4779 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b374aae4b5a56f491bf3eec7588561fcc2601539dc63aefc7fa278f84ba92465"} err="failed to get container status \"b374aae4b5a56f491bf3eec7588561fcc2601539dc63aefc7fa278f84ba92465\": rpc error: code = NotFound desc = could not find container \"b374aae4b5a56f491bf3eec7588561fcc2601539dc63aefc7fa278f84ba92465\": container with ID starting with b374aae4b5a56f491bf3eec7588561fcc2601539dc63aefc7fa278f84ba92465 not found: ID does not exist" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.517880 4779 scope.go:117] "RemoveContainer" containerID="3c579d0a88bc2e1597d50784bc3a129d5a5fa3b225de05abb1902d00d8e6ecfc" Mar 20 15:49:06 crc kubenswrapper[4779]: E0320 15:49:06.518874 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c579d0a88bc2e1597d50784bc3a129d5a5fa3b225de05abb1902d00d8e6ecfc\": container with ID starting with 3c579d0a88bc2e1597d50784bc3a129d5a5fa3b225de05abb1902d00d8e6ecfc not found: ID does not exist" containerID="3c579d0a88bc2e1597d50784bc3a129d5a5fa3b225de05abb1902d00d8e6ecfc" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.518900 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c579d0a88bc2e1597d50784bc3a129d5a5fa3b225de05abb1902d00d8e6ecfc"} err="failed to get container status \"3c579d0a88bc2e1597d50784bc3a129d5a5fa3b225de05abb1902d00d8e6ecfc\": rpc error: code = NotFound desc = could not find container \"3c579d0a88bc2e1597d50784bc3a129d5a5fa3b225de05abb1902d00d8e6ecfc\": container with ID starting with 3c579d0a88bc2e1597d50784bc3a129d5a5fa3b225de05abb1902d00d8e6ecfc not found: ID does not exist" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.542382 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-rgbp2"] Mar 20 15:49:06 crc kubenswrapper[4779]: E0320 15:49:06.543475 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted 
volumes=[config dns-svc dns-swift-storage-0 kube-api-access-l8kkh openstack-edpm-ipam ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-67b789f86c-rgbp2" podUID="62b695a9-e815-4771-8dee-2d3ad2a98628" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.564220 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9bba48d4-65a1-44b2-b750-6a7f27613e63-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9bba48d4-65a1-44b2-b750-6a7f27613e63\") " pod="openstack/rabbitmq-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.564269 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9bba48d4-65a1-44b2-b750-6a7f27613e63-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9bba48d4-65a1-44b2-b750-6a7f27613e63\") " pod="openstack/rabbitmq-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.564318 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a91eab41-69fe-44e7-a239-6956a6b18dd8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a91eab41-69fe-44e7-a239-6956a6b18dd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.564339 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9bba48d4-65a1-44b2-b750-6a7f27613e63-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9bba48d4-65a1-44b2-b750-6a7f27613e63\") " pod="openstack/rabbitmq-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.564364 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" 
(UniqueName: \"kubernetes.io/secret/9bba48d4-65a1-44b2-b750-6a7f27613e63-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9bba48d4-65a1-44b2-b750-6a7f27613e63\") " pod="openstack/rabbitmq-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.564382 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a91eab41-69fe-44e7-a239-6956a6b18dd8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a91eab41-69fe-44e7-a239-6956a6b18dd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.564403 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/62b695a9-e815-4771-8dee-2d3ad2a98628-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-rgbp2\" (UID: \"62b695a9-e815-4771-8dee-2d3ad2a98628\") " pod="openstack/dnsmasq-dns-67b789f86c-rgbp2" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.564423 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a91eab41-69fe-44e7-a239-6956a6b18dd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.564460 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62b695a9-e815-4771-8dee-2d3ad2a98628-dns-svc\") pod \"dnsmasq-dns-67b789f86c-rgbp2\" (UID: \"62b695a9-e815-4771-8dee-2d3ad2a98628\") " pod="openstack/dnsmasq-dns-67b789f86c-rgbp2" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.564485 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/9bba48d4-65a1-44b2-b750-6a7f27613e63-config-data\") pod \"rabbitmq-server-0\" (UID: \"9bba48d4-65a1-44b2-b750-6a7f27613e63\") " pod="openstack/rabbitmq-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.564507 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a91eab41-69fe-44e7-a239-6956a6b18dd8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a91eab41-69fe-44e7-a239-6956a6b18dd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.564542 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a91eab41-69fe-44e7-a239-6956a6b18dd8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a91eab41-69fe-44e7-a239-6956a6b18dd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.564566 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a91eab41-69fe-44e7-a239-6956a6b18dd8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a91eab41-69fe-44e7-a239-6956a6b18dd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.564585 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a91eab41-69fe-44e7-a239-6956a6b18dd8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a91eab41-69fe-44e7-a239-6956a6b18dd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.564604 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94ld8\" (UniqueName: \"kubernetes.io/projected/a91eab41-69fe-44e7-a239-6956a6b18dd8-kube-api-access-94ld8\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"a91eab41-69fe-44e7-a239-6956a6b18dd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.564625 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8kkh\" (UniqueName: \"kubernetes.io/projected/62b695a9-e815-4771-8dee-2d3ad2a98628-kube-api-access-l8kkh\") pod \"dnsmasq-dns-67b789f86c-rgbp2\" (UID: \"62b695a9-e815-4771-8dee-2d3ad2a98628\") " pod="openstack/dnsmasq-dns-67b789f86c-rgbp2" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.564642 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9bba48d4-65a1-44b2-b750-6a7f27613e63-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9bba48d4-65a1-44b2-b750-6a7f27613e63\") " pod="openstack/rabbitmq-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.564661 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9bba48d4-65a1-44b2-b750-6a7f27613e63-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9bba48d4-65a1-44b2-b750-6a7f27613e63\") " pod="openstack/rabbitmq-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.564677 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62b695a9-e815-4771-8dee-2d3ad2a98628-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-rgbp2\" (UID: \"62b695a9-e815-4771-8dee-2d3ad2a98628\") " pod="openstack/dnsmasq-dns-67b789f86c-rgbp2" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.564698 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: 
\"9bba48d4-65a1-44b2-b750-6a7f27613e63\") " pod="openstack/rabbitmq-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.564713 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9bba48d4-65a1-44b2-b750-6a7f27613e63-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9bba48d4-65a1-44b2-b750-6a7f27613e63\") " pod="openstack/rabbitmq-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.564730 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95r6k\" (UniqueName: \"kubernetes.io/projected/9bba48d4-65a1-44b2-b750-6a7f27613e63-kube-api-access-95r6k\") pod \"rabbitmq-server-0\" (UID: \"9bba48d4-65a1-44b2-b750-6a7f27613e63\") " pod="openstack/rabbitmq-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.564750 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a91eab41-69fe-44e7-a239-6956a6b18dd8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a91eab41-69fe-44e7-a239-6956a6b18dd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.564771 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a91eab41-69fe-44e7-a239-6956a6b18dd8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a91eab41-69fe-44e7-a239-6956a6b18dd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.564794 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62b695a9-e815-4771-8dee-2d3ad2a98628-config\") pod \"dnsmasq-dns-67b789f86c-rgbp2\" (UID: \"62b695a9-e815-4771-8dee-2d3ad2a98628\") " 
pod="openstack/dnsmasq-dns-67b789f86c-rgbp2" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.564821 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/62b695a9-e815-4771-8dee-2d3ad2a98628-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-rgbp2\" (UID: \"62b695a9-e815-4771-8dee-2d3ad2a98628\") " pod="openstack/dnsmasq-dns-67b789f86c-rgbp2" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.564846 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9bba48d4-65a1-44b2-b750-6a7f27613e63-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9bba48d4-65a1-44b2-b750-6a7f27613e63\") " pod="openstack/rabbitmq-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.564878 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a91eab41-69fe-44e7-a239-6956a6b18dd8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a91eab41-69fe-44e7-a239-6956a6b18dd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.564899 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62b695a9-e815-4771-8dee-2d3ad2a98628-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-rgbp2\" (UID: \"62b695a9-e815-4771-8dee-2d3ad2a98628\") " pod="openstack/dnsmasq-dns-67b789f86c-rgbp2" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.566594 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a91eab41-69fe-44e7-a239-6956a6b18dd8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a91eab41-69fe-44e7-a239-6956a6b18dd8\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.566931 4779 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a91eab41-69fe-44e7-a239-6956a6b18dd8\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.568091 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a91eab41-69fe-44e7-a239-6956a6b18dd8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a91eab41-69fe-44e7-a239-6956a6b18dd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.568752 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a91eab41-69fe-44e7-a239-6956a6b18dd8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a91eab41-69fe-44e7-a239-6956a6b18dd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.569452 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a91eab41-69fe-44e7-a239-6956a6b18dd8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a91eab41-69fe-44e7-a239-6956a6b18dd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.570562 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a91eab41-69fe-44e7-a239-6956a6b18dd8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a91eab41-69fe-44e7-a239-6956a6b18dd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.573067 4779 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a91eab41-69fe-44e7-a239-6956a6b18dd8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a91eab41-69fe-44e7-a239-6956a6b18dd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.581656 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a91eab41-69fe-44e7-a239-6956a6b18dd8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a91eab41-69fe-44e7-a239-6956a6b18dd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.582155 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a91eab41-69fe-44e7-a239-6956a6b18dd8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a91eab41-69fe-44e7-a239-6956a6b18dd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.584365 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94ld8\" (UniqueName: \"kubernetes.io/projected/a91eab41-69fe-44e7-a239-6956a6b18dd8-kube-api-access-94ld8\") pod \"rabbitmq-cell1-server-0\" (UID: \"a91eab41-69fe-44e7-a239-6956a6b18dd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.586675 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a91eab41-69fe-44e7-a239-6956a6b18dd8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a91eab41-69fe-44e7-a239-6956a6b18dd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.607611 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd9f947b7-44qdw"] Mar 20 15:49:06 crc kubenswrapper[4779]: 
I0320 15:49:06.609720 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd9f947b7-44qdw" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.640096 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd9f947b7-44qdw"] Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.641813 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a91eab41-69fe-44e7-a239-6956a6b18dd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.668311 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9bba48d4-65a1-44b2-b750-6a7f27613e63-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9bba48d4-65a1-44b2-b750-6a7f27613e63\") " pod="openstack/rabbitmq-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.668405 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9bba48d4-65a1-44b2-b750-6a7f27613e63-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9bba48d4-65a1-44b2-b750-6a7f27613e63\") " pod="openstack/rabbitmq-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.668471 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/62b695a9-e815-4771-8dee-2d3ad2a98628-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-rgbp2\" (UID: \"62b695a9-e815-4771-8dee-2d3ad2a98628\") " pod="openstack/dnsmasq-dns-67b789f86c-rgbp2" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.668556 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/62b695a9-e815-4771-8dee-2d3ad2a98628-dns-svc\") pod \"dnsmasq-dns-67b789f86c-rgbp2\" (UID: \"62b695a9-e815-4771-8dee-2d3ad2a98628\") " pod="openstack/dnsmasq-dns-67b789f86c-rgbp2" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.668588 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9bba48d4-65a1-44b2-b750-6a7f27613e63-config-data\") pod \"rabbitmq-server-0\" (UID: \"9bba48d4-65a1-44b2-b750-6a7f27613e63\") " pod="openstack/rabbitmq-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.668708 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8kkh\" (UniqueName: \"kubernetes.io/projected/62b695a9-e815-4771-8dee-2d3ad2a98628-kube-api-access-l8kkh\") pod \"dnsmasq-dns-67b789f86c-rgbp2\" (UID: \"62b695a9-e815-4771-8dee-2d3ad2a98628\") " pod="openstack/dnsmasq-dns-67b789f86c-rgbp2" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.668894 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9bba48d4-65a1-44b2-b750-6a7f27613e63-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9bba48d4-65a1-44b2-b750-6a7f27613e63\") " pod="openstack/rabbitmq-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.668922 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9bba48d4-65a1-44b2-b750-6a7f27613e63-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9bba48d4-65a1-44b2-b750-6a7f27613e63\") " pod="openstack/rabbitmq-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.668940 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62b695a9-e815-4771-8dee-2d3ad2a98628-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-rgbp2\" (UID: 
\"62b695a9-e815-4771-8dee-2d3ad2a98628\") " pod="openstack/dnsmasq-dns-67b789f86c-rgbp2" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.668997 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"9bba48d4-65a1-44b2-b750-6a7f27613e63\") " pod="openstack/rabbitmq-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.669015 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9bba48d4-65a1-44b2-b750-6a7f27613e63-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9bba48d4-65a1-44b2-b750-6a7f27613e63\") " pod="openstack/rabbitmq-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.669062 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95r6k\" (UniqueName: \"kubernetes.io/projected/9bba48d4-65a1-44b2-b750-6a7f27613e63-kube-api-access-95r6k\") pod \"rabbitmq-server-0\" (UID: \"9bba48d4-65a1-44b2-b750-6a7f27613e63\") " pod="openstack/rabbitmq-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.669148 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62b695a9-e815-4771-8dee-2d3ad2a98628-config\") pod \"dnsmasq-dns-67b789f86c-rgbp2\" (UID: \"62b695a9-e815-4771-8dee-2d3ad2a98628\") " pod="openstack/dnsmasq-dns-67b789f86c-rgbp2" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.669235 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/62b695a9-e815-4771-8dee-2d3ad2a98628-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-rgbp2\" (UID: \"62b695a9-e815-4771-8dee-2d3ad2a98628\") " pod="openstack/dnsmasq-dns-67b789f86c-rgbp2" Mar 20 15:49:06 crc kubenswrapper[4779]: 
I0320 15:49:06.669257 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9bba48d4-65a1-44b2-b750-6a7f27613e63-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9bba48d4-65a1-44b2-b750-6a7f27613e63\") " pod="openstack/rabbitmq-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.669324 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62b695a9-e815-4771-8dee-2d3ad2a98628-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-rgbp2\" (UID: \"62b695a9-e815-4771-8dee-2d3ad2a98628\") " pod="openstack/dnsmasq-dns-67b789f86c-rgbp2" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.669402 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9bba48d4-65a1-44b2-b750-6a7f27613e63-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9bba48d4-65a1-44b2-b750-6a7f27613e63\") " pod="openstack/rabbitmq-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.669429 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9bba48d4-65a1-44b2-b750-6a7f27613e63-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9bba48d4-65a1-44b2-b750-6a7f27613e63\") " pod="openstack/rabbitmq-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.669448 4779 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"9bba48d4-65a1-44b2-b750-6a7f27613e63\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.669970 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/62b695a9-e815-4771-8dee-2d3ad2a98628-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-rgbp2\" (UID: \"62b695a9-e815-4771-8dee-2d3ad2a98628\") " pod="openstack/dnsmasq-dns-67b789f86c-rgbp2" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.670444 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9bba48d4-65a1-44b2-b750-6a7f27613e63-config-data\") pod \"rabbitmq-server-0\" (UID: \"9bba48d4-65a1-44b2-b750-6a7f27613e63\") " pod="openstack/rabbitmq-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.670599 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9bba48d4-65a1-44b2-b750-6a7f27613e63-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9bba48d4-65a1-44b2-b750-6a7f27613e63\") " pod="openstack/rabbitmq-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.671164 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9bba48d4-65a1-44b2-b750-6a7f27613e63-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9bba48d4-65a1-44b2-b750-6a7f27613e63\") " pod="openstack/rabbitmq-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.671836 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/62b695a9-e815-4771-8dee-2d3ad2a98628-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-rgbp2\" (UID: \"62b695a9-e815-4771-8dee-2d3ad2a98628\") " pod="openstack/dnsmasq-dns-67b789f86c-rgbp2" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.671868 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9bba48d4-65a1-44b2-b750-6a7f27613e63-erlang-cookie-secret\") pod \"rabbitmq-server-0\" 
(UID: \"9bba48d4-65a1-44b2-b750-6a7f27613e63\") " pod="openstack/rabbitmq-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.671930 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62b695a9-e815-4771-8dee-2d3ad2a98628-config\") pod \"dnsmasq-dns-67b789f86c-rgbp2\" (UID: \"62b695a9-e815-4771-8dee-2d3ad2a98628\") " pod="openstack/dnsmasq-dns-67b789f86c-rgbp2" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.671957 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62b695a9-e815-4771-8dee-2d3ad2a98628-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-rgbp2\" (UID: \"62b695a9-e815-4771-8dee-2d3ad2a98628\") " pod="openstack/dnsmasq-dns-67b789f86c-rgbp2" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.672862 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9bba48d4-65a1-44b2-b750-6a7f27613e63-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9bba48d4-65a1-44b2-b750-6a7f27613e63\") " pod="openstack/rabbitmq-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.673309 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62b695a9-e815-4771-8dee-2d3ad2a98628-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-rgbp2\" (UID: \"62b695a9-e815-4771-8dee-2d3ad2a98628\") " pod="openstack/dnsmasq-dns-67b789f86c-rgbp2" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.674948 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9bba48d4-65a1-44b2-b750-6a7f27613e63-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9bba48d4-65a1-44b2-b750-6a7f27613e63\") " pod="openstack/rabbitmq-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 
15:49:06.674974 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9bba48d4-65a1-44b2-b750-6a7f27613e63-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9bba48d4-65a1-44b2-b750-6a7f27613e63\") " pod="openstack/rabbitmq-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.676181 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9bba48d4-65a1-44b2-b750-6a7f27613e63-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9bba48d4-65a1-44b2-b750-6a7f27613e63\") " pod="openstack/rabbitmq-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.676422 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62b695a9-e815-4771-8dee-2d3ad2a98628-dns-svc\") pod \"dnsmasq-dns-67b789f86c-rgbp2\" (UID: \"62b695a9-e815-4771-8dee-2d3ad2a98628\") " pod="openstack/dnsmasq-dns-67b789f86c-rgbp2" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.677709 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9bba48d4-65a1-44b2-b750-6a7f27613e63-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9bba48d4-65a1-44b2-b750-6a7f27613e63\") " pod="openstack/rabbitmq-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.686332 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95r6k\" (UniqueName: \"kubernetes.io/projected/9bba48d4-65a1-44b2-b750-6a7f27613e63-kube-api-access-95r6k\") pod \"rabbitmq-server-0\" (UID: \"9bba48d4-65a1-44b2-b750-6a7f27613e63\") " pod="openstack/rabbitmq-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.687157 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8kkh\" (UniqueName: 
\"kubernetes.io/projected/62b695a9-e815-4771-8dee-2d3ad2a98628-kube-api-access-l8kkh\") pod \"dnsmasq-dns-67b789f86c-rgbp2\" (UID: \"62b695a9-e815-4771-8dee-2d3ad2a98628\") " pod="openstack/dnsmasq-dns-67b789f86c-rgbp2" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.698672 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.714732 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"9bba48d4-65a1-44b2-b750-6a7f27613e63\") " pod="openstack/rabbitmq-server-0" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.774388 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3c10bd0-d7e5-42cb-8609-4a7692300f37-dns-svc\") pod \"dnsmasq-dns-7fd9f947b7-44qdw\" (UID: \"b3c10bd0-d7e5-42cb-8609-4a7692300f37\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44qdw" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.774443 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3c10bd0-d7e5-42cb-8609-4a7692300f37-config\") pod \"dnsmasq-dns-7fd9f947b7-44qdw\" (UID: \"b3c10bd0-d7e5-42cb-8609-4a7692300f37\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44qdw" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.774538 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b3c10bd0-d7e5-42cb-8609-4a7692300f37-dns-swift-storage-0\") pod \"dnsmasq-dns-7fd9f947b7-44qdw\" (UID: \"b3c10bd0-d7e5-42cb-8609-4a7692300f37\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44qdw" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 
15:49:06.774571 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3c10bd0-d7e5-42cb-8609-4a7692300f37-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd9f947b7-44qdw\" (UID: \"b3c10bd0-d7e5-42cb-8609-4a7692300f37\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44qdw" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.774640 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3c10bd0-d7e5-42cb-8609-4a7692300f37-ovsdbserver-sb\") pod \"dnsmasq-dns-7fd9f947b7-44qdw\" (UID: \"b3c10bd0-d7e5-42cb-8609-4a7692300f37\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44qdw" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.774678 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b3c10bd0-d7e5-42cb-8609-4a7692300f37-openstack-edpm-ipam\") pod \"dnsmasq-dns-7fd9f947b7-44qdw\" (UID: \"b3c10bd0-d7e5-42cb-8609-4a7692300f37\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44qdw" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.774713 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g82gt\" (UniqueName: \"kubernetes.io/projected/b3c10bd0-d7e5-42cb-8609-4a7692300f37-kube-api-access-g82gt\") pod \"dnsmasq-dns-7fd9f947b7-44qdw\" (UID: \"b3c10bd0-d7e5-42cb-8609-4a7692300f37\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44qdw" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.877610 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3c10bd0-d7e5-42cb-8609-4a7692300f37-dns-svc\") pod \"dnsmasq-dns-7fd9f947b7-44qdw\" (UID: \"b3c10bd0-d7e5-42cb-8609-4a7692300f37\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44qdw" Mar 20 15:49:06 
crc kubenswrapper[4779]: I0320 15:49:06.877928 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3c10bd0-d7e5-42cb-8609-4a7692300f37-config\") pod \"dnsmasq-dns-7fd9f947b7-44qdw\" (UID: \"b3c10bd0-d7e5-42cb-8609-4a7692300f37\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44qdw" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.878011 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b3c10bd0-d7e5-42cb-8609-4a7692300f37-dns-swift-storage-0\") pod \"dnsmasq-dns-7fd9f947b7-44qdw\" (UID: \"b3c10bd0-d7e5-42cb-8609-4a7692300f37\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44qdw" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.878043 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3c10bd0-d7e5-42cb-8609-4a7692300f37-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd9f947b7-44qdw\" (UID: \"b3c10bd0-d7e5-42cb-8609-4a7692300f37\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44qdw" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.878096 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3c10bd0-d7e5-42cb-8609-4a7692300f37-ovsdbserver-sb\") pod \"dnsmasq-dns-7fd9f947b7-44qdw\" (UID: \"b3c10bd0-d7e5-42cb-8609-4a7692300f37\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44qdw" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.878156 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b3c10bd0-d7e5-42cb-8609-4a7692300f37-openstack-edpm-ipam\") pod \"dnsmasq-dns-7fd9f947b7-44qdw\" (UID: \"b3c10bd0-d7e5-42cb-8609-4a7692300f37\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44qdw" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.878184 
4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g82gt\" (UniqueName: \"kubernetes.io/projected/b3c10bd0-d7e5-42cb-8609-4a7692300f37-kube-api-access-g82gt\") pod \"dnsmasq-dns-7fd9f947b7-44qdw\" (UID: \"b3c10bd0-d7e5-42cb-8609-4a7692300f37\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44qdw" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.879194 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3c10bd0-d7e5-42cb-8609-4a7692300f37-dns-svc\") pod \"dnsmasq-dns-7fd9f947b7-44qdw\" (UID: \"b3c10bd0-d7e5-42cb-8609-4a7692300f37\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44qdw" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.879833 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3c10bd0-d7e5-42cb-8609-4a7692300f37-config\") pod \"dnsmasq-dns-7fd9f947b7-44qdw\" (UID: \"b3c10bd0-d7e5-42cb-8609-4a7692300f37\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44qdw" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.880727 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3c10bd0-d7e5-42cb-8609-4a7692300f37-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd9f947b7-44qdw\" (UID: \"b3c10bd0-d7e5-42cb-8609-4a7692300f37\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44qdw" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.881251 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3c10bd0-d7e5-42cb-8609-4a7692300f37-ovsdbserver-sb\") pod \"dnsmasq-dns-7fd9f947b7-44qdw\" (UID: \"b3c10bd0-d7e5-42cb-8609-4a7692300f37\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44qdw" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.882314 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/b3c10bd0-d7e5-42cb-8609-4a7692300f37-dns-swift-storage-0\") pod \"dnsmasq-dns-7fd9f947b7-44qdw\" (UID: \"b3c10bd0-d7e5-42cb-8609-4a7692300f37\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44qdw" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.882438 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b3c10bd0-d7e5-42cb-8609-4a7692300f37-openstack-edpm-ipam\") pod \"dnsmasq-dns-7fd9f947b7-44qdw\" (UID: \"b3c10bd0-d7e5-42cb-8609-4a7692300f37\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44qdw" Mar 20 15:49:06 crc kubenswrapper[4779]: I0320 15:49:06.896441 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g82gt\" (UniqueName: \"kubernetes.io/projected/b3c10bd0-d7e5-42cb-8609-4a7692300f37-kube-api-access-g82gt\") pod \"dnsmasq-dns-7fd9f947b7-44qdw\" (UID: \"b3c10bd0-d7e5-42cb-8609-4a7692300f37\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44qdw" Mar 20 15:49:07 crc kubenswrapper[4779]: I0320 15:49:07.009641 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 15:49:07 crc kubenswrapper[4779]: I0320 15:49:07.037543 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd9f947b7-44qdw" Mar 20 15:49:07 crc kubenswrapper[4779]: I0320 15:49:07.170036 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-rgbp2" Mar 20 15:49:07 crc kubenswrapper[4779]: I0320 15:49:07.173843 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 15:49:07 crc kubenswrapper[4779]: W0320 15:49:07.176995 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda91eab41_69fe_44e7_a239_6956a6b18dd8.slice/crio-074c3873de5d89024f58e82e222d201aa70ca04c85c9ccf79645d3512ac3ee65 WatchSource:0}: Error finding container 074c3873de5d89024f58e82e222d201aa70ca04c85c9ccf79645d3512ac3ee65: Status 404 returned error can't find the container with id 074c3873de5d89024f58e82e222d201aa70ca04c85c9ccf79645d3512ac3ee65 Mar 20 15:49:07 crc kubenswrapper[4779]: I0320 15:49:07.183315 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-rgbp2" Mar 20 15:49:07 crc kubenswrapper[4779]: I0320 15:49:07.285249 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62b695a9-e815-4771-8dee-2d3ad2a98628-dns-svc\") pod \"62b695a9-e815-4771-8dee-2d3ad2a98628\" (UID: \"62b695a9-e815-4771-8dee-2d3ad2a98628\") " Mar 20 15:49:07 crc kubenswrapper[4779]: I0320 15:49:07.285322 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8kkh\" (UniqueName: \"kubernetes.io/projected/62b695a9-e815-4771-8dee-2d3ad2a98628-kube-api-access-l8kkh\") pod \"62b695a9-e815-4771-8dee-2d3ad2a98628\" (UID: \"62b695a9-e815-4771-8dee-2d3ad2a98628\") " Mar 20 15:49:07 crc kubenswrapper[4779]: I0320 15:49:07.285350 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62b695a9-e815-4771-8dee-2d3ad2a98628-ovsdbserver-sb\") pod \"62b695a9-e815-4771-8dee-2d3ad2a98628\" (UID: 
\"62b695a9-e815-4771-8dee-2d3ad2a98628\") " Mar 20 15:49:07 crc kubenswrapper[4779]: I0320 15:49:07.285398 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62b695a9-e815-4771-8dee-2d3ad2a98628-ovsdbserver-nb\") pod \"62b695a9-e815-4771-8dee-2d3ad2a98628\" (UID: \"62b695a9-e815-4771-8dee-2d3ad2a98628\") " Mar 20 15:49:07 crc kubenswrapper[4779]: I0320 15:49:07.285746 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/62b695a9-e815-4771-8dee-2d3ad2a98628-openstack-edpm-ipam\") pod \"62b695a9-e815-4771-8dee-2d3ad2a98628\" (UID: \"62b695a9-e815-4771-8dee-2d3ad2a98628\") " Mar 20 15:49:07 crc kubenswrapper[4779]: I0320 15:49:07.285857 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62b695a9-e815-4771-8dee-2d3ad2a98628-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "62b695a9-e815-4771-8dee-2d3ad2a98628" (UID: "62b695a9-e815-4771-8dee-2d3ad2a98628"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:49:07 crc kubenswrapper[4779]: I0320 15:49:07.286055 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62b695a9-e815-4771-8dee-2d3ad2a98628-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "62b695a9-e815-4771-8dee-2d3ad2a98628" (UID: "62b695a9-e815-4771-8dee-2d3ad2a98628"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:49:07 crc kubenswrapper[4779]: I0320 15:49:07.287083 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/62b695a9-e815-4771-8dee-2d3ad2a98628-dns-swift-storage-0\") pod \"62b695a9-e815-4771-8dee-2d3ad2a98628\" (UID: \"62b695a9-e815-4771-8dee-2d3ad2a98628\") " Mar 20 15:49:07 crc kubenswrapper[4779]: I0320 15:49:07.287227 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62b695a9-e815-4771-8dee-2d3ad2a98628-config\") pod \"62b695a9-e815-4771-8dee-2d3ad2a98628\" (UID: \"62b695a9-e815-4771-8dee-2d3ad2a98628\") " Mar 20 15:49:07 crc kubenswrapper[4779]: I0320 15:49:07.286201 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62b695a9-e815-4771-8dee-2d3ad2a98628-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "62b695a9-e815-4771-8dee-2d3ad2a98628" (UID: "62b695a9-e815-4771-8dee-2d3ad2a98628"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:49:07 crc kubenswrapper[4779]: I0320 15:49:07.286970 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62b695a9-e815-4771-8dee-2d3ad2a98628-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "62b695a9-e815-4771-8dee-2d3ad2a98628" (UID: "62b695a9-e815-4771-8dee-2d3ad2a98628"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:49:07 crc kubenswrapper[4779]: I0320 15:49:07.287541 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62b695a9-e815-4771-8dee-2d3ad2a98628-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "62b695a9-e815-4771-8dee-2d3ad2a98628" (UID: "62b695a9-e815-4771-8dee-2d3ad2a98628"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:49:07 crc kubenswrapper[4779]: I0320 15:49:07.288313 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62b695a9-e815-4771-8dee-2d3ad2a98628-config" (OuterVolumeSpecName: "config") pod "62b695a9-e815-4771-8dee-2d3ad2a98628" (UID: "62b695a9-e815-4771-8dee-2d3ad2a98628"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:49:07 crc kubenswrapper[4779]: I0320 15:49:07.289157 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62b695a9-e815-4771-8dee-2d3ad2a98628-kube-api-access-l8kkh" (OuterVolumeSpecName: "kube-api-access-l8kkh") pod "62b695a9-e815-4771-8dee-2d3ad2a98628" (UID: "62b695a9-e815-4771-8dee-2d3ad2a98628"). InnerVolumeSpecName "kube-api-access-l8kkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:49:07 crc kubenswrapper[4779]: I0320 15:49:07.289288 4779 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/62b695a9-e815-4771-8dee-2d3ad2a98628-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:07 crc kubenswrapper[4779]: I0320 15:49:07.289304 4779 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/62b695a9-e815-4771-8dee-2d3ad2a98628-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:07 crc kubenswrapper[4779]: I0320 15:49:07.289314 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62b695a9-e815-4771-8dee-2d3ad2a98628-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:07 crc kubenswrapper[4779]: I0320 15:49:07.289326 4779 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62b695a9-e815-4771-8dee-2d3ad2a98628-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:07 crc kubenswrapper[4779]: I0320 15:49:07.289336 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8kkh\" (UniqueName: \"kubernetes.io/projected/62b695a9-e815-4771-8dee-2d3ad2a98628-kube-api-access-l8kkh\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:07 crc kubenswrapper[4779]: I0320 15:49:07.289347 4779 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62b695a9-e815-4771-8dee-2d3ad2a98628-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:07 crc kubenswrapper[4779]: I0320 15:49:07.289357 4779 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62b695a9-e815-4771-8dee-2d3ad2a98628-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:07 crc kubenswrapper[4779]: I0320 15:49:07.658849 
4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 15:49:07 crc kubenswrapper[4779]: I0320 15:49:07.704086 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd9f947b7-44qdw"] Mar 20 15:49:07 crc kubenswrapper[4779]: I0320 15:49:07.821871 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3016debc-9603-417f-8ff1-6fd3934cd17e" path="/var/lib/kubelet/pods/3016debc-9603-417f-8ff1-6fd3934cd17e/volumes" Mar 20 15:49:07 crc kubenswrapper[4779]: I0320 15:49:07.826784 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f74af0a5-e3c3-4569-bece-db2e25e9b79d" path="/var/lib/kubelet/pods/f74af0a5-e3c3-4569-bece-db2e25e9b79d/volumes" Mar 20 15:49:08 crc kubenswrapper[4779]: I0320 15:49:08.180656 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9bba48d4-65a1-44b2-b750-6a7f27613e63","Type":"ContainerStarted","Data":"ae50c0f4e2f538488130a618c5d41157c1c6523c929f3c8e11146e5ddf50cca5"} Mar 20 15:49:08 crc kubenswrapper[4779]: I0320 15:49:08.182327 4779 generic.go:334] "Generic (PLEG): container finished" podID="b3c10bd0-d7e5-42cb-8609-4a7692300f37" containerID="c1cbed1d0e51a815f00c652b4bf729269cbd1b63dc7116d184d82afe3a391398" exitCode=0 Mar 20 15:49:08 crc kubenswrapper[4779]: I0320 15:49:08.182434 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd9f947b7-44qdw" event={"ID":"b3c10bd0-d7e5-42cb-8609-4a7692300f37","Type":"ContainerDied","Data":"c1cbed1d0e51a815f00c652b4bf729269cbd1b63dc7116d184d82afe3a391398"} Mar 20 15:49:08 crc kubenswrapper[4779]: I0320 15:49:08.182501 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd9f947b7-44qdw" event={"ID":"b3c10bd0-d7e5-42cb-8609-4a7692300f37","Type":"ContainerStarted","Data":"70cb29f4de8eb74883c3b62f9a783e296a1c7fabdefdac6cc3a343bf05bb8982"} Mar 20 15:49:08 crc kubenswrapper[4779]: I0320 
15:49:08.185342 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-rgbp2" Mar 20 15:49:08 crc kubenswrapper[4779]: I0320 15:49:08.185331 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a91eab41-69fe-44e7-a239-6956a6b18dd8","Type":"ContainerStarted","Data":"074c3873de5d89024f58e82e222d201aa70ca04c85c9ccf79645d3512ac3ee65"} Mar 20 15:49:08 crc kubenswrapper[4779]: I0320 15:49:08.243670 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-rgbp2"] Mar 20 15:49:08 crc kubenswrapper[4779]: I0320 15:49:08.254032 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-rgbp2"] Mar 20 15:49:09 crc kubenswrapper[4779]: I0320 15:49:09.195941 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd9f947b7-44qdw" event={"ID":"b3c10bd0-d7e5-42cb-8609-4a7692300f37","Type":"ContainerStarted","Data":"09aaf07d960b8d1b2f3fb79b5cd297f3a5519a871edc32cc64b5381bb4cde313"} Mar 20 15:49:09 crc kubenswrapper[4779]: I0320 15:49:09.196381 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd9f947b7-44qdw" Mar 20 15:49:09 crc kubenswrapper[4779]: I0320 15:49:09.198497 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a91eab41-69fe-44e7-a239-6956a6b18dd8","Type":"ContainerStarted","Data":"a4783c5bb903e2a362edcb88cc45882c445d58fd5b62a92b02ffa2caab7d13bb"} Mar 20 15:49:09 crc kubenswrapper[4779]: I0320 15:49:09.213193 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd9f947b7-44qdw" podStartSLOduration=3.213178382 podStartE2EDuration="3.213178382s" podCreationTimestamp="2026-03-20 15:49:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 15:49:09.210352453 +0000 UTC m=+1566.172868263" watchObservedRunningTime="2026-03-20 15:49:09.213178382 +0000 UTC m=+1566.175694182" Mar 20 15:49:09 crc kubenswrapper[4779]: I0320 15:49:09.821429 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62b695a9-e815-4771-8dee-2d3ad2a98628" path="/var/lib/kubelet/pods/62b695a9-e815-4771-8dee-2d3ad2a98628/volumes" Mar 20 15:49:10 crc kubenswrapper[4779]: I0320 15:49:10.208724 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9bba48d4-65a1-44b2-b750-6a7f27613e63","Type":"ContainerStarted","Data":"cb7e0778cefa2126238da17c687240e69ad8423ce705113da7e5d3ab058b75c1"} Mar 20 15:49:17 crc kubenswrapper[4779]: I0320 15:49:17.039322 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd9f947b7-44qdw" Mar 20 15:49:17 crc kubenswrapper[4779]: I0320 15:49:17.103413 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-l62ft"] Mar 20 15:49:17 crc kubenswrapper[4779]: I0320 15:49:17.103643 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59cf4bdb65-l62ft" podUID="faefbbf5-3a2e-4938-821d-c5737c066de5" containerName="dnsmasq-dns" containerID="cri-o://37ee607ade32a3a551d4da84eb72ab44b0a30e0ebbd98cf77c258470d9127201" gracePeriod=10 Mar 20 15:49:17 crc kubenswrapper[4779]: I0320 15:49:17.279390 4779 generic.go:334] "Generic (PLEG): container finished" podID="faefbbf5-3a2e-4938-821d-c5737c066de5" containerID="37ee607ade32a3a551d4da84eb72ab44b0a30e0ebbd98cf77c258470d9127201" exitCode=0 Mar 20 15:49:17 crc kubenswrapper[4779]: I0320 15:49:17.279469 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-l62ft" event={"ID":"faefbbf5-3a2e-4938-821d-c5737c066de5","Type":"ContainerDied","Data":"37ee607ade32a3a551d4da84eb72ab44b0a30e0ebbd98cf77c258470d9127201"} Mar 20 
15:49:17 crc kubenswrapper[4779]: I0320 15:49:17.606896 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-l62ft" Mar 20 15:49:17 crc kubenswrapper[4779]: I0320 15:49:17.704706 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/faefbbf5-3a2e-4938-821d-c5737c066de5-dns-svc\") pod \"faefbbf5-3a2e-4938-821d-c5737c066de5\" (UID: \"faefbbf5-3a2e-4938-821d-c5737c066de5\") " Mar 20 15:49:17 crc kubenswrapper[4779]: I0320 15:49:17.704829 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/faefbbf5-3a2e-4938-821d-c5737c066de5-config\") pod \"faefbbf5-3a2e-4938-821d-c5737c066de5\" (UID: \"faefbbf5-3a2e-4938-821d-c5737c066de5\") " Mar 20 15:49:17 crc kubenswrapper[4779]: I0320 15:49:17.704850 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/faefbbf5-3a2e-4938-821d-c5737c066de5-ovsdbserver-nb\") pod \"faefbbf5-3a2e-4938-821d-c5737c066de5\" (UID: \"faefbbf5-3a2e-4938-821d-c5737c066de5\") " Mar 20 15:49:17 crc kubenswrapper[4779]: I0320 15:49:17.704893 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/faefbbf5-3a2e-4938-821d-c5737c066de5-ovsdbserver-sb\") pod \"faefbbf5-3a2e-4938-821d-c5737c066de5\" (UID: \"faefbbf5-3a2e-4938-821d-c5737c066de5\") " Mar 20 15:49:17 crc kubenswrapper[4779]: I0320 15:49:17.705040 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xj2tr\" (UniqueName: \"kubernetes.io/projected/faefbbf5-3a2e-4938-821d-c5737c066de5-kube-api-access-xj2tr\") pod \"faefbbf5-3a2e-4938-821d-c5737c066de5\" (UID: \"faefbbf5-3a2e-4938-821d-c5737c066de5\") " Mar 20 15:49:17 crc kubenswrapper[4779]: I0320 15:49:17.705070 
4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/faefbbf5-3a2e-4938-821d-c5737c066de5-dns-swift-storage-0\") pod \"faefbbf5-3a2e-4938-821d-c5737c066de5\" (UID: \"faefbbf5-3a2e-4938-821d-c5737c066de5\") " Mar 20 15:49:17 crc kubenswrapper[4779]: I0320 15:49:17.710637 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faefbbf5-3a2e-4938-821d-c5737c066de5-kube-api-access-xj2tr" (OuterVolumeSpecName: "kube-api-access-xj2tr") pod "faefbbf5-3a2e-4938-821d-c5737c066de5" (UID: "faefbbf5-3a2e-4938-821d-c5737c066de5"). InnerVolumeSpecName "kube-api-access-xj2tr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:49:17 crc kubenswrapper[4779]: I0320 15:49:17.753685 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/faefbbf5-3a2e-4938-821d-c5737c066de5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "faefbbf5-3a2e-4938-821d-c5737c066de5" (UID: "faefbbf5-3a2e-4938-821d-c5737c066de5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:49:17 crc kubenswrapper[4779]: I0320 15:49:17.754052 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/faefbbf5-3a2e-4938-821d-c5737c066de5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "faefbbf5-3a2e-4938-821d-c5737c066de5" (UID: "faefbbf5-3a2e-4938-821d-c5737c066de5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:49:17 crc kubenswrapper[4779]: I0320 15:49:17.757841 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/faefbbf5-3a2e-4938-821d-c5737c066de5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "faefbbf5-3a2e-4938-821d-c5737c066de5" (UID: "faefbbf5-3a2e-4938-821d-c5737c066de5"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:49:17 crc kubenswrapper[4779]: I0320 15:49:17.761595 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/faefbbf5-3a2e-4938-821d-c5737c066de5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "faefbbf5-3a2e-4938-821d-c5737c066de5" (UID: "faefbbf5-3a2e-4938-821d-c5737c066de5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:49:17 crc kubenswrapper[4779]: I0320 15:49:17.770608 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/faefbbf5-3a2e-4938-821d-c5737c066de5-config" (OuterVolumeSpecName: "config") pod "faefbbf5-3a2e-4938-821d-c5737c066de5" (UID: "faefbbf5-3a2e-4938-821d-c5737c066de5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:49:17 crc kubenswrapper[4779]: I0320 15:49:17.807750 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xj2tr\" (UniqueName: \"kubernetes.io/projected/faefbbf5-3a2e-4938-821d-c5737c066de5-kube-api-access-xj2tr\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:17 crc kubenswrapper[4779]: I0320 15:49:17.807785 4779 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/faefbbf5-3a2e-4938-821d-c5737c066de5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:17 crc kubenswrapper[4779]: I0320 15:49:17.807795 4779 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/faefbbf5-3a2e-4938-821d-c5737c066de5-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:17 crc kubenswrapper[4779]: I0320 15:49:17.807804 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/faefbbf5-3a2e-4938-821d-c5737c066de5-config\") on 
node \"crc\" DevicePath \"\"" Mar 20 15:49:17 crc kubenswrapper[4779]: I0320 15:49:17.807814 4779 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/faefbbf5-3a2e-4938-821d-c5737c066de5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:17 crc kubenswrapper[4779]: I0320 15:49:17.807824 4779 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/faefbbf5-3a2e-4938-821d-c5737c066de5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:18 crc kubenswrapper[4779]: I0320 15:49:18.291778 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-l62ft" event={"ID":"faefbbf5-3a2e-4938-821d-c5737c066de5","Type":"ContainerDied","Data":"9b0581edb29e1d0a13a2e2a01d5c019938a125e54ab3a0d72ecf2a226551e0df"} Mar 20 15:49:18 crc kubenswrapper[4779]: I0320 15:49:18.291830 4779 scope.go:117] "RemoveContainer" containerID="37ee607ade32a3a551d4da84eb72ab44b0a30e0ebbd98cf77c258470d9127201" Mar 20 15:49:18 crc kubenswrapper[4779]: I0320 15:49:18.291829 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-l62ft" Mar 20 15:49:18 crc kubenswrapper[4779]: I0320 15:49:18.317327 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-l62ft"] Mar 20 15:49:18 crc kubenswrapper[4779]: I0320 15:49:18.326275 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-l62ft"] Mar 20 15:49:18 crc kubenswrapper[4779]: I0320 15:49:18.350608 4779 scope.go:117] "RemoveContainer" containerID="1ba3b85913f4db008c02d32f3870cd8e04888a09967b2c9cc6091e21377e1240" Mar 20 15:49:19 crc kubenswrapper[4779]: I0320 15:49:19.824219 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faefbbf5-3a2e-4938-821d-c5737c066de5" path="/var/lib/kubelet/pods/faefbbf5-3a2e-4938-821d-c5737c066de5/volumes" Mar 20 15:49:22 crc kubenswrapper[4779]: I0320 15:49:22.384572 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 15:49:30 crc kubenswrapper[4779]: I0320 15:49:30.660039 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62rks"] Mar 20 15:49:30 crc kubenswrapper[4779]: E0320 15:49:30.660966 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faefbbf5-3a2e-4938-821d-c5737c066de5" containerName="init" Mar 20 15:49:30 crc kubenswrapper[4779]: I0320 15:49:30.660979 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="faefbbf5-3a2e-4938-821d-c5737c066de5" containerName="init" Mar 20 15:49:30 crc kubenswrapper[4779]: E0320 15:49:30.660988 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faefbbf5-3a2e-4938-821d-c5737c066de5" containerName="dnsmasq-dns" Mar 20 15:49:30 crc kubenswrapper[4779]: I0320 15:49:30.660995 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="faefbbf5-3a2e-4938-821d-c5737c066de5" containerName="dnsmasq-dns" Mar 20 15:49:30 crc kubenswrapper[4779]: I0320 
15:49:30.661246 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="faefbbf5-3a2e-4938-821d-c5737c066de5" containerName="dnsmasq-dns" Mar 20 15:49:30 crc kubenswrapper[4779]: I0320 15:49:30.662019 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62rks" Mar 20 15:49:30 crc kubenswrapper[4779]: I0320 15:49:30.664061 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 15:49:30 crc kubenswrapper[4779]: I0320 15:49:30.664199 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 15:49:30 crc kubenswrapper[4779]: I0320 15:49:30.664869 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r7sbf" Mar 20 15:49:30 crc kubenswrapper[4779]: I0320 15:49:30.671530 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 15:49:30 crc kubenswrapper[4779]: I0320 15:49:30.679254 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62rks"] Mar 20 15:49:30 crc kubenswrapper[4779]: I0320 15:49:30.750937 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njp47\" (UniqueName: \"kubernetes.io/projected/e9abd3a1-8135-495d-b4f0-3f3569c40751-kube-api-access-njp47\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-62rks\" (UID: \"e9abd3a1-8135-495d-b4f0-3f3569c40751\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62rks" Mar 20 15:49:30 crc kubenswrapper[4779]: I0320 15:49:30.751268 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e9abd3a1-8135-495d-b4f0-3f3569c40751-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-62rks\" (UID: \"e9abd3a1-8135-495d-b4f0-3f3569c40751\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62rks" Mar 20 15:49:30 crc kubenswrapper[4779]: I0320 15:49:30.751294 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9abd3a1-8135-495d-b4f0-3f3569c40751-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-62rks\" (UID: \"e9abd3a1-8135-495d-b4f0-3f3569c40751\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62rks" Mar 20 15:49:30 crc kubenswrapper[4779]: I0320 15:49:30.751316 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9abd3a1-8135-495d-b4f0-3f3569c40751-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-62rks\" (UID: \"e9abd3a1-8135-495d-b4f0-3f3569c40751\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62rks" Mar 20 15:49:30 crc kubenswrapper[4779]: I0320 15:49:30.853082 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njp47\" (UniqueName: \"kubernetes.io/projected/e9abd3a1-8135-495d-b4f0-3f3569c40751-kube-api-access-njp47\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-62rks\" (UID: \"e9abd3a1-8135-495d-b4f0-3f3569c40751\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62rks" Mar 20 15:49:30 crc kubenswrapper[4779]: I0320 15:49:30.853381 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9abd3a1-8135-495d-b4f0-3f3569c40751-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-62rks\" (UID: 
\"e9abd3a1-8135-495d-b4f0-3f3569c40751\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62rks" Mar 20 15:49:30 crc kubenswrapper[4779]: I0320 15:49:30.853468 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9abd3a1-8135-495d-b4f0-3f3569c40751-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-62rks\" (UID: \"e9abd3a1-8135-495d-b4f0-3f3569c40751\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62rks" Mar 20 15:49:30 crc kubenswrapper[4779]: I0320 15:49:30.853552 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9abd3a1-8135-495d-b4f0-3f3569c40751-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-62rks\" (UID: \"e9abd3a1-8135-495d-b4f0-3f3569c40751\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62rks" Mar 20 15:49:30 crc kubenswrapper[4779]: I0320 15:49:30.859393 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9abd3a1-8135-495d-b4f0-3f3569c40751-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-62rks\" (UID: \"e9abd3a1-8135-495d-b4f0-3f3569c40751\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62rks" Mar 20 15:49:30 crc kubenswrapper[4779]: I0320 15:49:30.864678 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9abd3a1-8135-495d-b4f0-3f3569c40751-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-62rks\" (UID: \"e9abd3a1-8135-495d-b4f0-3f3569c40751\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62rks" Mar 20 15:49:30 crc kubenswrapper[4779]: I0320 15:49:30.866961 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9abd3a1-8135-495d-b4f0-3f3569c40751-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-62rks\" (UID: \"e9abd3a1-8135-495d-b4f0-3f3569c40751\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62rks" Mar 20 15:49:30 crc kubenswrapper[4779]: I0320 15:49:30.870605 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njp47\" (UniqueName: \"kubernetes.io/projected/e9abd3a1-8135-495d-b4f0-3f3569c40751-kube-api-access-njp47\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-62rks\" (UID: \"e9abd3a1-8135-495d-b4f0-3f3569c40751\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62rks" Mar 20 15:49:30 crc kubenswrapper[4779]: I0320 15:49:30.994054 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62rks" Mar 20 15:49:31 crc kubenswrapper[4779]: I0320 15:49:31.572650 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62rks"] Mar 20 15:49:32 crc kubenswrapper[4779]: I0320 15:49:32.465164 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62rks" event={"ID":"e9abd3a1-8135-495d-b4f0-3f3569c40751","Type":"ContainerStarted","Data":"69523f89a0e8d68ffb50bb8787c6b40fa7d6bf9e09c5e6f6b950a1de458cc2af"} Mar 20 15:49:40 crc kubenswrapper[4779]: I0320 15:49:40.565296 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62rks" event={"ID":"e9abd3a1-8135-495d-b4f0-3f3569c40751","Type":"ContainerStarted","Data":"138dfa6b24aa6c5d3ffb455ff0550cc00045101dcced6fa46e29f810bba525b7"} Mar 20 15:49:40 crc kubenswrapper[4779]: I0320 15:49:40.590427 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62rks" podStartSLOduration=1.994952041 podStartE2EDuration="10.590409286s" podCreationTimestamp="2026-03-20 15:49:30 +0000 UTC" firstStartedPulling="2026-03-20 15:49:31.589896439 +0000 UTC m=+1588.552412239" lastFinishedPulling="2026-03-20 15:49:40.185353684 +0000 UTC m=+1597.147869484" observedRunningTime="2026-03-20 15:49:40.581685274 +0000 UTC m=+1597.544201074" watchObservedRunningTime="2026-03-20 15:49:40.590409286 +0000 UTC m=+1597.552925076" Mar 20 15:49:41 crc kubenswrapper[4779]: I0320 15:49:41.576866 4779 generic.go:334] "Generic (PLEG): container finished" podID="9bba48d4-65a1-44b2-b750-6a7f27613e63" containerID="cb7e0778cefa2126238da17c687240e69ad8423ce705113da7e5d3ab058b75c1" exitCode=0 Mar 20 15:49:41 crc kubenswrapper[4779]: I0320 15:49:41.576961 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9bba48d4-65a1-44b2-b750-6a7f27613e63","Type":"ContainerDied","Data":"cb7e0778cefa2126238da17c687240e69ad8423ce705113da7e5d3ab058b75c1"} Mar 20 15:49:41 crc kubenswrapper[4779]: I0320 15:49:41.580465 4779 generic.go:334] "Generic (PLEG): container finished" podID="a91eab41-69fe-44e7-a239-6956a6b18dd8" containerID="a4783c5bb903e2a362edcb88cc45882c445d58fd5b62a92b02ffa2caab7d13bb" exitCode=0 Mar 20 15:49:41 crc kubenswrapper[4779]: I0320 15:49:41.580539 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a91eab41-69fe-44e7-a239-6956a6b18dd8","Type":"ContainerDied","Data":"a4783c5bb903e2a362edcb88cc45882c445d58fd5b62a92b02ffa2caab7d13bb"} Mar 20 15:49:42 crc kubenswrapper[4779]: I0320 15:49:42.592040 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9bba48d4-65a1-44b2-b750-6a7f27613e63","Type":"ContainerStarted","Data":"326d74abe22fae33e8e14663dcb450b7798672d170111f70283a3c2325c970ef"} Mar 20 15:49:42 crc kubenswrapper[4779]: I0320 
15:49:42.592733 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 20 15:49:42 crc kubenswrapper[4779]: I0320 15:49:42.596253 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a91eab41-69fe-44e7-a239-6956a6b18dd8","Type":"ContainerStarted","Data":"4f62ad7202daab2fe04ce4a4ad7cd83a99c33cd43f588a4714d29524bd850f35"} Mar 20 15:49:42 crc kubenswrapper[4779]: I0320 15:49:42.596607 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:42 crc kubenswrapper[4779]: I0320 15:49:42.621460 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.621439113 podStartE2EDuration="36.621439113s" podCreationTimestamp="2026-03-20 15:49:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:49:42.614282309 +0000 UTC m=+1599.576798139" watchObservedRunningTime="2026-03-20 15:49:42.621439113 +0000 UTC m=+1599.583954913" Mar 20 15:49:42 crc kubenswrapper[4779]: I0320 15:49:42.645573 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.645552211 podStartE2EDuration="36.645552211s" podCreationTimestamp="2026-03-20 15:49:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:49:42.636894879 +0000 UTC m=+1599.599410679" watchObservedRunningTime="2026-03-20 15:49:42.645552211 +0000 UTC m=+1599.608068011" Mar 20 15:49:46 crc kubenswrapper[4779]: I0320 15:49:46.315051 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hpnxv"] Mar 20 15:49:46 crc kubenswrapper[4779]: I0320 15:49:46.318141 4779 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hpnxv" Mar 20 15:49:46 crc kubenswrapper[4779]: I0320 15:49:46.330238 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hpnxv"] Mar 20 15:49:46 crc kubenswrapper[4779]: I0320 15:49:46.412535 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1da59552-78ce-4bd1-96ad-241704222736-utilities\") pod \"community-operators-hpnxv\" (UID: \"1da59552-78ce-4bd1-96ad-241704222736\") " pod="openshift-marketplace/community-operators-hpnxv" Mar 20 15:49:46 crc kubenswrapper[4779]: I0320 15:49:46.412978 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1da59552-78ce-4bd1-96ad-241704222736-catalog-content\") pod \"community-operators-hpnxv\" (UID: \"1da59552-78ce-4bd1-96ad-241704222736\") " pod="openshift-marketplace/community-operators-hpnxv" Mar 20 15:49:46 crc kubenswrapper[4779]: I0320 15:49:46.413043 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv7m2\" (UniqueName: \"kubernetes.io/projected/1da59552-78ce-4bd1-96ad-241704222736-kube-api-access-cv7m2\") pod \"community-operators-hpnxv\" (UID: \"1da59552-78ce-4bd1-96ad-241704222736\") " pod="openshift-marketplace/community-operators-hpnxv" Mar 20 15:49:46 crc kubenswrapper[4779]: I0320 15:49:46.516466 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv7m2\" (UniqueName: \"kubernetes.io/projected/1da59552-78ce-4bd1-96ad-241704222736-kube-api-access-cv7m2\") pod \"community-operators-hpnxv\" (UID: \"1da59552-78ce-4bd1-96ad-241704222736\") " pod="openshift-marketplace/community-operators-hpnxv" Mar 20 15:49:46 crc kubenswrapper[4779]: I0320 15:49:46.516774 4779 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1da59552-78ce-4bd1-96ad-241704222736-utilities\") pod \"community-operators-hpnxv\" (UID: \"1da59552-78ce-4bd1-96ad-241704222736\") " pod="openshift-marketplace/community-operators-hpnxv" Mar 20 15:49:46 crc kubenswrapper[4779]: I0320 15:49:46.516817 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1da59552-78ce-4bd1-96ad-241704222736-catalog-content\") pod \"community-operators-hpnxv\" (UID: \"1da59552-78ce-4bd1-96ad-241704222736\") " pod="openshift-marketplace/community-operators-hpnxv" Mar 20 15:49:46 crc kubenswrapper[4779]: I0320 15:49:46.517558 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1da59552-78ce-4bd1-96ad-241704222736-catalog-content\") pod \"community-operators-hpnxv\" (UID: \"1da59552-78ce-4bd1-96ad-241704222736\") " pod="openshift-marketplace/community-operators-hpnxv" Mar 20 15:49:46 crc kubenswrapper[4779]: I0320 15:49:46.517668 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1da59552-78ce-4bd1-96ad-241704222736-utilities\") pod \"community-operators-hpnxv\" (UID: \"1da59552-78ce-4bd1-96ad-241704222736\") " pod="openshift-marketplace/community-operators-hpnxv" Mar 20 15:49:46 crc kubenswrapper[4779]: I0320 15:49:46.541050 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv7m2\" (UniqueName: \"kubernetes.io/projected/1da59552-78ce-4bd1-96ad-241704222736-kube-api-access-cv7m2\") pod \"community-operators-hpnxv\" (UID: \"1da59552-78ce-4bd1-96ad-241704222736\") " pod="openshift-marketplace/community-operators-hpnxv" Mar 20 15:49:46 crc kubenswrapper[4779]: I0320 15:49:46.650672 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hpnxv" Mar 20 15:49:47 crc kubenswrapper[4779]: I0320 15:49:47.181220 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hpnxv"] Mar 20 15:49:47 crc kubenswrapper[4779]: I0320 15:49:47.651269 4779 generic.go:334] "Generic (PLEG): container finished" podID="1da59552-78ce-4bd1-96ad-241704222736" containerID="613e1bdb76bc9b4ac936dbf2ecdbe02bff98a42f683741c77d18e259ed8fb33c" exitCode=0 Mar 20 15:49:47 crc kubenswrapper[4779]: I0320 15:49:47.651364 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hpnxv" event={"ID":"1da59552-78ce-4bd1-96ad-241704222736","Type":"ContainerDied","Data":"613e1bdb76bc9b4ac936dbf2ecdbe02bff98a42f683741c77d18e259ed8fb33c"} Mar 20 15:49:47 crc kubenswrapper[4779]: I0320 15:49:47.651523 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hpnxv" event={"ID":"1da59552-78ce-4bd1-96ad-241704222736","Type":"ContainerStarted","Data":"96370a92d7bb7724d65aaf2da7bbb62a1f08e2754c049b96b0c3d034b95b115b"} Mar 20 15:49:48 crc kubenswrapper[4779]: I0320 15:49:48.682627 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hpnxv" event={"ID":"1da59552-78ce-4bd1-96ad-241704222736","Type":"ContainerStarted","Data":"39b215b7813f3107876e107fa821a59e4c5972c2c920554b03e566c2636e509a"} Mar 20 15:49:48 crc kubenswrapper[4779]: E0320 15:49:48.821653 4779 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1da59552_78ce_4bd1_96ad_241704222736.slice/crio-39b215b7813f3107876e107fa821a59e4c5972c2c920554b03e566c2636e509a.scope\": RecentStats: unable to find data in memory cache]" Mar 20 15:49:49 crc kubenswrapper[4779]: I0320 15:49:49.695333 4779 generic.go:334] "Generic (PLEG): 
container finished" podID="1da59552-78ce-4bd1-96ad-241704222736" containerID="39b215b7813f3107876e107fa821a59e4c5972c2c920554b03e566c2636e509a" exitCode=0 Mar 20 15:49:49 crc kubenswrapper[4779]: I0320 15:49:49.695395 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hpnxv" event={"ID":"1da59552-78ce-4bd1-96ad-241704222736","Type":"ContainerDied","Data":"39b215b7813f3107876e107fa821a59e4c5972c2c920554b03e566c2636e509a"} Mar 20 15:49:50 crc kubenswrapper[4779]: I0320 15:49:50.707414 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hpnxv" event={"ID":"1da59552-78ce-4bd1-96ad-241704222736","Type":"ContainerStarted","Data":"e61018d5ec6c990007f032875265ea099fed7d0edfd6f2cb0a94a3da3907a1f1"} Mar 20 15:49:50 crc kubenswrapper[4779]: I0320 15:49:50.725564 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hpnxv" podStartSLOduration=2.0967281 podStartE2EDuration="4.725545648s" podCreationTimestamp="2026-03-20 15:49:46 +0000 UTC" firstStartedPulling="2026-03-20 15:49:47.653206654 +0000 UTC m=+1604.615722454" lastFinishedPulling="2026-03-20 15:49:50.282024202 +0000 UTC m=+1607.244540002" observedRunningTime="2026-03-20 15:49:50.724828281 +0000 UTC m=+1607.687344101" watchObservedRunningTime="2026-03-20 15:49:50.725545648 +0000 UTC m=+1607.688061448" Mar 20 15:49:51 crc kubenswrapper[4779]: I0320 15:49:51.717266 4779 generic.go:334] "Generic (PLEG): container finished" podID="e9abd3a1-8135-495d-b4f0-3f3569c40751" containerID="138dfa6b24aa6c5d3ffb455ff0550cc00045101dcced6fa46e29f810bba525b7" exitCode=0 Mar 20 15:49:51 crc kubenswrapper[4779]: I0320 15:49:51.717365 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62rks" 
event={"ID":"e9abd3a1-8135-495d-b4f0-3f3569c40751","Type":"ContainerDied","Data":"138dfa6b24aa6c5d3ffb455ff0550cc00045101dcced6fa46e29f810bba525b7"} Mar 20 15:49:53 crc kubenswrapper[4779]: I0320 15:49:53.138466 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62rks" Mar 20 15:49:53 crc kubenswrapper[4779]: I0320 15:49:53.270331 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9abd3a1-8135-495d-b4f0-3f3569c40751-inventory\") pod \"e9abd3a1-8135-495d-b4f0-3f3569c40751\" (UID: \"e9abd3a1-8135-495d-b4f0-3f3569c40751\") " Mar 20 15:49:53 crc kubenswrapper[4779]: I0320 15:49:53.270753 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9abd3a1-8135-495d-b4f0-3f3569c40751-repo-setup-combined-ca-bundle\") pod \"e9abd3a1-8135-495d-b4f0-3f3569c40751\" (UID: \"e9abd3a1-8135-495d-b4f0-3f3569c40751\") " Mar 20 15:49:53 crc kubenswrapper[4779]: I0320 15:49:53.270824 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9abd3a1-8135-495d-b4f0-3f3569c40751-ssh-key-openstack-edpm-ipam\") pod \"e9abd3a1-8135-495d-b4f0-3f3569c40751\" (UID: \"e9abd3a1-8135-495d-b4f0-3f3569c40751\") " Mar 20 15:49:53 crc kubenswrapper[4779]: I0320 15:49:53.270949 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njp47\" (UniqueName: \"kubernetes.io/projected/e9abd3a1-8135-495d-b4f0-3f3569c40751-kube-api-access-njp47\") pod \"e9abd3a1-8135-495d-b4f0-3f3569c40751\" (UID: \"e9abd3a1-8135-495d-b4f0-3f3569c40751\") " Mar 20 15:49:53 crc kubenswrapper[4779]: I0320 15:49:53.277201 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/e9abd3a1-8135-495d-b4f0-3f3569c40751-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "e9abd3a1-8135-495d-b4f0-3f3569c40751" (UID: "e9abd3a1-8135-495d-b4f0-3f3569c40751"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:49:53 crc kubenswrapper[4779]: I0320 15:49:53.277299 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9abd3a1-8135-495d-b4f0-3f3569c40751-kube-api-access-njp47" (OuterVolumeSpecName: "kube-api-access-njp47") pod "e9abd3a1-8135-495d-b4f0-3f3569c40751" (UID: "e9abd3a1-8135-495d-b4f0-3f3569c40751"). InnerVolumeSpecName "kube-api-access-njp47". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:49:53 crc kubenswrapper[4779]: I0320 15:49:53.299604 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9abd3a1-8135-495d-b4f0-3f3569c40751-inventory" (OuterVolumeSpecName: "inventory") pod "e9abd3a1-8135-495d-b4f0-3f3569c40751" (UID: "e9abd3a1-8135-495d-b4f0-3f3569c40751"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:49:53 crc kubenswrapper[4779]: I0320 15:49:53.312714 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9abd3a1-8135-495d-b4f0-3f3569c40751-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e9abd3a1-8135-495d-b4f0-3f3569c40751" (UID: "e9abd3a1-8135-495d-b4f0-3f3569c40751"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:49:53 crc kubenswrapper[4779]: I0320 15:49:53.373922 4779 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9abd3a1-8135-495d-b4f0-3f3569c40751-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:53 crc kubenswrapper[4779]: I0320 15:49:53.373969 4779 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9abd3a1-8135-495d-b4f0-3f3569c40751-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:53 crc kubenswrapper[4779]: I0320 15:49:53.373984 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njp47\" (UniqueName: \"kubernetes.io/projected/e9abd3a1-8135-495d-b4f0-3f3569c40751-kube-api-access-njp47\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:53 crc kubenswrapper[4779]: I0320 15:49:53.374023 4779 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9abd3a1-8135-495d-b4f0-3f3569c40751-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:53 crc kubenswrapper[4779]: I0320 15:49:53.738401 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62rks" event={"ID":"e9abd3a1-8135-495d-b4f0-3f3569c40751","Type":"ContainerDied","Data":"69523f89a0e8d68ffb50bb8787c6b40fa7d6bf9e09c5e6f6b950a1de458cc2af"} Mar 20 15:49:53 crc kubenswrapper[4779]: I0320 15:49:53.738448 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69523f89a0e8d68ffb50bb8787c6b40fa7d6bf9e09c5e6f6b950a1de458cc2af" Mar 20 15:49:53 crc kubenswrapper[4779]: I0320 15:49:53.738464 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62rks" Mar 20 15:49:53 crc kubenswrapper[4779]: I0320 15:49:53.807622 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-zlfwz"] Mar 20 15:49:53 crc kubenswrapper[4779]: E0320 15:49:53.808025 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9abd3a1-8135-495d-b4f0-3f3569c40751" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 20 15:49:53 crc kubenswrapper[4779]: I0320 15:49:53.808040 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9abd3a1-8135-495d-b4f0-3f3569c40751" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 20 15:49:53 crc kubenswrapper[4779]: I0320 15:49:53.808222 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9abd3a1-8135-495d-b4f0-3f3569c40751" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 20 15:49:53 crc kubenswrapper[4779]: I0320 15:49:53.815561 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zlfwz" Mar 20 15:49:53 crc kubenswrapper[4779]: I0320 15:49:53.823054 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 15:49:53 crc kubenswrapper[4779]: I0320 15:49:53.823131 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 15:49:53 crc kubenswrapper[4779]: I0320 15:49:53.823185 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r7sbf" Mar 20 15:49:53 crc kubenswrapper[4779]: I0320 15:49:53.823138 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 15:49:53 crc kubenswrapper[4779]: I0320 15:49:53.833441 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-zlfwz"] Mar 20 15:49:53 crc kubenswrapper[4779]: I0320 15:49:53.883123 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c6abe891-2e21-41f3-a2eb-738a62807090-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zlfwz\" (UID: \"c6abe891-2e21-41f3-a2eb-738a62807090\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zlfwz" Mar 20 15:49:53 crc kubenswrapper[4779]: I0320 15:49:53.883262 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6abe891-2e21-41f3-a2eb-738a62807090-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zlfwz\" (UID: \"c6abe891-2e21-41f3-a2eb-738a62807090\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zlfwz" Mar 20 15:49:53 crc kubenswrapper[4779]: I0320 15:49:53.883293 4779 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhx8x\" (UniqueName: \"kubernetes.io/projected/c6abe891-2e21-41f3-a2eb-738a62807090-kube-api-access-mhx8x\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zlfwz\" (UID: \"c6abe891-2e21-41f3-a2eb-738a62807090\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zlfwz" Mar 20 15:49:53 crc kubenswrapper[4779]: I0320 15:49:53.985828 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c6abe891-2e21-41f3-a2eb-738a62807090-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zlfwz\" (UID: \"c6abe891-2e21-41f3-a2eb-738a62807090\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zlfwz" Mar 20 15:49:53 crc kubenswrapper[4779]: I0320 15:49:53.985924 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6abe891-2e21-41f3-a2eb-738a62807090-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zlfwz\" (UID: \"c6abe891-2e21-41f3-a2eb-738a62807090\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zlfwz" Mar 20 15:49:53 crc kubenswrapper[4779]: I0320 15:49:53.985956 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhx8x\" (UniqueName: \"kubernetes.io/projected/c6abe891-2e21-41f3-a2eb-738a62807090-kube-api-access-mhx8x\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zlfwz\" (UID: \"c6abe891-2e21-41f3-a2eb-738a62807090\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zlfwz" Mar 20 15:49:53 crc kubenswrapper[4779]: I0320 15:49:53.990509 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c6abe891-2e21-41f3-a2eb-738a62807090-ssh-key-openstack-edpm-ipam\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-zlfwz\" (UID: \"c6abe891-2e21-41f3-a2eb-738a62807090\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zlfwz" Mar 20 15:49:53 crc kubenswrapper[4779]: I0320 15:49:53.990579 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6abe891-2e21-41f3-a2eb-738a62807090-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zlfwz\" (UID: \"c6abe891-2e21-41f3-a2eb-738a62807090\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zlfwz" Mar 20 15:49:54 crc kubenswrapper[4779]: I0320 15:49:54.003082 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhx8x\" (UniqueName: \"kubernetes.io/projected/c6abe891-2e21-41f3-a2eb-738a62807090-kube-api-access-mhx8x\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zlfwz\" (UID: \"c6abe891-2e21-41f3-a2eb-738a62807090\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zlfwz" Mar 20 15:49:54 crc kubenswrapper[4779]: I0320 15:49:54.135622 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zlfwz" Mar 20 15:49:54 crc kubenswrapper[4779]: I0320 15:49:54.654524 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-zlfwz"] Mar 20 15:49:54 crc kubenswrapper[4779]: I0320 15:49:54.748349 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zlfwz" event={"ID":"c6abe891-2e21-41f3-a2eb-738a62807090","Type":"ContainerStarted","Data":"279c1bd5882b7c71fa788a606800bd225acdcb5f2df4ece066d1e9e121b56f50"} Mar 20 15:49:55 crc kubenswrapper[4779]: I0320 15:49:55.149563 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:49:55 crc kubenswrapper[4779]: I0320 15:49:55.149920 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:49:55 crc kubenswrapper[4779]: I0320 15:49:55.758858 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zlfwz" event={"ID":"c6abe891-2e21-41f3-a2eb-738a62807090","Type":"ContainerStarted","Data":"cf3446155fd1fc5039da52b7575b6b927479624ec3a9aa60109f6a289c0edd2b"} Mar 20 15:49:55 crc kubenswrapper[4779]: I0320 15:49:55.781585 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zlfwz" podStartSLOduration=2.33391453 podStartE2EDuration="2.781565058s" podCreationTimestamp="2026-03-20 15:49:53 
+0000 UTC" firstStartedPulling="2026-03-20 15:49:54.658170726 +0000 UTC m=+1611.620686526" lastFinishedPulling="2026-03-20 15:49:55.105821244 +0000 UTC m=+1612.068337054" observedRunningTime="2026-03-20 15:49:55.772600641 +0000 UTC m=+1612.735116441" watchObservedRunningTime="2026-03-20 15:49:55.781565058 +0000 UTC m=+1612.744080858" Mar 20 15:49:56 crc kubenswrapper[4779]: I0320 15:49:56.650815 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hpnxv" Mar 20 15:49:56 crc kubenswrapper[4779]: I0320 15:49:56.651188 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hpnxv" Mar 20 15:49:56 crc kubenswrapper[4779]: I0320 15:49:56.695980 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hpnxv" Mar 20 15:49:56 crc kubenswrapper[4779]: I0320 15:49:56.702322 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:49:56 crc kubenswrapper[4779]: I0320 15:49:56.832657 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hpnxv" Mar 20 15:49:56 crc kubenswrapper[4779]: I0320 15:49:56.946936 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hpnxv"] Mar 20 15:49:57 crc kubenswrapper[4779]: I0320 15:49:57.012380 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 20 15:49:57 crc kubenswrapper[4779]: I0320 15:49:57.778551 4779 generic.go:334] "Generic (PLEG): container finished" podID="c6abe891-2e21-41f3-a2eb-738a62807090" containerID="cf3446155fd1fc5039da52b7575b6b927479624ec3a9aa60109f6a289c0edd2b" exitCode=0 Mar 20 15:49:57 crc kubenswrapper[4779]: I0320 15:49:57.778642 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zlfwz" event={"ID":"c6abe891-2e21-41f3-a2eb-738a62807090","Type":"ContainerDied","Data":"cf3446155fd1fc5039da52b7575b6b927479624ec3a9aa60109f6a289c0edd2b"} Mar 20 15:49:58 crc kubenswrapper[4779]: I0320 15:49:58.786386 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hpnxv" podUID="1da59552-78ce-4bd1-96ad-241704222736" containerName="registry-server" containerID="cri-o://e61018d5ec6c990007f032875265ea099fed7d0edfd6f2cb0a94a3da3907a1f1" gracePeriod=2 Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.261799 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zlfwz" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.389572 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hpnxv" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.408923 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhx8x\" (UniqueName: \"kubernetes.io/projected/c6abe891-2e21-41f3-a2eb-738a62807090-kube-api-access-mhx8x\") pod \"c6abe891-2e21-41f3-a2eb-738a62807090\" (UID: \"c6abe891-2e21-41f3-a2eb-738a62807090\") " Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.409024 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c6abe891-2e21-41f3-a2eb-738a62807090-ssh-key-openstack-edpm-ipam\") pod \"c6abe891-2e21-41f3-a2eb-738a62807090\" (UID: \"c6abe891-2e21-41f3-a2eb-738a62807090\") " Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.409089 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6abe891-2e21-41f3-a2eb-738a62807090-inventory\") pod 
\"c6abe891-2e21-41f3-a2eb-738a62807090\" (UID: \"c6abe891-2e21-41f3-a2eb-738a62807090\") " Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.416272 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6abe891-2e21-41f3-a2eb-738a62807090-kube-api-access-mhx8x" (OuterVolumeSpecName: "kube-api-access-mhx8x") pod "c6abe891-2e21-41f3-a2eb-738a62807090" (UID: "c6abe891-2e21-41f3-a2eb-738a62807090"). InnerVolumeSpecName "kube-api-access-mhx8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.442720 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6abe891-2e21-41f3-a2eb-738a62807090-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c6abe891-2e21-41f3-a2eb-738a62807090" (UID: "c6abe891-2e21-41f3-a2eb-738a62807090"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.443878 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6abe891-2e21-41f3-a2eb-738a62807090-inventory" (OuterVolumeSpecName: "inventory") pod "c6abe891-2e21-41f3-a2eb-738a62807090" (UID: "c6abe891-2e21-41f3-a2eb-738a62807090"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.510508 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cv7m2\" (UniqueName: \"kubernetes.io/projected/1da59552-78ce-4bd1-96ad-241704222736-kube-api-access-cv7m2\") pod \"1da59552-78ce-4bd1-96ad-241704222736\" (UID: \"1da59552-78ce-4bd1-96ad-241704222736\") " Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.510666 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1da59552-78ce-4bd1-96ad-241704222736-utilities\") pod \"1da59552-78ce-4bd1-96ad-241704222736\" (UID: \"1da59552-78ce-4bd1-96ad-241704222736\") " Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.510732 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1da59552-78ce-4bd1-96ad-241704222736-catalog-content\") pod \"1da59552-78ce-4bd1-96ad-241704222736\" (UID: \"1da59552-78ce-4bd1-96ad-241704222736\") " Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.511517 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1da59552-78ce-4bd1-96ad-241704222736-utilities" (OuterVolumeSpecName: "utilities") pod "1da59552-78ce-4bd1-96ad-241704222736" (UID: "1da59552-78ce-4bd1-96ad-241704222736"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.511547 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhx8x\" (UniqueName: \"kubernetes.io/projected/c6abe891-2e21-41f3-a2eb-738a62807090-kube-api-access-mhx8x\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.511564 4779 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c6abe891-2e21-41f3-a2eb-738a62807090-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.511575 4779 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6abe891-2e21-41f3-a2eb-738a62807090-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.513656 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1da59552-78ce-4bd1-96ad-241704222736-kube-api-access-cv7m2" (OuterVolumeSpecName: "kube-api-access-cv7m2") pod "1da59552-78ce-4bd1-96ad-241704222736" (UID: "1da59552-78ce-4bd1-96ad-241704222736"). InnerVolumeSpecName "kube-api-access-cv7m2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.563028 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1da59552-78ce-4bd1-96ad-241704222736-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1da59552-78ce-4bd1-96ad-241704222736" (UID: "1da59552-78ce-4bd1-96ad-241704222736"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.612872 4779 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1da59552-78ce-4bd1-96ad-241704222736-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.612908 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cv7m2\" (UniqueName: \"kubernetes.io/projected/1da59552-78ce-4bd1-96ad-241704222736-kube-api-access-cv7m2\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.612919 4779 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1da59552-78ce-4bd1-96ad-241704222736-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.795812 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zlfwz" event={"ID":"c6abe891-2e21-41f3-a2eb-738a62807090","Type":"ContainerDied","Data":"279c1bd5882b7c71fa788a606800bd225acdcb5f2df4ece066d1e9e121b56f50"} Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.795859 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="279c1bd5882b7c71fa788a606800bd225acdcb5f2df4ece066d1e9e121b56f50" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.795827 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zlfwz" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.798024 4779 generic.go:334] "Generic (PLEG): container finished" podID="1da59552-78ce-4bd1-96ad-241704222736" containerID="e61018d5ec6c990007f032875265ea099fed7d0edfd6f2cb0a94a3da3907a1f1" exitCode=0 Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.798069 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hpnxv" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.798083 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hpnxv" event={"ID":"1da59552-78ce-4bd1-96ad-241704222736","Type":"ContainerDied","Data":"e61018d5ec6c990007f032875265ea099fed7d0edfd6f2cb0a94a3da3907a1f1"} Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.798362 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hpnxv" event={"ID":"1da59552-78ce-4bd1-96ad-241704222736","Type":"ContainerDied","Data":"96370a92d7bb7724d65aaf2da7bbb62a1f08e2754c049b96b0c3d034b95b115b"} Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.798386 4779 scope.go:117] "RemoveContainer" containerID="e61018d5ec6c990007f032875265ea099fed7d0edfd6f2cb0a94a3da3907a1f1" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.826518 4779 scope.go:117] "RemoveContainer" containerID="39b215b7813f3107876e107fa821a59e4c5972c2c920554b03e566c2636e509a" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.867783 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hpnxv"] Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.874213 4779 scope.go:117] "RemoveContainer" containerID="613e1bdb76bc9b4ac936dbf2ecdbe02bff98a42f683741c77d18e259ed8fb33c" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.882886 4779 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-marketplace/community-operators-hpnxv"] Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.899846 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7kqb2"] Mar 20 15:49:59 crc kubenswrapper[4779]: E0320 15:49:59.900302 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1da59552-78ce-4bd1-96ad-241704222736" containerName="registry-server" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.900313 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da59552-78ce-4bd1-96ad-241704222736" containerName="registry-server" Mar 20 15:49:59 crc kubenswrapper[4779]: E0320 15:49:59.900339 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1da59552-78ce-4bd1-96ad-241704222736" containerName="extract-content" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.900346 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da59552-78ce-4bd1-96ad-241704222736" containerName="extract-content" Mar 20 15:49:59 crc kubenswrapper[4779]: E0320 15:49:59.900368 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6abe891-2e21-41f3-a2eb-738a62807090" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.900376 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6abe891-2e21-41f3-a2eb-738a62807090" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 20 15:49:59 crc kubenswrapper[4779]: E0320 15:49:59.900390 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1da59552-78ce-4bd1-96ad-241704222736" containerName="extract-utilities" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.900396 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da59552-78ce-4bd1-96ad-241704222736" containerName="extract-utilities" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.900591 4779 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="c6abe891-2e21-41f3-a2eb-738a62807090" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.900600 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="1da59552-78ce-4bd1-96ad-241704222736" containerName="registry-server" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.901430 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7kqb2" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.901485 4779 scope.go:117] "RemoveContainer" containerID="e61018d5ec6c990007f032875265ea099fed7d0edfd6f2cb0a94a3da3907a1f1" Mar 20 15:49:59 crc kubenswrapper[4779]: E0320 15:49:59.902267 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e61018d5ec6c990007f032875265ea099fed7d0edfd6f2cb0a94a3da3907a1f1\": container with ID starting with e61018d5ec6c990007f032875265ea099fed7d0edfd6f2cb0a94a3da3907a1f1 not found: ID does not exist" containerID="e61018d5ec6c990007f032875265ea099fed7d0edfd6f2cb0a94a3da3907a1f1" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.902303 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e61018d5ec6c990007f032875265ea099fed7d0edfd6f2cb0a94a3da3907a1f1"} err="failed to get container status \"e61018d5ec6c990007f032875265ea099fed7d0edfd6f2cb0a94a3da3907a1f1\": rpc error: code = NotFound desc = could not find container \"e61018d5ec6c990007f032875265ea099fed7d0edfd6f2cb0a94a3da3907a1f1\": container with ID starting with e61018d5ec6c990007f032875265ea099fed7d0edfd6f2cb0a94a3da3907a1f1 not found: ID does not exist" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.902330 4779 scope.go:117] "RemoveContainer" containerID="39b215b7813f3107876e107fa821a59e4c5972c2c920554b03e566c2636e509a" Mar 20 15:49:59 crc 
kubenswrapper[4779]: E0320 15:49:59.902684 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39b215b7813f3107876e107fa821a59e4c5972c2c920554b03e566c2636e509a\": container with ID starting with 39b215b7813f3107876e107fa821a59e4c5972c2c920554b03e566c2636e509a not found: ID does not exist" containerID="39b215b7813f3107876e107fa821a59e4c5972c2c920554b03e566c2636e509a" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.902735 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39b215b7813f3107876e107fa821a59e4c5972c2c920554b03e566c2636e509a"} err="failed to get container status \"39b215b7813f3107876e107fa821a59e4c5972c2c920554b03e566c2636e509a\": rpc error: code = NotFound desc = could not find container \"39b215b7813f3107876e107fa821a59e4c5972c2c920554b03e566c2636e509a\": container with ID starting with 39b215b7813f3107876e107fa821a59e4c5972c2c920554b03e566c2636e509a not found: ID does not exist" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.902752 4779 scope.go:117] "RemoveContainer" containerID="613e1bdb76bc9b4ac936dbf2ecdbe02bff98a42f683741c77d18e259ed8fb33c" Mar 20 15:49:59 crc kubenswrapper[4779]: E0320 15:49:59.903105 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"613e1bdb76bc9b4ac936dbf2ecdbe02bff98a42f683741c77d18e259ed8fb33c\": container with ID starting with 613e1bdb76bc9b4ac936dbf2ecdbe02bff98a42f683741c77d18e259ed8fb33c not found: ID does not exist" containerID="613e1bdb76bc9b4ac936dbf2ecdbe02bff98a42f683741c77d18e259ed8fb33c" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.903332 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"613e1bdb76bc9b4ac936dbf2ecdbe02bff98a42f683741c77d18e259ed8fb33c"} err="failed to get container status 
\"613e1bdb76bc9b4ac936dbf2ecdbe02bff98a42f683741c77d18e259ed8fb33c\": rpc error: code = NotFound desc = could not find container \"613e1bdb76bc9b4ac936dbf2ecdbe02bff98a42f683741c77d18e259ed8fb33c\": container with ID starting with 613e1bdb76bc9b4ac936dbf2ecdbe02bff98a42f683741c77d18e259ed8fb33c not found: ID does not exist" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.903861 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.904147 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.904400 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.904590 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r7sbf" Mar 20 15:49:59 crc kubenswrapper[4779]: I0320 15:49:59.907524 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7kqb2"] Mar 20 15:50:00 crc kubenswrapper[4779]: I0320 15:50:00.022798 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b4aee649-af38-4064-ba29-bf2837b4c652-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7kqb2\" (UID: \"b4aee649-af38-4064-ba29-bf2837b4c652\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7kqb2" Mar 20 15:50:00 crc kubenswrapper[4779]: I0320 15:50:00.023243 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b4aee649-af38-4064-ba29-bf2837b4c652-ssh-key-openstack-edpm-ipam\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-7kqb2\" (UID: \"b4aee649-af38-4064-ba29-bf2837b4c652\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7kqb2" Mar 20 15:50:00 crc kubenswrapper[4779]: I0320 15:50:00.023294 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx9bf\" (UniqueName: \"kubernetes.io/projected/b4aee649-af38-4064-ba29-bf2837b4c652-kube-api-access-bx9bf\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7kqb2\" (UID: \"b4aee649-af38-4064-ba29-bf2837b4c652\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7kqb2" Mar 20 15:50:00 crc kubenswrapper[4779]: I0320 15:50:00.023571 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4aee649-af38-4064-ba29-bf2837b4c652-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7kqb2\" (UID: \"b4aee649-af38-4064-ba29-bf2837b4c652\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7kqb2" Mar 20 15:50:00 crc kubenswrapper[4779]: I0320 15:50:00.125192 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4aee649-af38-4064-ba29-bf2837b4c652-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7kqb2\" (UID: \"b4aee649-af38-4064-ba29-bf2837b4c652\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7kqb2" Mar 20 15:50:00 crc kubenswrapper[4779]: I0320 15:50:00.125323 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b4aee649-af38-4064-ba29-bf2837b4c652-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7kqb2\" (UID: \"b4aee649-af38-4064-ba29-bf2837b4c652\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7kqb2" Mar 20 15:50:00 crc kubenswrapper[4779]: I0320 15:50:00.125382 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b4aee649-af38-4064-ba29-bf2837b4c652-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7kqb2\" (UID: \"b4aee649-af38-4064-ba29-bf2837b4c652\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7kqb2" Mar 20 15:50:00 crc kubenswrapper[4779]: I0320 15:50:00.125421 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx9bf\" (UniqueName: \"kubernetes.io/projected/b4aee649-af38-4064-ba29-bf2837b4c652-kube-api-access-bx9bf\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7kqb2\" (UID: \"b4aee649-af38-4064-ba29-bf2837b4c652\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7kqb2" Mar 20 15:50:00 crc kubenswrapper[4779]: I0320 15:50:00.129594 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b4aee649-af38-4064-ba29-bf2837b4c652-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7kqb2\" (UID: \"b4aee649-af38-4064-ba29-bf2837b4c652\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7kqb2" Mar 20 15:50:00 crc kubenswrapper[4779]: I0320 15:50:00.129651 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b4aee649-af38-4064-ba29-bf2837b4c652-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7kqb2\" (UID: \"b4aee649-af38-4064-ba29-bf2837b4c652\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7kqb2" Mar 20 15:50:00 crc kubenswrapper[4779]: I0320 15:50:00.130362 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4aee649-af38-4064-ba29-bf2837b4c652-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7kqb2\" (UID: \"b4aee649-af38-4064-ba29-bf2837b4c652\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7kqb2" Mar 20 15:50:00 crc kubenswrapper[4779]: I0320 15:50:00.135005 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567030-vbjsm"] Mar 20 15:50:00 crc kubenswrapper[4779]: I0320 15:50:00.136780 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567030-vbjsm" Mar 20 15:50:00 crc kubenswrapper[4779]: I0320 15:50:00.139598 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:50:00 crc kubenswrapper[4779]: I0320 15:50:00.139756 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:50:00 crc kubenswrapper[4779]: I0320 15:50:00.139758 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-92t9d" Mar 20 15:50:00 crc kubenswrapper[4779]: I0320 15:50:00.148626 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567030-vbjsm"] Mar 20 15:50:00 crc kubenswrapper[4779]: I0320 15:50:00.149638 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx9bf\" (UniqueName: \"kubernetes.io/projected/b4aee649-af38-4064-ba29-bf2837b4c652-kube-api-access-bx9bf\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7kqb2\" (UID: \"b4aee649-af38-4064-ba29-bf2837b4c652\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7kqb2" Mar 20 15:50:00 crc kubenswrapper[4779]: I0320 15:50:00.227403 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-v46dc\" (UniqueName: \"kubernetes.io/projected/153ea542-9d02-48ac-a53a-2fb66716e9a8-kube-api-access-v46dc\") pod \"auto-csr-approver-29567030-vbjsm\" (UID: \"153ea542-9d02-48ac-a53a-2fb66716e9a8\") " pod="openshift-infra/auto-csr-approver-29567030-vbjsm" Mar 20 15:50:00 crc kubenswrapper[4779]: I0320 15:50:00.258605 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7kqb2" Mar 20 15:50:00 crc kubenswrapper[4779]: I0320 15:50:00.329384 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v46dc\" (UniqueName: \"kubernetes.io/projected/153ea542-9d02-48ac-a53a-2fb66716e9a8-kube-api-access-v46dc\") pod \"auto-csr-approver-29567030-vbjsm\" (UID: \"153ea542-9d02-48ac-a53a-2fb66716e9a8\") " pod="openshift-infra/auto-csr-approver-29567030-vbjsm" Mar 20 15:50:00 crc kubenswrapper[4779]: I0320 15:50:00.345830 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v46dc\" (UniqueName: \"kubernetes.io/projected/153ea542-9d02-48ac-a53a-2fb66716e9a8-kube-api-access-v46dc\") pod \"auto-csr-approver-29567030-vbjsm\" (UID: \"153ea542-9d02-48ac-a53a-2fb66716e9a8\") " pod="openshift-infra/auto-csr-approver-29567030-vbjsm" Mar 20 15:50:00 crc kubenswrapper[4779]: I0320 15:50:00.550953 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567030-vbjsm" Mar 20 15:50:00 crc kubenswrapper[4779]: I0320 15:50:00.770869 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7kqb2"] Mar 20 15:50:00 crc kubenswrapper[4779]: W0320 15:50:00.780847 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4aee649_af38_4064_ba29_bf2837b4c652.slice/crio-1a5aa7b09fee91231cfd2a1753377860e6698a1b3e3e7fd921bd95be9db87871 WatchSource:0}: Error finding container 1a5aa7b09fee91231cfd2a1753377860e6698a1b3e3e7fd921bd95be9db87871: Status 404 returned error can't find the container with id 1a5aa7b09fee91231cfd2a1753377860e6698a1b3e3e7fd921bd95be9db87871 Mar 20 15:50:00 crc kubenswrapper[4779]: I0320 15:50:00.810489 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7kqb2" event={"ID":"b4aee649-af38-4064-ba29-bf2837b4c652","Type":"ContainerStarted","Data":"1a5aa7b09fee91231cfd2a1753377860e6698a1b3e3e7fd921bd95be9db87871"} Mar 20 15:50:00 crc kubenswrapper[4779]: I0320 15:50:00.988623 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567030-vbjsm"] Mar 20 15:50:01 crc kubenswrapper[4779]: I0320 15:50:01.819513 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1da59552-78ce-4bd1-96ad-241704222736" path="/var/lib/kubelet/pods/1da59552-78ce-4bd1-96ad-241704222736/volumes" Mar 20 15:50:01 crc kubenswrapper[4779]: I0320 15:50:01.822761 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567030-vbjsm" event={"ID":"153ea542-9d02-48ac-a53a-2fb66716e9a8","Type":"ContainerStarted","Data":"73938e39b2156f6e3ec1a08ca388520439b989728798c5c6fde1d479235e283d"} Mar 20 15:50:01 crc kubenswrapper[4779]: I0320 15:50:01.824331 4779 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7kqb2" event={"ID":"b4aee649-af38-4064-ba29-bf2837b4c652","Type":"ContainerStarted","Data":"faedbd71c8b1600c65d53773c10d04e2703f8f0530df6d90f26abc191b10857b"} Mar 20 15:50:01 crc kubenswrapper[4779]: I0320 15:50:01.841649 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7kqb2" podStartSLOduration=2.103233171 podStartE2EDuration="2.84163216s" podCreationTimestamp="2026-03-20 15:49:59 +0000 UTC" firstStartedPulling="2026-03-20 15:50:00.783956985 +0000 UTC m=+1617.746472785" lastFinishedPulling="2026-03-20 15:50:01.522355974 +0000 UTC m=+1618.484871774" observedRunningTime="2026-03-20 15:50:01.836957975 +0000 UTC m=+1618.799473775" watchObservedRunningTime="2026-03-20 15:50:01.84163216 +0000 UTC m=+1618.804147960" Mar 20 15:50:02 crc kubenswrapper[4779]: I0320 15:50:02.836946 4779 generic.go:334] "Generic (PLEG): container finished" podID="153ea542-9d02-48ac-a53a-2fb66716e9a8" containerID="f1b4871a28aa275fa3995a1a419e178a6ee58b083aa0f9d572b3bb2150f489d7" exitCode=0 Mar 20 15:50:02 crc kubenswrapper[4779]: I0320 15:50:02.837059 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567030-vbjsm" event={"ID":"153ea542-9d02-48ac-a53a-2fb66716e9a8","Type":"ContainerDied","Data":"f1b4871a28aa275fa3995a1a419e178a6ee58b083aa0f9d572b3bb2150f489d7"} Mar 20 15:50:04 crc kubenswrapper[4779]: I0320 15:50:04.258604 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567030-vbjsm" Mar 20 15:50:04 crc kubenswrapper[4779]: I0320 15:50:04.411190 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v46dc\" (UniqueName: \"kubernetes.io/projected/153ea542-9d02-48ac-a53a-2fb66716e9a8-kube-api-access-v46dc\") pod \"153ea542-9d02-48ac-a53a-2fb66716e9a8\" (UID: \"153ea542-9d02-48ac-a53a-2fb66716e9a8\") " Mar 20 15:50:04 crc kubenswrapper[4779]: I0320 15:50:04.416347 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/153ea542-9d02-48ac-a53a-2fb66716e9a8-kube-api-access-v46dc" (OuterVolumeSpecName: "kube-api-access-v46dc") pod "153ea542-9d02-48ac-a53a-2fb66716e9a8" (UID: "153ea542-9d02-48ac-a53a-2fb66716e9a8"). InnerVolumeSpecName "kube-api-access-v46dc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:50:04 crc kubenswrapper[4779]: I0320 15:50:04.513535 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v46dc\" (UniqueName: \"kubernetes.io/projected/153ea542-9d02-48ac-a53a-2fb66716e9a8-kube-api-access-v46dc\") on node \"crc\" DevicePath \"\"" Mar 20 15:50:04 crc kubenswrapper[4779]: I0320 15:50:04.856585 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567030-vbjsm" event={"ID":"153ea542-9d02-48ac-a53a-2fb66716e9a8","Type":"ContainerDied","Data":"73938e39b2156f6e3ec1a08ca388520439b989728798c5c6fde1d479235e283d"} Mar 20 15:50:04 crc kubenswrapper[4779]: I0320 15:50:04.856631 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73938e39b2156f6e3ec1a08ca388520439b989728798c5c6fde1d479235e283d" Mar 20 15:50:04 crc kubenswrapper[4779]: I0320 15:50:04.856694 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567030-vbjsm" Mar 20 15:50:05 crc kubenswrapper[4779]: I0320 15:50:05.336433 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567024-tqhpt"] Mar 20 15:50:05 crc kubenswrapper[4779]: I0320 15:50:05.347950 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567024-tqhpt"] Mar 20 15:50:05 crc kubenswrapper[4779]: I0320 15:50:05.821935 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2558371-6693-4c25-bb36-d1688f299d44" path="/var/lib/kubelet/pods/c2558371-6693-4c25-bb36-d1688f299d44/volumes" Mar 20 15:50:25 crc kubenswrapper[4779]: I0320 15:50:25.150176 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:50:25 crc kubenswrapper[4779]: I0320 15:50:25.150781 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:50:40 crc kubenswrapper[4779]: I0320 15:50:40.330367 4779 scope.go:117] "RemoveContainer" containerID="49ae739bd70117adbae916a5cab791713c3f3b366468b63a172e726189c84e44" Mar 20 15:50:40 crc kubenswrapper[4779]: I0320 15:50:40.359297 4779 scope.go:117] "RemoveContainer" containerID="bcef4752595fdef6bceac9367420fcfcfa7be7ceb5bae95350caf122f82a4121" Mar 20 15:50:40 crc kubenswrapper[4779]: I0320 15:50:40.396330 4779 scope.go:117] "RemoveContainer" containerID="1b6689e48f0ff3a3d5ec9f393bb9833ee3462e60f3f553a28aabe9ea0f5ca99a" Mar 20 15:50:40 crc 
kubenswrapper[4779]: I0320 15:50:40.427853 4779 scope.go:117] "RemoveContainer" containerID="bfb335566f42e0303c8f99f7c83d08913087b0fdfc66b0529a5c94bd4b1f70c6" Mar 20 15:50:40 crc kubenswrapper[4779]: I0320 15:50:40.489989 4779 scope.go:117] "RemoveContainer" containerID="c93c405ab70db47f0521cdb98a2a3492fe28b54768e6357098cd25942e865203" Mar 20 15:50:40 crc kubenswrapper[4779]: I0320 15:50:40.523586 4779 scope.go:117] "RemoveContainer" containerID="91c11bb54a8057aea0a9ab54904ee35bce3c21141f649fc9f72262c2d26d65b7" Mar 20 15:50:55 crc kubenswrapper[4779]: I0320 15:50:55.150044 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:50:55 crc kubenswrapper[4779]: I0320 15:50:55.150632 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:50:55 crc kubenswrapper[4779]: I0320 15:50:55.150679 4779 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" Mar 20 15:50:55 crc kubenswrapper[4779]: I0320 15:50:55.151501 4779 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3e74c77d1c87375df708c9fdb3dfd7572992fba6df21987d4baed94ab0fddd8f"} pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 15:50:55 crc kubenswrapper[4779]: I0320 15:50:55.151558 4779 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" containerID="cri-o://3e74c77d1c87375df708c9fdb3dfd7572992fba6df21987d4baed94ab0fddd8f" gracePeriod=600 Mar 20 15:50:55 crc kubenswrapper[4779]: E0320 15:50:55.271919 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 15:50:55 crc kubenswrapper[4779]: I0320 15:50:55.307232 4779 generic.go:334] "Generic (PLEG): container finished" podID="451fc579-db57-4b36-a775-6d2986de3efc" containerID="3e74c77d1c87375df708c9fdb3dfd7572992fba6df21987d4baed94ab0fddd8f" exitCode=0 Mar 20 15:50:55 crc kubenswrapper[4779]: I0320 15:50:55.307342 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" event={"ID":"451fc579-db57-4b36-a775-6d2986de3efc","Type":"ContainerDied","Data":"3e74c77d1c87375df708c9fdb3dfd7572992fba6df21987d4baed94ab0fddd8f"} Mar 20 15:50:55 crc kubenswrapper[4779]: I0320 15:50:55.307906 4779 scope.go:117] "RemoveContainer" containerID="7b4abfdbc7440efebe366f165047da802f1778ef342a7861f64638e51f96112d" Mar 20 15:50:55 crc kubenswrapper[4779]: I0320 15:50:55.309226 4779 scope.go:117] "RemoveContainer" containerID="3e74c77d1c87375df708c9fdb3dfd7572992fba6df21987d4baed94ab0fddd8f" Mar 20 15:50:55 crc kubenswrapper[4779]: E0320 15:50:55.309751 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 15:51:06 crc kubenswrapper[4779]: I0320 15:51:06.798459 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kps6b"] Mar 20 15:51:06 crc kubenswrapper[4779]: E0320 15:51:06.799496 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="153ea542-9d02-48ac-a53a-2fb66716e9a8" containerName="oc" Mar 20 15:51:06 crc kubenswrapper[4779]: I0320 15:51:06.799511 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="153ea542-9d02-48ac-a53a-2fb66716e9a8" containerName="oc" Mar 20 15:51:06 crc kubenswrapper[4779]: I0320 15:51:06.799768 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="153ea542-9d02-48ac-a53a-2fb66716e9a8" containerName="oc" Mar 20 15:51:06 crc kubenswrapper[4779]: I0320 15:51:06.801528 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kps6b" Mar 20 15:51:06 crc kubenswrapper[4779]: I0320 15:51:06.807921 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kps6b"] Mar 20 15:51:06 crc kubenswrapper[4779]: I0320 15:51:06.931160 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4934ef50-c174-4e54-8bd2-74537c29d785-catalog-content\") pod \"certified-operators-kps6b\" (UID: \"4934ef50-c174-4e54-8bd2-74537c29d785\") " pod="openshift-marketplace/certified-operators-kps6b" Mar 20 15:51:06 crc kubenswrapper[4779]: I0320 15:51:06.931286 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4934ef50-c174-4e54-8bd2-74537c29d785-utilities\") pod \"certified-operators-kps6b\" (UID: \"4934ef50-c174-4e54-8bd2-74537c29d785\") " pod="openshift-marketplace/certified-operators-kps6b" Mar 20 15:51:06 crc kubenswrapper[4779]: I0320 15:51:06.931345 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pds6\" (UniqueName: \"kubernetes.io/projected/4934ef50-c174-4e54-8bd2-74537c29d785-kube-api-access-2pds6\") pod \"certified-operators-kps6b\" (UID: \"4934ef50-c174-4e54-8bd2-74537c29d785\") " pod="openshift-marketplace/certified-operators-kps6b" Mar 20 15:51:07 crc kubenswrapper[4779]: I0320 15:51:07.032952 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pds6\" (UniqueName: \"kubernetes.io/projected/4934ef50-c174-4e54-8bd2-74537c29d785-kube-api-access-2pds6\") pod \"certified-operators-kps6b\" (UID: \"4934ef50-c174-4e54-8bd2-74537c29d785\") " pod="openshift-marketplace/certified-operators-kps6b" Mar 20 15:51:07 crc kubenswrapper[4779]: I0320 15:51:07.033134 4779 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4934ef50-c174-4e54-8bd2-74537c29d785-catalog-content\") pod \"certified-operators-kps6b\" (UID: \"4934ef50-c174-4e54-8bd2-74537c29d785\") " pod="openshift-marketplace/certified-operators-kps6b" Mar 20 15:51:07 crc kubenswrapper[4779]: I0320 15:51:07.033202 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4934ef50-c174-4e54-8bd2-74537c29d785-utilities\") pod \"certified-operators-kps6b\" (UID: \"4934ef50-c174-4e54-8bd2-74537c29d785\") " pod="openshift-marketplace/certified-operators-kps6b" Mar 20 15:51:07 crc kubenswrapper[4779]: I0320 15:51:07.033800 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4934ef50-c174-4e54-8bd2-74537c29d785-utilities\") pod \"certified-operators-kps6b\" (UID: \"4934ef50-c174-4e54-8bd2-74537c29d785\") " pod="openshift-marketplace/certified-operators-kps6b" Mar 20 15:51:07 crc kubenswrapper[4779]: I0320 15:51:07.034049 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4934ef50-c174-4e54-8bd2-74537c29d785-catalog-content\") pod \"certified-operators-kps6b\" (UID: \"4934ef50-c174-4e54-8bd2-74537c29d785\") " pod="openshift-marketplace/certified-operators-kps6b" Mar 20 15:51:07 crc kubenswrapper[4779]: I0320 15:51:07.058083 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pds6\" (UniqueName: \"kubernetes.io/projected/4934ef50-c174-4e54-8bd2-74537c29d785-kube-api-access-2pds6\") pod \"certified-operators-kps6b\" (UID: \"4934ef50-c174-4e54-8bd2-74537c29d785\") " pod="openshift-marketplace/certified-operators-kps6b" Mar 20 15:51:07 crc kubenswrapper[4779]: I0320 15:51:07.122705 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kps6b" Mar 20 15:51:07 crc kubenswrapper[4779]: I0320 15:51:07.638905 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kps6b"] Mar 20 15:51:08 crc kubenswrapper[4779]: I0320 15:51:08.422867 4779 generic.go:334] "Generic (PLEG): container finished" podID="4934ef50-c174-4e54-8bd2-74537c29d785" containerID="b20eb099cd7cb1167f2b839afbcd6363992d8d0b02606e28b8bb03ac2df443b8" exitCode=0 Mar 20 15:51:08 crc kubenswrapper[4779]: I0320 15:51:08.422967 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kps6b" event={"ID":"4934ef50-c174-4e54-8bd2-74537c29d785","Type":"ContainerDied","Data":"b20eb099cd7cb1167f2b839afbcd6363992d8d0b02606e28b8bb03ac2df443b8"} Mar 20 15:51:08 crc kubenswrapper[4779]: I0320 15:51:08.423185 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kps6b" event={"ID":"4934ef50-c174-4e54-8bd2-74537c29d785","Type":"ContainerStarted","Data":"be4962784af53fbe0ec7df80cc8e06a6cf5693da83caec9f5e4e820b4dee29ee"} Mar 20 15:51:08 crc kubenswrapper[4779]: I0320 15:51:08.808402 4779 scope.go:117] "RemoveContainer" containerID="3e74c77d1c87375df708c9fdb3dfd7572992fba6df21987d4baed94ab0fddd8f" Mar 20 15:51:08 crc kubenswrapper[4779]: E0320 15:51:08.808950 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 15:51:09 crc kubenswrapper[4779]: I0320 15:51:09.434011 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kps6b" 
event={"ID":"4934ef50-c174-4e54-8bd2-74537c29d785","Type":"ContainerStarted","Data":"492cc44741a1f4b2382a16f023a458360609982a5d568a90e6f813d1e2db259c"} Mar 20 15:51:10 crc kubenswrapper[4779]: I0320 15:51:10.443249 4779 generic.go:334] "Generic (PLEG): container finished" podID="4934ef50-c174-4e54-8bd2-74537c29d785" containerID="492cc44741a1f4b2382a16f023a458360609982a5d568a90e6f813d1e2db259c" exitCode=0 Mar 20 15:51:10 crc kubenswrapper[4779]: I0320 15:51:10.443290 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kps6b" event={"ID":"4934ef50-c174-4e54-8bd2-74537c29d785","Type":"ContainerDied","Data":"492cc44741a1f4b2382a16f023a458360609982a5d568a90e6f813d1e2db259c"} Mar 20 15:51:11 crc kubenswrapper[4779]: I0320 15:51:11.454331 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kps6b" event={"ID":"4934ef50-c174-4e54-8bd2-74537c29d785","Type":"ContainerStarted","Data":"6182c9838270e358b3791b4777d9a260f880e7c1b553eb874da8b8e9fafc2f2c"} Mar 20 15:51:11 crc kubenswrapper[4779]: I0320 15:51:11.479989 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kps6b" podStartSLOduration=2.749360384 podStartE2EDuration="5.479971699s" podCreationTimestamp="2026-03-20 15:51:06 +0000 UTC" firstStartedPulling="2026-03-20 15:51:08.42453414 +0000 UTC m=+1685.387049940" lastFinishedPulling="2026-03-20 15:51:11.155145455 +0000 UTC m=+1688.117661255" observedRunningTime="2026-03-20 15:51:11.473237253 +0000 UTC m=+1688.435753043" watchObservedRunningTime="2026-03-20 15:51:11.479971699 +0000 UTC m=+1688.442487499" Mar 20 15:51:17 crc kubenswrapper[4779]: I0320 15:51:17.124389 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kps6b" Mar 20 15:51:17 crc kubenswrapper[4779]: I0320 15:51:17.124846 4779 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-kps6b" Mar 20 15:51:17 crc kubenswrapper[4779]: I0320 15:51:17.174374 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kps6b" Mar 20 15:51:17 crc kubenswrapper[4779]: I0320 15:51:17.552040 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kps6b" Mar 20 15:51:17 crc kubenswrapper[4779]: I0320 15:51:17.595428 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kps6b"] Mar 20 15:51:19 crc kubenswrapper[4779]: I0320 15:51:19.523873 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kps6b" podUID="4934ef50-c174-4e54-8bd2-74537c29d785" containerName="registry-server" containerID="cri-o://6182c9838270e358b3791b4777d9a260f880e7c1b553eb874da8b8e9fafc2f2c" gracePeriod=2 Mar 20 15:51:20 crc kubenswrapper[4779]: I0320 15:51:20.008797 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kps6b" Mar 20 15:51:20 crc kubenswrapper[4779]: I0320 15:51:20.085499 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4934ef50-c174-4e54-8bd2-74537c29d785-utilities\") pod \"4934ef50-c174-4e54-8bd2-74537c29d785\" (UID: \"4934ef50-c174-4e54-8bd2-74537c29d785\") " Mar 20 15:51:20 crc kubenswrapper[4779]: I0320 15:51:20.085704 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4934ef50-c174-4e54-8bd2-74537c29d785-catalog-content\") pod \"4934ef50-c174-4e54-8bd2-74537c29d785\" (UID: \"4934ef50-c174-4e54-8bd2-74537c29d785\") " Mar 20 15:51:20 crc kubenswrapper[4779]: I0320 15:51:20.085740 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pds6\" (UniqueName: \"kubernetes.io/projected/4934ef50-c174-4e54-8bd2-74537c29d785-kube-api-access-2pds6\") pod \"4934ef50-c174-4e54-8bd2-74537c29d785\" (UID: \"4934ef50-c174-4e54-8bd2-74537c29d785\") " Mar 20 15:51:20 crc kubenswrapper[4779]: I0320 15:51:20.086641 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4934ef50-c174-4e54-8bd2-74537c29d785-utilities" (OuterVolumeSpecName: "utilities") pod "4934ef50-c174-4e54-8bd2-74537c29d785" (UID: "4934ef50-c174-4e54-8bd2-74537c29d785"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:51:20 crc kubenswrapper[4779]: I0320 15:51:20.090839 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4934ef50-c174-4e54-8bd2-74537c29d785-kube-api-access-2pds6" (OuterVolumeSpecName: "kube-api-access-2pds6") pod "4934ef50-c174-4e54-8bd2-74537c29d785" (UID: "4934ef50-c174-4e54-8bd2-74537c29d785"). InnerVolumeSpecName "kube-api-access-2pds6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:51:20 crc kubenswrapper[4779]: I0320 15:51:20.138731 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4934ef50-c174-4e54-8bd2-74537c29d785-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4934ef50-c174-4e54-8bd2-74537c29d785" (UID: "4934ef50-c174-4e54-8bd2-74537c29d785"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:51:20 crc kubenswrapper[4779]: I0320 15:51:20.187698 4779 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4934ef50-c174-4e54-8bd2-74537c29d785-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 15:51:20 crc kubenswrapper[4779]: I0320 15:51:20.187733 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pds6\" (UniqueName: \"kubernetes.io/projected/4934ef50-c174-4e54-8bd2-74537c29d785-kube-api-access-2pds6\") on node \"crc\" DevicePath \"\"" Mar 20 15:51:20 crc kubenswrapper[4779]: I0320 15:51:20.187744 4779 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4934ef50-c174-4e54-8bd2-74537c29d785-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:51:20 crc kubenswrapper[4779]: I0320 15:51:20.533500 4779 generic.go:334] "Generic (PLEG): container finished" podID="4934ef50-c174-4e54-8bd2-74537c29d785" containerID="6182c9838270e358b3791b4777d9a260f880e7c1b553eb874da8b8e9fafc2f2c" exitCode=0 Mar 20 15:51:20 crc kubenswrapper[4779]: I0320 15:51:20.533541 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kps6b" event={"ID":"4934ef50-c174-4e54-8bd2-74537c29d785","Type":"ContainerDied","Data":"6182c9838270e358b3791b4777d9a260f880e7c1b553eb874da8b8e9fafc2f2c"} Mar 20 15:51:20 crc kubenswrapper[4779]: I0320 15:51:20.533577 4779 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-kps6b" event={"ID":"4934ef50-c174-4e54-8bd2-74537c29d785","Type":"ContainerDied","Data":"be4962784af53fbe0ec7df80cc8e06a6cf5693da83caec9f5e4e820b4dee29ee"} Mar 20 15:51:20 crc kubenswrapper[4779]: I0320 15:51:20.533577 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kps6b" Mar 20 15:51:20 crc kubenswrapper[4779]: I0320 15:51:20.533596 4779 scope.go:117] "RemoveContainer" containerID="6182c9838270e358b3791b4777d9a260f880e7c1b553eb874da8b8e9fafc2f2c" Mar 20 15:51:20 crc kubenswrapper[4779]: I0320 15:51:20.560098 4779 scope.go:117] "RemoveContainer" containerID="492cc44741a1f4b2382a16f023a458360609982a5d568a90e6f813d1e2db259c" Mar 20 15:51:20 crc kubenswrapper[4779]: I0320 15:51:20.573599 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kps6b"] Mar 20 15:51:20 crc kubenswrapper[4779]: I0320 15:51:20.585872 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kps6b"] Mar 20 15:51:20 crc kubenswrapper[4779]: I0320 15:51:20.589521 4779 scope.go:117] "RemoveContainer" containerID="b20eb099cd7cb1167f2b839afbcd6363992d8d0b02606e28b8bb03ac2df443b8" Mar 20 15:51:20 crc kubenswrapper[4779]: I0320 15:51:20.628740 4779 scope.go:117] "RemoveContainer" containerID="6182c9838270e358b3791b4777d9a260f880e7c1b553eb874da8b8e9fafc2f2c" Mar 20 15:51:20 crc kubenswrapper[4779]: E0320 15:51:20.629388 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6182c9838270e358b3791b4777d9a260f880e7c1b553eb874da8b8e9fafc2f2c\": container with ID starting with 6182c9838270e358b3791b4777d9a260f880e7c1b553eb874da8b8e9fafc2f2c not found: ID does not exist" containerID="6182c9838270e358b3791b4777d9a260f880e7c1b553eb874da8b8e9fafc2f2c" Mar 20 15:51:20 crc kubenswrapper[4779]: I0320 
15:51:20.629427 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6182c9838270e358b3791b4777d9a260f880e7c1b553eb874da8b8e9fafc2f2c"} err="failed to get container status \"6182c9838270e358b3791b4777d9a260f880e7c1b553eb874da8b8e9fafc2f2c\": rpc error: code = NotFound desc = could not find container \"6182c9838270e358b3791b4777d9a260f880e7c1b553eb874da8b8e9fafc2f2c\": container with ID starting with 6182c9838270e358b3791b4777d9a260f880e7c1b553eb874da8b8e9fafc2f2c not found: ID does not exist" Mar 20 15:51:20 crc kubenswrapper[4779]: I0320 15:51:20.629454 4779 scope.go:117] "RemoveContainer" containerID="492cc44741a1f4b2382a16f023a458360609982a5d568a90e6f813d1e2db259c" Mar 20 15:51:20 crc kubenswrapper[4779]: E0320 15:51:20.629717 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"492cc44741a1f4b2382a16f023a458360609982a5d568a90e6f813d1e2db259c\": container with ID starting with 492cc44741a1f4b2382a16f023a458360609982a5d568a90e6f813d1e2db259c not found: ID does not exist" containerID="492cc44741a1f4b2382a16f023a458360609982a5d568a90e6f813d1e2db259c" Mar 20 15:51:20 crc kubenswrapper[4779]: I0320 15:51:20.629750 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"492cc44741a1f4b2382a16f023a458360609982a5d568a90e6f813d1e2db259c"} err="failed to get container status \"492cc44741a1f4b2382a16f023a458360609982a5d568a90e6f813d1e2db259c\": rpc error: code = NotFound desc = could not find container \"492cc44741a1f4b2382a16f023a458360609982a5d568a90e6f813d1e2db259c\": container with ID starting with 492cc44741a1f4b2382a16f023a458360609982a5d568a90e6f813d1e2db259c not found: ID does not exist" Mar 20 15:51:20 crc kubenswrapper[4779]: I0320 15:51:20.629768 4779 scope.go:117] "RemoveContainer" containerID="b20eb099cd7cb1167f2b839afbcd6363992d8d0b02606e28b8bb03ac2df443b8" Mar 20 15:51:20 crc 
kubenswrapper[4779]: E0320 15:51:20.630375 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b20eb099cd7cb1167f2b839afbcd6363992d8d0b02606e28b8bb03ac2df443b8\": container with ID starting with b20eb099cd7cb1167f2b839afbcd6363992d8d0b02606e28b8bb03ac2df443b8 not found: ID does not exist" containerID="b20eb099cd7cb1167f2b839afbcd6363992d8d0b02606e28b8bb03ac2df443b8" Mar 20 15:51:20 crc kubenswrapper[4779]: I0320 15:51:20.630402 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b20eb099cd7cb1167f2b839afbcd6363992d8d0b02606e28b8bb03ac2df443b8"} err="failed to get container status \"b20eb099cd7cb1167f2b839afbcd6363992d8d0b02606e28b8bb03ac2df443b8\": rpc error: code = NotFound desc = could not find container \"b20eb099cd7cb1167f2b839afbcd6363992d8d0b02606e28b8bb03ac2df443b8\": container with ID starting with b20eb099cd7cb1167f2b839afbcd6363992d8d0b02606e28b8bb03ac2df443b8 not found: ID does not exist" Mar 20 15:51:20 crc kubenswrapper[4779]: I0320 15:51:20.808722 4779 scope.go:117] "RemoveContainer" containerID="3e74c77d1c87375df708c9fdb3dfd7572992fba6df21987d4baed94ab0fddd8f" Mar 20 15:51:20 crc kubenswrapper[4779]: E0320 15:51:20.809055 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 15:51:21 crc kubenswrapper[4779]: I0320 15:51:21.819871 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4934ef50-c174-4e54-8bd2-74537c29d785" path="/var/lib/kubelet/pods/4934ef50-c174-4e54-8bd2-74537c29d785/volumes" Mar 20 15:51:35 crc 
kubenswrapper[4779]: I0320 15:51:35.809012 4779 scope.go:117] "RemoveContainer" containerID="3e74c77d1c87375df708c9fdb3dfd7572992fba6df21987d4baed94ab0fddd8f" Mar 20 15:51:35 crc kubenswrapper[4779]: E0320 15:51:35.809899 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 15:51:37 crc kubenswrapper[4779]: I0320 15:51:37.650713 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hjfbk"] Mar 20 15:51:37 crc kubenswrapper[4779]: E0320 15:51:37.651424 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4934ef50-c174-4e54-8bd2-74537c29d785" containerName="extract-utilities" Mar 20 15:51:37 crc kubenswrapper[4779]: I0320 15:51:37.651438 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="4934ef50-c174-4e54-8bd2-74537c29d785" containerName="extract-utilities" Mar 20 15:51:37 crc kubenswrapper[4779]: E0320 15:51:37.651449 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4934ef50-c174-4e54-8bd2-74537c29d785" containerName="extract-content" Mar 20 15:51:37 crc kubenswrapper[4779]: I0320 15:51:37.651455 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="4934ef50-c174-4e54-8bd2-74537c29d785" containerName="extract-content" Mar 20 15:51:37 crc kubenswrapper[4779]: E0320 15:51:37.651486 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4934ef50-c174-4e54-8bd2-74537c29d785" containerName="registry-server" Mar 20 15:51:37 crc kubenswrapper[4779]: I0320 15:51:37.651491 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="4934ef50-c174-4e54-8bd2-74537c29d785" 
containerName="registry-server" Mar 20 15:51:37 crc kubenswrapper[4779]: I0320 15:51:37.651665 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="4934ef50-c174-4e54-8bd2-74537c29d785" containerName="registry-server" Mar 20 15:51:37 crc kubenswrapper[4779]: I0320 15:51:37.653074 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hjfbk" Mar 20 15:51:37 crc kubenswrapper[4779]: I0320 15:51:37.665070 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hjfbk"] Mar 20 15:51:37 crc kubenswrapper[4779]: I0320 15:51:37.735598 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzrpf\" (UniqueName: \"kubernetes.io/projected/6089c443-1639-4b96-bc89-2f610ede383a-kube-api-access-jzrpf\") pod \"redhat-marketplace-hjfbk\" (UID: \"6089c443-1639-4b96-bc89-2f610ede383a\") " pod="openshift-marketplace/redhat-marketplace-hjfbk" Mar 20 15:51:37 crc kubenswrapper[4779]: I0320 15:51:37.735675 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6089c443-1639-4b96-bc89-2f610ede383a-catalog-content\") pod \"redhat-marketplace-hjfbk\" (UID: \"6089c443-1639-4b96-bc89-2f610ede383a\") " pod="openshift-marketplace/redhat-marketplace-hjfbk" Mar 20 15:51:37 crc kubenswrapper[4779]: I0320 15:51:37.735804 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6089c443-1639-4b96-bc89-2f610ede383a-utilities\") pod \"redhat-marketplace-hjfbk\" (UID: \"6089c443-1639-4b96-bc89-2f610ede383a\") " pod="openshift-marketplace/redhat-marketplace-hjfbk" Mar 20 15:51:37 crc kubenswrapper[4779]: I0320 15:51:37.838340 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/6089c443-1639-4b96-bc89-2f610ede383a-utilities\") pod \"redhat-marketplace-hjfbk\" (UID: \"6089c443-1639-4b96-bc89-2f610ede383a\") " pod="openshift-marketplace/redhat-marketplace-hjfbk" Mar 20 15:51:37 crc kubenswrapper[4779]: I0320 15:51:37.838502 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzrpf\" (UniqueName: \"kubernetes.io/projected/6089c443-1639-4b96-bc89-2f610ede383a-kube-api-access-jzrpf\") pod \"redhat-marketplace-hjfbk\" (UID: \"6089c443-1639-4b96-bc89-2f610ede383a\") " pod="openshift-marketplace/redhat-marketplace-hjfbk" Mar 20 15:51:37 crc kubenswrapper[4779]: I0320 15:51:37.838569 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6089c443-1639-4b96-bc89-2f610ede383a-catalog-content\") pod \"redhat-marketplace-hjfbk\" (UID: \"6089c443-1639-4b96-bc89-2f610ede383a\") " pod="openshift-marketplace/redhat-marketplace-hjfbk" Mar 20 15:51:37 crc kubenswrapper[4779]: I0320 15:51:37.839077 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6089c443-1639-4b96-bc89-2f610ede383a-catalog-content\") pod \"redhat-marketplace-hjfbk\" (UID: \"6089c443-1639-4b96-bc89-2f610ede383a\") " pod="openshift-marketplace/redhat-marketplace-hjfbk" Mar 20 15:51:37 crc kubenswrapper[4779]: I0320 15:51:37.839454 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6089c443-1639-4b96-bc89-2f610ede383a-utilities\") pod \"redhat-marketplace-hjfbk\" (UID: \"6089c443-1639-4b96-bc89-2f610ede383a\") " pod="openshift-marketplace/redhat-marketplace-hjfbk" Mar 20 15:51:37 crc kubenswrapper[4779]: I0320 15:51:37.858489 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzrpf\" (UniqueName: 
\"kubernetes.io/projected/6089c443-1639-4b96-bc89-2f610ede383a-kube-api-access-jzrpf\") pod \"redhat-marketplace-hjfbk\" (UID: \"6089c443-1639-4b96-bc89-2f610ede383a\") " pod="openshift-marketplace/redhat-marketplace-hjfbk" Mar 20 15:51:37 crc kubenswrapper[4779]: I0320 15:51:37.978701 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hjfbk" Mar 20 15:51:38 crc kubenswrapper[4779]: I0320 15:51:38.442724 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hjfbk"] Mar 20 15:51:38 crc kubenswrapper[4779]: I0320 15:51:38.825134 4779 generic.go:334] "Generic (PLEG): container finished" podID="6089c443-1639-4b96-bc89-2f610ede383a" containerID="7bb4c6edaa475220c2dd6b5920d823d0e2a2a62b852fefadd568da4bf462e2f9" exitCode=0 Mar 20 15:51:38 crc kubenswrapper[4779]: I0320 15:51:38.825175 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hjfbk" event={"ID":"6089c443-1639-4b96-bc89-2f610ede383a","Type":"ContainerDied","Data":"7bb4c6edaa475220c2dd6b5920d823d0e2a2a62b852fefadd568da4bf462e2f9"} Mar 20 15:51:38 crc kubenswrapper[4779]: I0320 15:51:38.825228 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hjfbk" event={"ID":"6089c443-1639-4b96-bc89-2f610ede383a","Type":"ContainerStarted","Data":"0fb79aac3c9701175ddc06ac2d4280dbf60257cfa2ccbb7c0b774b2bdab7689b"} Mar 20 15:51:40 crc kubenswrapper[4779]: I0320 15:51:40.745318 4779 scope.go:117] "RemoveContainer" containerID="9b3010172e23c0e2fb22547ed52cdeebe05c5bb0ddf175c2f0118ece00124f45" Mar 20 15:51:40 crc kubenswrapper[4779]: I0320 15:51:40.766218 4779 scope.go:117] "RemoveContainer" containerID="a85465ebf729ed3710c438a3eba2e7951d3eff1d01d3855caf6846980f2c8f14" Mar 20 15:51:40 crc kubenswrapper[4779]: I0320 15:51:40.823397 4779 scope.go:117] "RemoveContainer" 
containerID="3a416b6e3b56fa65fe127241ccfe0961d7b327d2cea7fac1af288fdec90ddba0" Mar 20 15:51:40 crc kubenswrapper[4779]: I0320 15:51:40.840403 4779 scope.go:117] "RemoveContainer" containerID="b9138cce36f647241a32181c1827909361881d304cb66a73b0568ffc81e78207" Mar 20 15:51:40 crc kubenswrapper[4779]: I0320 15:51:40.858794 4779 generic.go:334] "Generic (PLEG): container finished" podID="6089c443-1639-4b96-bc89-2f610ede383a" containerID="721ca040af4d805a553838912013d48608d2beb3ae743566b45bf82f485eebf3" exitCode=0 Mar 20 15:51:40 crc kubenswrapper[4779]: I0320 15:51:40.858911 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hjfbk" event={"ID":"6089c443-1639-4b96-bc89-2f610ede383a","Type":"ContainerDied","Data":"721ca040af4d805a553838912013d48608d2beb3ae743566b45bf82f485eebf3"} Mar 20 15:51:40 crc kubenswrapper[4779]: I0320 15:51:40.886160 4779 scope.go:117] "RemoveContainer" containerID="b7438c8703eed8876715f4b09a1a91fc80747e4a1d1da11173a1de34cd6894e0" Mar 20 15:51:40 crc kubenswrapper[4779]: I0320 15:51:40.910921 4779 scope.go:117] "RemoveContainer" containerID="256c73fa67ac5fd0452c6071ac8471ab9f080a3ab4ea4409a79d0c323ee82509" Mar 20 15:51:41 crc kubenswrapper[4779]: I0320 15:51:41.877397 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hjfbk" event={"ID":"6089c443-1639-4b96-bc89-2f610ede383a","Type":"ContainerStarted","Data":"8d761942c7aa39c03a38efc786db999484019e220f025b2b809332cfd415f693"} Mar 20 15:51:41 crc kubenswrapper[4779]: I0320 15:51:41.903515 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hjfbk" podStartSLOduration=2.31737227 podStartE2EDuration="4.903492518s" podCreationTimestamp="2026-03-20 15:51:37 +0000 UTC" firstStartedPulling="2026-03-20 15:51:38.827017578 +0000 UTC m=+1715.789533378" lastFinishedPulling="2026-03-20 15:51:41.413137826 +0000 UTC m=+1718.375653626" 
observedRunningTime="2026-03-20 15:51:41.89791762 +0000 UTC m=+1718.860433420" watchObservedRunningTime="2026-03-20 15:51:41.903492518 +0000 UTC m=+1718.866008348" Mar 20 15:51:46 crc kubenswrapper[4779]: I0320 15:51:46.808769 4779 scope.go:117] "RemoveContainer" containerID="3e74c77d1c87375df708c9fdb3dfd7572992fba6df21987d4baed94ab0fddd8f" Mar 20 15:51:46 crc kubenswrapper[4779]: E0320 15:51:46.809586 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 15:51:47 crc kubenswrapper[4779]: I0320 15:51:47.978918 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hjfbk" Mar 20 15:51:47 crc kubenswrapper[4779]: I0320 15:51:47.979406 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hjfbk" Mar 20 15:51:48 crc kubenswrapper[4779]: I0320 15:51:48.031342 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hjfbk" Mar 20 15:51:48 crc kubenswrapper[4779]: I0320 15:51:48.992958 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hjfbk" Mar 20 15:51:49 crc kubenswrapper[4779]: I0320 15:51:49.045982 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hjfbk"] Mar 20 15:51:50 crc kubenswrapper[4779]: I0320 15:51:50.961428 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hjfbk" 
podUID="6089c443-1639-4b96-bc89-2f610ede383a" containerName="registry-server" containerID="cri-o://8d761942c7aa39c03a38efc786db999484019e220f025b2b809332cfd415f693" gracePeriod=2 Mar 20 15:51:51 crc kubenswrapper[4779]: I0320 15:51:51.387677 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hjfbk" Mar 20 15:51:51 crc kubenswrapper[4779]: I0320 15:51:51.496929 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzrpf\" (UniqueName: \"kubernetes.io/projected/6089c443-1639-4b96-bc89-2f610ede383a-kube-api-access-jzrpf\") pod \"6089c443-1639-4b96-bc89-2f610ede383a\" (UID: \"6089c443-1639-4b96-bc89-2f610ede383a\") " Mar 20 15:51:51 crc kubenswrapper[4779]: I0320 15:51:51.497164 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6089c443-1639-4b96-bc89-2f610ede383a-utilities\") pod \"6089c443-1639-4b96-bc89-2f610ede383a\" (UID: \"6089c443-1639-4b96-bc89-2f610ede383a\") " Mar 20 15:51:51 crc kubenswrapper[4779]: I0320 15:51:51.497210 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6089c443-1639-4b96-bc89-2f610ede383a-catalog-content\") pod \"6089c443-1639-4b96-bc89-2f610ede383a\" (UID: \"6089c443-1639-4b96-bc89-2f610ede383a\") " Mar 20 15:51:51 crc kubenswrapper[4779]: I0320 15:51:51.500835 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6089c443-1639-4b96-bc89-2f610ede383a-utilities" (OuterVolumeSpecName: "utilities") pod "6089c443-1639-4b96-bc89-2f610ede383a" (UID: "6089c443-1639-4b96-bc89-2f610ede383a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:51:51 crc kubenswrapper[4779]: I0320 15:51:51.502632 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6089c443-1639-4b96-bc89-2f610ede383a-kube-api-access-jzrpf" (OuterVolumeSpecName: "kube-api-access-jzrpf") pod "6089c443-1639-4b96-bc89-2f610ede383a" (UID: "6089c443-1639-4b96-bc89-2f610ede383a"). InnerVolumeSpecName "kube-api-access-jzrpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:51:51 crc kubenswrapper[4779]: I0320 15:51:51.524898 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6089c443-1639-4b96-bc89-2f610ede383a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6089c443-1639-4b96-bc89-2f610ede383a" (UID: "6089c443-1639-4b96-bc89-2f610ede383a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:51:51 crc kubenswrapper[4779]: I0320 15:51:51.599295 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzrpf\" (UniqueName: \"kubernetes.io/projected/6089c443-1639-4b96-bc89-2f610ede383a-kube-api-access-jzrpf\") on node \"crc\" DevicePath \"\"" Mar 20 15:51:51 crc kubenswrapper[4779]: I0320 15:51:51.599328 4779 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6089c443-1639-4b96-bc89-2f610ede383a-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:51:51 crc kubenswrapper[4779]: I0320 15:51:51.599337 4779 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6089c443-1639-4b96-bc89-2f610ede383a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 15:51:51 crc kubenswrapper[4779]: I0320 15:51:51.986922 4779 generic.go:334] "Generic (PLEG): container finished" podID="6089c443-1639-4b96-bc89-2f610ede383a" 
containerID="8d761942c7aa39c03a38efc786db999484019e220f025b2b809332cfd415f693" exitCode=0 Mar 20 15:51:51 crc kubenswrapper[4779]: I0320 15:51:51.987098 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hjfbk" event={"ID":"6089c443-1639-4b96-bc89-2f610ede383a","Type":"ContainerDied","Data":"8d761942c7aa39c03a38efc786db999484019e220f025b2b809332cfd415f693"} Mar 20 15:51:51 crc kubenswrapper[4779]: I0320 15:51:51.987275 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hjfbk" event={"ID":"6089c443-1639-4b96-bc89-2f610ede383a","Type":"ContainerDied","Data":"0fb79aac3c9701175ddc06ac2d4280dbf60257cfa2ccbb7c0b774b2bdab7689b"} Mar 20 15:51:51 crc kubenswrapper[4779]: I0320 15:51:51.987292 4779 scope.go:117] "RemoveContainer" containerID="8d761942c7aa39c03a38efc786db999484019e220f025b2b809332cfd415f693" Mar 20 15:51:51 crc kubenswrapper[4779]: I0320 15:51:51.987193 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hjfbk" Mar 20 15:51:52 crc kubenswrapper[4779]: I0320 15:51:52.022821 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hjfbk"] Mar 20 15:51:52 crc kubenswrapper[4779]: I0320 15:51:52.022912 4779 scope.go:117] "RemoveContainer" containerID="721ca040af4d805a553838912013d48608d2beb3ae743566b45bf82f485eebf3" Mar 20 15:51:52 crc kubenswrapper[4779]: I0320 15:51:52.030628 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hjfbk"] Mar 20 15:51:52 crc kubenswrapper[4779]: I0320 15:51:52.055646 4779 scope.go:117] "RemoveContainer" containerID="7bb4c6edaa475220c2dd6b5920d823d0e2a2a62b852fefadd568da4bf462e2f9" Mar 20 15:51:52 crc kubenswrapper[4779]: I0320 15:51:52.102659 4779 scope.go:117] "RemoveContainer" containerID="8d761942c7aa39c03a38efc786db999484019e220f025b2b809332cfd415f693" Mar 20 15:51:52 crc kubenswrapper[4779]: E0320 15:51:52.110598 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d761942c7aa39c03a38efc786db999484019e220f025b2b809332cfd415f693\": container with ID starting with 8d761942c7aa39c03a38efc786db999484019e220f025b2b809332cfd415f693 not found: ID does not exist" containerID="8d761942c7aa39c03a38efc786db999484019e220f025b2b809332cfd415f693" Mar 20 15:51:52 crc kubenswrapper[4779]: I0320 15:51:52.110652 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d761942c7aa39c03a38efc786db999484019e220f025b2b809332cfd415f693"} err="failed to get container status \"8d761942c7aa39c03a38efc786db999484019e220f025b2b809332cfd415f693\": rpc error: code = NotFound desc = could not find container \"8d761942c7aa39c03a38efc786db999484019e220f025b2b809332cfd415f693\": container with ID starting with 8d761942c7aa39c03a38efc786db999484019e220f025b2b809332cfd415f693 not found: 
ID does not exist" Mar 20 15:51:52 crc kubenswrapper[4779]: I0320 15:51:52.110683 4779 scope.go:117] "RemoveContainer" containerID="721ca040af4d805a553838912013d48608d2beb3ae743566b45bf82f485eebf3" Mar 20 15:51:52 crc kubenswrapper[4779]: E0320 15:51:52.111159 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"721ca040af4d805a553838912013d48608d2beb3ae743566b45bf82f485eebf3\": container with ID starting with 721ca040af4d805a553838912013d48608d2beb3ae743566b45bf82f485eebf3 not found: ID does not exist" containerID="721ca040af4d805a553838912013d48608d2beb3ae743566b45bf82f485eebf3" Mar 20 15:51:52 crc kubenswrapper[4779]: I0320 15:51:52.111208 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"721ca040af4d805a553838912013d48608d2beb3ae743566b45bf82f485eebf3"} err="failed to get container status \"721ca040af4d805a553838912013d48608d2beb3ae743566b45bf82f485eebf3\": rpc error: code = NotFound desc = could not find container \"721ca040af4d805a553838912013d48608d2beb3ae743566b45bf82f485eebf3\": container with ID starting with 721ca040af4d805a553838912013d48608d2beb3ae743566b45bf82f485eebf3 not found: ID does not exist" Mar 20 15:51:52 crc kubenswrapper[4779]: I0320 15:51:52.111242 4779 scope.go:117] "RemoveContainer" containerID="7bb4c6edaa475220c2dd6b5920d823d0e2a2a62b852fefadd568da4bf462e2f9" Mar 20 15:51:52 crc kubenswrapper[4779]: E0320 15:51:52.111525 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bb4c6edaa475220c2dd6b5920d823d0e2a2a62b852fefadd568da4bf462e2f9\": container with ID starting with 7bb4c6edaa475220c2dd6b5920d823d0e2a2a62b852fefadd568da4bf462e2f9 not found: ID does not exist" containerID="7bb4c6edaa475220c2dd6b5920d823d0e2a2a62b852fefadd568da4bf462e2f9" Mar 20 15:51:52 crc kubenswrapper[4779]: I0320 15:51:52.111557 4779 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bb4c6edaa475220c2dd6b5920d823d0e2a2a62b852fefadd568da4bf462e2f9"} err="failed to get container status \"7bb4c6edaa475220c2dd6b5920d823d0e2a2a62b852fefadd568da4bf462e2f9\": rpc error: code = NotFound desc = could not find container \"7bb4c6edaa475220c2dd6b5920d823d0e2a2a62b852fefadd568da4bf462e2f9\": container with ID starting with 7bb4c6edaa475220c2dd6b5920d823d0e2a2a62b852fefadd568da4bf462e2f9 not found: ID does not exist" Mar 20 15:51:53 crc kubenswrapper[4779]: I0320 15:51:53.822486 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6089c443-1639-4b96-bc89-2f610ede383a" path="/var/lib/kubelet/pods/6089c443-1639-4b96-bc89-2f610ede383a/volumes" Mar 20 15:52:00 crc kubenswrapper[4779]: I0320 15:52:00.142590 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567032-md578"] Mar 20 15:52:00 crc kubenswrapper[4779]: E0320 15:52:00.143640 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6089c443-1639-4b96-bc89-2f610ede383a" containerName="extract-content" Mar 20 15:52:00 crc kubenswrapper[4779]: I0320 15:52:00.143654 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="6089c443-1639-4b96-bc89-2f610ede383a" containerName="extract-content" Mar 20 15:52:00 crc kubenswrapper[4779]: E0320 15:52:00.143669 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6089c443-1639-4b96-bc89-2f610ede383a" containerName="extract-utilities" Mar 20 15:52:00 crc kubenswrapper[4779]: I0320 15:52:00.143675 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="6089c443-1639-4b96-bc89-2f610ede383a" containerName="extract-utilities" Mar 20 15:52:00 crc kubenswrapper[4779]: E0320 15:52:00.143709 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6089c443-1639-4b96-bc89-2f610ede383a" containerName="registry-server" Mar 20 15:52:00 crc kubenswrapper[4779]: I0320 15:52:00.143715 4779 
state_mem.go:107] "Deleted CPUSet assignment" podUID="6089c443-1639-4b96-bc89-2f610ede383a" containerName="registry-server" Mar 20 15:52:00 crc kubenswrapper[4779]: I0320 15:52:00.143882 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="6089c443-1639-4b96-bc89-2f610ede383a" containerName="registry-server" Mar 20 15:52:00 crc kubenswrapper[4779]: I0320 15:52:00.144642 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567032-md578" Mar 20 15:52:00 crc kubenswrapper[4779]: I0320 15:52:00.146948 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-92t9d" Mar 20 15:52:00 crc kubenswrapper[4779]: I0320 15:52:00.147186 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:52:00 crc kubenswrapper[4779]: I0320 15:52:00.147330 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:52:00 crc kubenswrapper[4779]: I0320 15:52:00.185448 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567032-md578"] Mar 20 15:52:00 crc kubenswrapper[4779]: I0320 15:52:00.261304 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6qwt\" (UniqueName: \"kubernetes.io/projected/4eeef9a4-9be4-4067-a699-d2503ce4acc7-kube-api-access-r6qwt\") pod \"auto-csr-approver-29567032-md578\" (UID: \"4eeef9a4-9be4-4067-a699-d2503ce4acc7\") " pod="openshift-infra/auto-csr-approver-29567032-md578" Mar 20 15:52:00 crc kubenswrapper[4779]: I0320 15:52:00.363447 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6qwt\" (UniqueName: \"kubernetes.io/projected/4eeef9a4-9be4-4067-a699-d2503ce4acc7-kube-api-access-r6qwt\") pod \"auto-csr-approver-29567032-md578\" (UID: 
\"4eeef9a4-9be4-4067-a699-d2503ce4acc7\") " pod="openshift-infra/auto-csr-approver-29567032-md578" Mar 20 15:52:00 crc kubenswrapper[4779]: I0320 15:52:00.389035 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6qwt\" (UniqueName: \"kubernetes.io/projected/4eeef9a4-9be4-4067-a699-d2503ce4acc7-kube-api-access-r6qwt\") pod \"auto-csr-approver-29567032-md578\" (UID: \"4eeef9a4-9be4-4067-a699-d2503ce4acc7\") " pod="openshift-infra/auto-csr-approver-29567032-md578" Mar 20 15:52:00 crc kubenswrapper[4779]: I0320 15:52:00.497095 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567032-md578" Mar 20 15:52:00 crc kubenswrapper[4779]: I0320 15:52:00.939703 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567032-md578"] Mar 20 15:52:01 crc kubenswrapper[4779]: I0320 15:52:01.088252 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567032-md578" event={"ID":"4eeef9a4-9be4-4067-a699-d2503ce4acc7","Type":"ContainerStarted","Data":"01da4827d84e5abd55513ec47cdbe8e0012607f53bc62e3f1dbee68f681001bc"} Mar 20 15:52:01 crc kubenswrapper[4779]: I0320 15:52:01.808737 4779 scope.go:117] "RemoveContainer" containerID="3e74c77d1c87375df708c9fdb3dfd7572992fba6df21987d4baed94ab0fddd8f" Mar 20 15:52:01 crc kubenswrapper[4779]: E0320 15:52:01.809098 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 15:52:03 crc kubenswrapper[4779]: I0320 15:52:03.104139 4779 generic.go:334] "Generic (PLEG): container finished" 
podID="4eeef9a4-9be4-4067-a699-d2503ce4acc7" containerID="b08b706ad2c830ce0fc817f6743fe7bb019e2bd157e4d82791a91a1279f9fb99" exitCode=0 Mar 20 15:52:03 crc kubenswrapper[4779]: I0320 15:52:03.104255 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567032-md578" event={"ID":"4eeef9a4-9be4-4067-a699-d2503ce4acc7","Type":"ContainerDied","Data":"b08b706ad2c830ce0fc817f6743fe7bb019e2bd157e4d82791a91a1279f9fb99"} Mar 20 15:52:04 crc kubenswrapper[4779]: I0320 15:52:04.412591 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567032-md578" Mar 20 15:52:04 crc kubenswrapper[4779]: I0320 15:52:04.572473 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6qwt\" (UniqueName: \"kubernetes.io/projected/4eeef9a4-9be4-4067-a699-d2503ce4acc7-kube-api-access-r6qwt\") pod \"4eeef9a4-9be4-4067-a699-d2503ce4acc7\" (UID: \"4eeef9a4-9be4-4067-a699-d2503ce4acc7\") " Mar 20 15:52:04 crc kubenswrapper[4779]: I0320 15:52:04.577605 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eeef9a4-9be4-4067-a699-d2503ce4acc7-kube-api-access-r6qwt" (OuterVolumeSpecName: "kube-api-access-r6qwt") pod "4eeef9a4-9be4-4067-a699-d2503ce4acc7" (UID: "4eeef9a4-9be4-4067-a699-d2503ce4acc7"). InnerVolumeSpecName "kube-api-access-r6qwt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:52:04 crc kubenswrapper[4779]: I0320 15:52:04.674726 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6qwt\" (UniqueName: \"kubernetes.io/projected/4eeef9a4-9be4-4067-a699-d2503ce4acc7-kube-api-access-r6qwt\") on node \"crc\" DevicePath \"\"" Mar 20 15:52:05 crc kubenswrapper[4779]: I0320 15:52:05.122969 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567032-md578" event={"ID":"4eeef9a4-9be4-4067-a699-d2503ce4acc7","Type":"ContainerDied","Data":"01da4827d84e5abd55513ec47cdbe8e0012607f53bc62e3f1dbee68f681001bc"} Mar 20 15:52:05 crc kubenswrapper[4779]: I0320 15:52:05.123270 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01da4827d84e5abd55513ec47cdbe8e0012607f53bc62e3f1dbee68f681001bc" Mar 20 15:52:05 crc kubenswrapper[4779]: I0320 15:52:05.123030 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567032-md578" Mar 20 15:52:05 crc kubenswrapper[4779]: I0320 15:52:05.476909 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567026-95m4d"] Mar 20 15:52:05 crc kubenswrapper[4779]: I0320 15:52:05.489328 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567026-95m4d"] Mar 20 15:52:05 crc kubenswrapper[4779]: I0320 15:52:05.819385 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05082fa7-f09c-4834-a745-aae20aad01c2" path="/var/lib/kubelet/pods/05082fa7-f09c-4834-a745-aae20aad01c2/volumes" Mar 20 15:52:13 crc kubenswrapper[4779]: I0320 15:52:13.823590 4779 scope.go:117] "RemoveContainer" containerID="3e74c77d1c87375df708c9fdb3dfd7572992fba6df21987d4baed94ab0fddd8f" Mar 20 15:52:13 crc kubenswrapper[4779]: E0320 15:52:13.824815 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 15:52:24 crc kubenswrapper[4779]: I0320 15:52:24.809098 4779 scope.go:117] "RemoveContainer" containerID="3e74c77d1c87375df708c9fdb3dfd7572992fba6df21987d4baed94ab0fddd8f" Mar 20 15:52:24 crc kubenswrapper[4779]: E0320 15:52:24.810568 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 15:52:37 crc kubenswrapper[4779]: I0320 15:52:37.809721 4779 scope.go:117] "RemoveContainer" containerID="3e74c77d1c87375df708c9fdb3dfd7572992fba6df21987d4baed94ab0fddd8f" Mar 20 15:52:37 crc kubenswrapper[4779]: E0320 15:52:37.810542 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 15:52:40 crc kubenswrapper[4779]: I0320 15:52:40.999515 4779 scope.go:117] "RemoveContainer" containerID="0373be9754c9e92f88604eccd132d95ef493ab8316875ca38271501b25050b55" Mar 20 15:52:41 crc kubenswrapper[4779]: I0320 15:52:41.031387 4779 scope.go:117] "RemoveContainer" 
containerID="3c989f1298a779f1a951476723cd69c30850eb7c6bf3b8650d83df1962657960" Mar 20 15:52:41 crc kubenswrapper[4779]: I0320 15:52:41.083212 4779 scope.go:117] "RemoveContainer" containerID="a0297bcf430d9ce16e4c26f3b558f8f63d3533d0e0b3c7fde55a9b7c06babdb3" Mar 20 15:52:41 crc kubenswrapper[4779]: I0320 15:52:41.122393 4779 scope.go:117] "RemoveContainer" containerID="add76e1f005987e3456f39bbc59e62069e531c7bbae1999bb2f0d57a0ba6e43b" Mar 20 15:52:49 crc kubenswrapper[4779]: I0320 15:52:49.809208 4779 scope.go:117] "RemoveContainer" containerID="3e74c77d1c87375df708c9fdb3dfd7572992fba6df21987d4baed94ab0fddd8f" Mar 20 15:52:49 crc kubenswrapper[4779]: E0320 15:52:49.810335 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 15:53:03 crc kubenswrapper[4779]: I0320 15:53:03.815022 4779 scope.go:117] "RemoveContainer" containerID="3e74c77d1c87375df708c9fdb3dfd7572992fba6df21987d4baed94ab0fddd8f" Mar 20 15:53:03 crc kubenswrapper[4779]: E0320 15:53:03.815880 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 15:53:18 crc kubenswrapper[4779]: I0320 15:53:18.808787 4779 scope.go:117] "RemoveContainer" containerID="3e74c77d1c87375df708c9fdb3dfd7572992fba6df21987d4baed94ab0fddd8f" Mar 20 15:53:18 crc 
kubenswrapper[4779]: E0320 15:53:18.809791 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 15:53:33 crc kubenswrapper[4779]: I0320 15:53:33.817063 4779 scope.go:117] "RemoveContainer" containerID="3e74c77d1c87375df708c9fdb3dfd7572992fba6df21987d4baed94ab0fddd8f" Mar 20 15:53:33 crc kubenswrapper[4779]: E0320 15:53:33.818061 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 15:53:37 crc kubenswrapper[4779]: I0320 15:53:37.988483 4779 generic.go:334] "Generic (PLEG): container finished" podID="b4aee649-af38-4064-ba29-bf2837b4c652" containerID="faedbd71c8b1600c65d53773c10d04e2703f8f0530df6d90f26abc191b10857b" exitCode=0 Mar 20 15:53:37 crc kubenswrapper[4779]: I0320 15:53:37.988569 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7kqb2" event={"ID":"b4aee649-af38-4064-ba29-bf2837b4c652","Type":"ContainerDied","Data":"faedbd71c8b1600c65d53773c10d04e2703f8f0530df6d90f26abc191b10857b"} Mar 20 15:53:39 crc kubenswrapper[4779]: I0320 15:53:39.392062 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7kqb2" Mar 20 15:53:39 crc kubenswrapper[4779]: I0320 15:53:39.476665 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b4aee649-af38-4064-ba29-bf2837b4c652-ssh-key-openstack-edpm-ipam\") pod \"b4aee649-af38-4064-ba29-bf2837b4c652\" (UID: \"b4aee649-af38-4064-ba29-bf2837b4c652\") " Mar 20 15:53:39 crc kubenswrapper[4779]: I0320 15:53:39.476728 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx9bf\" (UniqueName: \"kubernetes.io/projected/b4aee649-af38-4064-ba29-bf2837b4c652-kube-api-access-bx9bf\") pod \"b4aee649-af38-4064-ba29-bf2837b4c652\" (UID: \"b4aee649-af38-4064-ba29-bf2837b4c652\") " Mar 20 15:53:39 crc kubenswrapper[4779]: I0320 15:53:39.476810 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b4aee649-af38-4064-ba29-bf2837b4c652-inventory\") pod \"b4aee649-af38-4064-ba29-bf2837b4c652\" (UID: \"b4aee649-af38-4064-ba29-bf2837b4c652\") " Mar 20 15:53:39 crc kubenswrapper[4779]: I0320 15:53:39.476945 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4aee649-af38-4064-ba29-bf2837b4c652-bootstrap-combined-ca-bundle\") pod \"b4aee649-af38-4064-ba29-bf2837b4c652\" (UID: \"b4aee649-af38-4064-ba29-bf2837b4c652\") " Mar 20 15:53:39 crc kubenswrapper[4779]: I0320 15:53:39.482215 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4aee649-af38-4064-ba29-bf2837b4c652-kube-api-access-bx9bf" (OuterVolumeSpecName: "kube-api-access-bx9bf") pod "b4aee649-af38-4064-ba29-bf2837b4c652" (UID: "b4aee649-af38-4064-ba29-bf2837b4c652"). InnerVolumeSpecName "kube-api-access-bx9bf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:53:39 crc kubenswrapper[4779]: I0320 15:53:39.482572 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4aee649-af38-4064-ba29-bf2837b4c652-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "b4aee649-af38-4064-ba29-bf2837b4c652" (UID: "b4aee649-af38-4064-ba29-bf2837b4c652"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:53:39 crc kubenswrapper[4779]: I0320 15:53:39.504218 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4aee649-af38-4064-ba29-bf2837b4c652-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b4aee649-af38-4064-ba29-bf2837b4c652" (UID: "b4aee649-af38-4064-ba29-bf2837b4c652"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:53:39 crc kubenswrapper[4779]: I0320 15:53:39.504529 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4aee649-af38-4064-ba29-bf2837b4c652-inventory" (OuterVolumeSpecName: "inventory") pod "b4aee649-af38-4064-ba29-bf2837b4c652" (UID: "b4aee649-af38-4064-ba29-bf2837b4c652"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:53:39 crc kubenswrapper[4779]: I0320 15:53:39.581610 4779 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b4aee649-af38-4064-ba29-bf2837b4c652-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 15:53:39 crc kubenswrapper[4779]: I0320 15:53:39.581679 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx9bf\" (UniqueName: \"kubernetes.io/projected/b4aee649-af38-4064-ba29-bf2837b4c652-kube-api-access-bx9bf\") on node \"crc\" DevicePath \"\"" Mar 20 15:53:39 crc kubenswrapper[4779]: I0320 15:53:39.581689 4779 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b4aee649-af38-4064-ba29-bf2837b4c652-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 15:53:39 crc kubenswrapper[4779]: I0320 15:53:39.581698 4779 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4aee649-af38-4064-ba29-bf2837b4c652-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:53:40 crc kubenswrapper[4779]: I0320 15:53:40.007418 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7kqb2" event={"ID":"b4aee649-af38-4064-ba29-bf2837b4c652","Type":"ContainerDied","Data":"1a5aa7b09fee91231cfd2a1753377860e6698a1b3e3e7fd921bd95be9db87871"} Mar 20 15:53:40 crc kubenswrapper[4779]: I0320 15:53:40.007454 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a5aa7b09fee91231cfd2a1753377860e6698a1b3e3e7fd921bd95be9db87871" Mar 20 15:53:40 crc kubenswrapper[4779]: I0320 15:53:40.007948 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7kqb2" Mar 20 15:53:40 crc kubenswrapper[4779]: I0320 15:53:40.097402 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkttv"] Mar 20 15:53:40 crc kubenswrapper[4779]: E0320 15:53:40.097798 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eeef9a4-9be4-4067-a699-d2503ce4acc7" containerName="oc" Mar 20 15:53:40 crc kubenswrapper[4779]: I0320 15:53:40.097816 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eeef9a4-9be4-4067-a699-d2503ce4acc7" containerName="oc" Mar 20 15:53:40 crc kubenswrapper[4779]: E0320 15:53:40.097837 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4aee649-af38-4064-ba29-bf2837b4c652" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 20 15:53:40 crc kubenswrapper[4779]: I0320 15:53:40.097846 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4aee649-af38-4064-ba29-bf2837b4c652" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 20 15:53:40 crc kubenswrapper[4779]: I0320 15:53:40.098058 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4aee649-af38-4064-ba29-bf2837b4c652" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 20 15:53:40 crc kubenswrapper[4779]: I0320 15:53:40.098074 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eeef9a4-9be4-4067-a699-d2503ce4acc7" containerName="oc" Mar 20 15:53:40 crc kubenswrapper[4779]: I0320 15:53:40.098704 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkttv" Mar 20 15:53:40 crc kubenswrapper[4779]: I0320 15:53:40.101874 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 15:53:40 crc kubenswrapper[4779]: I0320 15:53:40.102620 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r7sbf" Mar 20 15:53:40 crc kubenswrapper[4779]: I0320 15:53:40.102991 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 15:53:40 crc kubenswrapper[4779]: I0320 15:53:40.103440 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 15:53:40 crc kubenswrapper[4779]: I0320 15:53:40.110910 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkttv"] Mar 20 15:53:40 crc kubenswrapper[4779]: I0320 15:53:40.193989 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14a567a6-af89-4bcb-84ae-630e3546007f-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nkttv\" (UID: \"14a567a6-af89-4bcb-84ae-630e3546007f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkttv" Mar 20 15:53:40 crc kubenswrapper[4779]: I0320 15:53:40.194183 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/14a567a6-af89-4bcb-84ae-630e3546007f-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nkttv\" (UID: \"14a567a6-af89-4bcb-84ae-630e3546007f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkttv" Mar 20 15:53:40 crc kubenswrapper[4779]: I0320 
15:53:40.194260 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm56m\" (UniqueName: \"kubernetes.io/projected/14a567a6-af89-4bcb-84ae-630e3546007f-kube-api-access-sm56m\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nkttv\" (UID: \"14a567a6-af89-4bcb-84ae-630e3546007f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkttv" Mar 20 15:53:40 crc kubenswrapper[4779]: I0320 15:53:40.295693 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm56m\" (UniqueName: \"kubernetes.io/projected/14a567a6-af89-4bcb-84ae-630e3546007f-kube-api-access-sm56m\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nkttv\" (UID: \"14a567a6-af89-4bcb-84ae-630e3546007f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkttv" Mar 20 15:53:40 crc kubenswrapper[4779]: I0320 15:53:40.296024 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14a567a6-af89-4bcb-84ae-630e3546007f-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nkttv\" (UID: \"14a567a6-af89-4bcb-84ae-630e3546007f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkttv" Mar 20 15:53:40 crc kubenswrapper[4779]: I0320 15:53:40.296186 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/14a567a6-af89-4bcb-84ae-630e3546007f-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nkttv\" (UID: \"14a567a6-af89-4bcb-84ae-630e3546007f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkttv" Mar 20 15:53:40 crc kubenswrapper[4779]: I0320 15:53:40.303861 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/14a567a6-af89-4bcb-84ae-630e3546007f-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nkttv\" (UID: \"14a567a6-af89-4bcb-84ae-630e3546007f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkttv" Mar 20 15:53:40 crc kubenswrapper[4779]: I0320 15:53:40.303915 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14a567a6-af89-4bcb-84ae-630e3546007f-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nkttv\" (UID: \"14a567a6-af89-4bcb-84ae-630e3546007f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkttv" Mar 20 15:53:40 crc kubenswrapper[4779]: I0320 15:53:40.313513 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm56m\" (UniqueName: \"kubernetes.io/projected/14a567a6-af89-4bcb-84ae-630e3546007f-kube-api-access-sm56m\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nkttv\" (UID: \"14a567a6-af89-4bcb-84ae-630e3546007f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkttv" Mar 20 15:53:40 crc kubenswrapper[4779]: I0320 15:53:40.414890 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkttv" Mar 20 15:53:40 crc kubenswrapper[4779]: I0320 15:53:40.925034 4779 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 15:53:40 crc kubenswrapper[4779]: I0320 15:53:40.925314 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkttv"] Mar 20 15:53:41 crc kubenswrapper[4779]: I0320 15:53:41.016673 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkttv" event={"ID":"14a567a6-af89-4bcb-84ae-630e3546007f","Type":"ContainerStarted","Data":"b4ee0c17b67cb435a5f559df6758619e8a91556b0989c68f1d19193a02e3d4cb"} Mar 20 15:53:41 crc kubenswrapper[4779]: I0320 15:53:41.217466 4779 scope.go:117] "RemoveContainer" containerID="307555d734d8c31719459a6fab20efd0f6af775f1498d19e9deb098d8b393047" Mar 20 15:53:41 crc kubenswrapper[4779]: I0320 15:53:41.238908 4779 scope.go:117] "RemoveContainer" containerID="00b74be011bcf522ab460330ede2e52938a7476879db59b96a2d43702c76fbe0" Mar 20 15:53:41 crc kubenswrapper[4779]: I0320 15:53:41.267372 4779 scope.go:117] "RemoveContainer" containerID="111a91579794bf334607b1f80b67ccbdfe253399964cc75b1c32c8aa72a47067" Mar 20 15:53:41 crc kubenswrapper[4779]: I0320 15:53:41.329375 4779 scope.go:117] "RemoveContainer" containerID="8692e4e9961696eda85185d923fc97ed86ec7a5f13c5a8c7f13c503b1123ae33" Mar 20 15:53:41 crc kubenswrapper[4779]: I0320 15:53:41.346761 4779 scope.go:117] "RemoveContainer" containerID="afe8f6516ad048a4992b5f473eec161cd7e7e994a5c76022b8e45012f60efef5" Mar 20 15:53:41 crc kubenswrapper[4779]: I0320 15:53:41.367177 4779 scope.go:117] "RemoveContainer" containerID="e29cb9ade5d141418ca4cf96c71b910db359421b7e5f16c983fc8a236df516db" Mar 20 15:53:41 crc kubenswrapper[4779]: I0320 15:53:41.387765 4779 scope.go:117] "RemoveContainer" 
containerID="51386384fcec7d0366ed3b260c41104c17bf2c7e95563503d509ec6be330d132" Mar 20 15:53:42 crc kubenswrapper[4779]: I0320 15:53:42.025815 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkttv" event={"ID":"14a567a6-af89-4bcb-84ae-630e3546007f","Type":"ContainerStarted","Data":"b34634df0f8d2ece05479f2e181004bddc7cc298546606aed942594774c4e5ff"} Mar 20 15:53:42 crc kubenswrapper[4779]: I0320 15:53:42.052821 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkttv" podStartSLOduration=1.55477599 podStartE2EDuration="2.052804115s" podCreationTimestamp="2026-03-20 15:53:40 +0000 UTC" firstStartedPulling="2026-03-20 15:53:40.924820466 +0000 UTC m=+1837.887336266" lastFinishedPulling="2026-03-20 15:53:41.422848601 +0000 UTC m=+1838.385364391" observedRunningTime="2026-03-20 15:53:42.044607192 +0000 UTC m=+1839.007123002" watchObservedRunningTime="2026-03-20 15:53:42.052804115 +0000 UTC m=+1839.015319905" Mar 20 15:53:48 crc kubenswrapper[4779]: I0320 15:53:48.810223 4779 scope.go:117] "RemoveContainer" containerID="3e74c77d1c87375df708c9fdb3dfd7572992fba6df21987d4baed94ab0fddd8f" Mar 20 15:53:48 crc kubenswrapper[4779]: E0320 15:53:48.810825 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 15:53:58 crc kubenswrapper[4779]: I0320 15:53:58.037039 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-87f8-account-create-update-lbzbd"] Mar 20 15:53:58 crc kubenswrapper[4779]: I0320 15:53:58.048549 4779 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-87f8-account-create-update-lbzbd"] Mar 20 15:53:58 crc kubenswrapper[4779]: I0320 15:53:58.060158 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-g8sgs"] Mar 20 15:53:58 crc kubenswrapper[4779]: I0320 15:53:58.068642 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-g8sgs"] Mar 20 15:53:59 crc kubenswrapper[4779]: I0320 15:53:59.809419 4779 scope.go:117] "RemoveContainer" containerID="3e74c77d1c87375df708c9fdb3dfd7572992fba6df21987d4baed94ab0fddd8f" Mar 20 15:53:59 crc kubenswrapper[4779]: E0320 15:53:59.809876 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 15:53:59 crc kubenswrapper[4779]: I0320 15:53:59.832497 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91031879-f65a-4ff0-a28a-94856f3db0aa" path="/var/lib/kubelet/pods/91031879-f65a-4ff0-a28a-94856f3db0aa/volumes" Mar 20 15:53:59 crc kubenswrapper[4779]: I0320 15:53:59.833560 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa24957f-e52d-40d0-9ed2-801617ddb2d7" path="/var/lib/kubelet/pods/aa24957f-e52d-40d0-9ed2-801617ddb2d7/volumes" Mar 20 15:54:00 crc kubenswrapper[4779]: I0320 15:54:00.144842 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567034-zqfhn"] Mar 20 15:54:00 crc kubenswrapper[4779]: I0320 15:54:00.146679 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567034-zqfhn" Mar 20 15:54:00 crc kubenswrapper[4779]: I0320 15:54:00.149088 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:54:00 crc kubenswrapper[4779]: I0320 15:54:00.149301 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-92t9d" Mar 20 15:54:00 crc kubenswrapper[4779]: I0320 15:54:00.149502 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:54:00 crc kubenswrapper[4779]: I0320 15:54:00.159652 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567034-zqfhn"] Mar 20 15:54:00 crc kubenswrapper[4779]: I0320 15:54:00.262465 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl8z5\" (UniqueName: \"kubernetes.io/projected/a066553c-0cc4-4dd8-bd44-1d903896201e-kube-api-access-hl8z5\") pod \"auto-csr-approver-29567034-zqfhn\" (UID: \"a066553c-0cc4-4dd8-bd44-1d903896201e\") " pod="openshift-infra/auto-csr-approver-29567034-zqfhn" Mar 20 15:54:00 crc kubenswrapper[4779]: I0320 15:54:00.365234 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl8z5\" (UniqueName: \"kubernetes.io/projected/a066553c-0cc4-4dd8-bd44-1d903896201e-kube-api-access-hl8z5\") pod \"auto-csr-approver-29567034-zqfhn\" (UID: \"a066553c-0cc4-4dd8-bd44-1d903896201e\") " pod="openshift-infra/auto-csr-approver-29567034-zqfhn" Mar 20 15:54:00 crc kubenswrapper[4779]: I0320 15:54:00.384512 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl8z5\" (UniqueName: \"kubernetes.io/projected/a066553c-0cc4-4dd8-bd44-1d903896201e-kube-api-access-hl8z5\") pod \"auto-csr-approver-29567034-zqfhn\" (UID: \"a066553c-0cc4-4dd8-bd44-1d903896201e\") " 
pod="openshift-infra/auto-csr-approver-29567034-zqfhn"
Mar 20 15:54:00 crc kubenswrapper[4779]: I0320 15:54:00.464539 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567034-zqfhn"
Mar 20 15:54:00 crc kubenswrapper[4779]: I0320 15:54:00.877774 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567034-zqfhn"]
Mar 20 15:54:01 crc kubenswrapper[4779]: I0320 15:54:01.026485 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-thctt"]
Mar 20 15:54:01 crc kubenswrapper[4779]: I0320 15:54:01.038058 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-thctt"]
Mar 20 15:54:01 crc kubenswrapper[4779]: I0320 15:54:01.210924 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567034-zqfhn" event={"ID":"a066553c-0cc4-4dd8-bd44-1d903896201e","Type":"ContainerStarted","Data":"dedf78fad06c0a120cee0a2081d262e33aa4c323dc9b0713b48abe2a26d10fa7"}
Mar 20 15:54:01 crc kubenswrapper[4779]: I0320 15:54:01.818783 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c9cff9d-9412-4738-84a7-df2abae9a5c6" path="/var/lib/kubelet/pods/1c9cff9d-9412-4738-84a7-df2abae9a5c6/volumes"
Mar 20 15:54:02 crc kubenswrapper[4779]: I0320 15:54:02.032743 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-6fc0-account-create-update-5x6h6"]
Mar 20 15:54:02 crc kubenswrapper[4779]: I0320 15:54:02.043223 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-b2xx8"]
Mar 20 15:54:02 crc kubenswrapper[4779]: I0320 15:54:02.051241 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-1f58-account-create-update-mgl6d"]
Mar 20 15:54:02 crc kubenswrapper[4779]: I0320 15:54:02.059001 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-6fc0-account-create-update-5x6h6"]
Mar 20 15:54:02 crc kubenswrapper[4779]: I0320 15:54:02.066407 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-b2xx8"]
Mar 20 15:54:02 crc kubenswrapper[4779]: I0320 15:54:02.084879 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-1f58-account-create-update-mgl6d"]
Mar 20 15:54:03 crc kubenswrapper[4779]: I0320 15:54:03.230097 4779 generic.go:334] "Generic (PLEG): container finished" podID="a066553c-0cc4-4dd8-bd44-1d903896201e" containerID="e8bfcd1b8c2a7f0a81258715645b51ff30c7d193f2b06ab847c178e334ff3a72" exitCode=0
Mar 20 15:54:03 crc kubenswrapper[4779]: I0320 15:54:03.230240 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567034-zqfhn" event={"ID":"a066553c-0cc4-4dd8-bd44-1d903896201e","Type":"ContainerDied","Data":"e8bfcd1b8c2a7f0a81258715645b51ff30c7d193f2b06ab847c178e334ff3a72"}
Mar 20 15:54:03 crc kubenswrapper[4779]: I0320 15:54:03.821465 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="569982f0-2f9f-4e5c-8cc5-ea78916cda2c" path="/var/lib/kubelet/pods/569982f0-2f9f-4e5c-8cc5-ea78916cda2c/volumes"
Mar 20 15:54:03 crc kubenswrapper[4779]: I0320 15:54:03.822655 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e05bbbd-ea13-4bd8-bf92-930e44dd8e91" path="/var/lib/kubelet/pods/9e05bbbd-ea13-4bd8-bf92-930e44dd8e91/volumes"
Mar 20 15:54:03 crc kubenswrapper[4779]: I0320 15:54:03.823394 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e06123d4-03bd-44b8-bb15-eb3baccb8c90" path="/var/lib/kubelet/pods/e06123d4-03bd-44b8-bb15-eb3baccb8c90/volumes"
Mar 20 15:54:04 crc kubenswrapper[4779]: I0320 15:54:04.566848 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567034-zqfhn"
Mar 20 15:54:04 crc kubenswrapper[4779]: I0320 15:54:04.743982 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hl8z5\" (UniqueName: \"kubernetes.io/projected/a066553c-0cc4-4dd8-bd44-1d903896201e-kube-api-access-hl8z5\") pod \"a066553c-0cc4-4dd8-bd44-1d903896201e\" (UID: \"a066553c-0cc4-4dd8-bd44-1d903896201e\") "
Mar 20 15:54:04 crc kubenswrapper[4779]: I0320 15:54:04.750047 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a066553c-0cc4-4dd8-bd44-1d903896201e-kube-api-access-hl8z5" (OuterVolumeSpecName: "kube-api-access-hl8z5") pod "a066553c-0cc4-4dd8-bd44-1d903896201e" (UID: "a066553c-0cc4-4dd8-bd44-1d903896201e"). InnerVolumeSpecName "kube-api-access-hl8z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:54:04 crc kubenswrapper[4779]: I0320 15:54:04.846406 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hl8z5\" (UniqueName: \"kubernetes.io/projected/a066553c-0cc4-4dd8-bd44-1d903896201e-kube-api-access-hl8z5\") on node \"crc\" DevicePath \"\""
Mar 20 15:54:05 crc kubenswrapper[4779]: I0320 15:54:05.252741 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567034-zqfhn" event={"ID":"a066553c-0cc4-4dd8-bd44-1d903896201e","Type":"ContainerDied","Data":"dedf78fad06c0a120cee0a2081d262e33aa4c323dc9b0713b48abe2a26d10fa7"}
Mar 20 15:54:05 crc kubenswrapper[4779]: I0320 15:54:05.253505 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dedf78fad06c0a120cee0a2081d262e33aa4c323dc9b0713b48abe2a26d10fa7"
Mar 20 15:54:05 crc kubenswrapper[4779]: I0320 15:54:05.252814 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567034-zqfhn"
Mar 20 15:54:05 crc kubenswrapper[4779]: I0320 15:54:05.629074 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567028-jsjd7"]
Mar 20 15:54:05 crc kubenswrapper[4779]: I0320 15:54:05.637278 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567028-jsjd7"]
Mar 20 15:54:05 crc kubenswrapper[4779]: I0320 15:54:05.818761 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c7e4dbe-44fb-49cf-93c9-da8e6c247d01" path="/var/lib/kubelet/pods/6c7e4dbe-44fb-49cf-93c9-da8e6c247d01/volumes"
Mar 20 15:54:07 crc kubenswrapper[4779]: I0320 15:54:07.028430 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-2mg9r"]
Mar 20 15:54:07 crc kubenswrapper[4779]: I0320 15:54:07.038948 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-df14-account-create-update-nfhsc"]
Mar 20 15:54:07 crc kubenswrapper[4779]: I0320 15:54:07.047393 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-2mg9r"]
Mar 20 15:54:07 crc kubenswrapper[4779]: I0320 15:54:07.054684 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-df14-account-create-update-nfhsc"]
Mar 20 15:54:07 crc kubenswrapper[4779]: I0320 15:54:07.819385 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32fe88a3-1d3d-4863-9abd-160e0537aff3" path="/var/lib/kubelet/pods/32fe88a3-1d3d-4863-9abd-160e0537aff3/volumes"
Mar 20 15:54:07 crc kubenswrapper[4779]: I0320 15:54:07.820236 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da10206b-ede8-4373-af66-b442633908ab" path="/var/lib/kubelet/pods/da10206b-ede8-4373-af66-b442633908ab/volumes"
Mar 20 15:54:10 crc kubenswrapper[4779]: I0320 15:54:10.809638 4779 scope.go:117] "RemoveContainer" containerID="3e74c77d1c87375df708c9fdb3dfd7572992fba6df21987d4baed94ab0fddd8f"
Mar 20 15:54:10 crc kubenswrapper[4779]: E0320 15:54:10.810231 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc"
Mar 20 15:54:13 crc kubenswrapper[4779]: I0320 15:54:13.029607 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-5v4xs"]
Mar 20 15:54:13 crc kubenswrapper[4779]: I0320 15:54:13.039441 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-5v4xs"]
Mar 20 15:54:13 crc kubenswrapper[4779]: I0320 15:54:13.817985 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4435049-2efc-449f-a5af-25747d3b64f0" path="/var/lib/kubelet/pods/f4435049-2efc-449f-a5af-25747d3b64f0/volumes"
Mar 20 15:54:23 crc kubenswrapper[4779]: I0320 15:54:23.814639 4779 scope.go:117] "RemoveContainer" containerID="3e74c77d1c87375df708c9fdb3dfd7572992fba6df21987d4baed94ab0fddd8f"
Mar 20 15:54:23 crc kubenswrapper[4779]: E0320 15:54:23.815378 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc"
Mar 20 15:54:29 crc kubenswrapper[4779]: I0320 15:54:29.057310 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-a5fd-account-create-update-9c6l6"]
Mar 20 15:54:29 crc kubenswrapper[4779]: I0320 15:54:29.070365 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-9a19-account-create-update-w5t6w"]
Mar 20 15:54:29 crc kubenswrapper[4779]: I0320 15:54:29.082299 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-a5fd-account-create-update-9c6l6"]
Mar 20 15:54:29 crc kubenswrapper[4779]: I0320 15:54:29.090442 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-9a19-account-create-update-w5t6w"]
Mar 20 15:54:29 crc kubenswrapper[4779]: I0320 15:54:29.818706 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79528236-508c-451f-906a-7ad8b3f638ef" path="/var/lib/kubelet/pods/79528236-508c-451f-906a-7ad8b3f638ef/volumes"
Mar 20 15:54:29 crc kubenswrapper[4779]: I0320 15:54:29.819626 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f226989b-bb85-41dc-8e16-631a0740361a" path="/var/lib/kubelet/pods/f226989b-bb85-41dc-8e16-631a0740361a/volumes"
Mar 20 15:54:32 crc kubenswrapper[4779]: I0320 15:54:32.034892 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-pk9nl"]
Mar 20 15:54:32 crc kubenswrapper[4779]: I0320 15:54:32.044365 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-pk9nl"]
Mar 20 15:54:33 crc kubenswrapper[4779]: I0320 15:54:33.843515 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7de7ca4b-8d4c-41be-bd83-f1430eefc89f" path="/var/lib/kubelet/pods/7de7ca4b-8d4c-41be-bd83-f1430eefc89f/volumes"
Mar 20 15:54:38 crc kubenswrapper[4779]: I0320 15:54:38.809356 4779 scope.go:117] "RemoveContainer" containerID="3e74c77d1c87375df708c9fdb3dfd7572992fba6df21987d4baed94ab0fddd8f"
Mar 20 15:54:38 crc kubenswrapper[4779]: E0320 15:54:38.810249 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc"
Mar 20 15:54:39 crc kubenswrapper[4779]: I0320 15:54:39.031721 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-a00e-account-create-update-5d2t5"]
Mar 20 15:54:39 crc kubenswrapper[4779]: I0320 15:54:39.041975 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-p6dg6"]
Mar 20 15:54:39 crc kubenswrapper[4779]: I0320 15:54:39.058691 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-p6dg6"]
Mar 20 15:54:39 crc kubenswrapper[4779]: I0320 15:54:39.067163 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-6kdxq"]
Mar 20 15:54:39 crc kubenswrapper[4779]: I0320 15:54:39.076384 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-a00e-account-create-update-5d2t5"]
Mar 20 15:54:39 crc kubenswrapper[4779]: I0320 15:54:39.088434 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-6kdxq"]
Mar 20 15:54:39 crc kubenswrapper[4779]: I0320 15:54:39.095625 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-qnd9s"]
Mar 20 15:54:39 crc kubenswrapper[4779]: I0320 15:54:39.104796 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-qnd9s"]
Mar 20 15:54:39 crc kubenswrapper[4779]: I0320 15:54:39.824524 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1435ea5c-3bed-456c-8fe3-931542c325a1" path="/var/lib/kubelet/pods/1435ea5c-3bed-456c-8fe3-931542c325a1/volumes"
Mar 20 15:54:39 crc kubenswrapper[4779]: I0320 15:54:39.825196 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e10831a-7f1b-4066-bcf0-0c08ad7b16c0" path="/var/lib/kubelet/pods/3e10831a-7f1b-4066-bcf0-0c08ad7b16c0/volumes"
Mar 20 15:54:39 crc kubenswrapper[4779]: I0320 15:54:39.825705 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4db09a8d-78e6-403a-842d-4c138a9699be" path="/var/lib/kubelet/pods/4db09a8d-78e6-403a-842d-4c138a9699be/volumes"
Mar 20 15:54:39 crc kubenswrapper[4779]: I0320 15:54:39.826216 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e18c51e-3afa-4453-9a88-ee43ca5e563b" path="/var/lib/kubelet/pods/6e18c51e-3afa-4453-9a88-ee43ca5e563b/volumes"
Mar 20 15:54:41 crc kubenswrapper[4779]: I0320 15:54:41.515705 4779 scope.go:117] "RemoveContainer" containerID="0a7526452a975ab7b35d0acdaa562228f1c087830abbfed42ea3aad56d598b87"
Mar 20 15:54:41 crc kubenswrapper[4779]: I0320 15:54:41.568080 4779 scope.go:117] "RemoveContainer" containerID="c4b96bc0c679406f602c994fa757d3f3a131ad1870068a502ee5e00895b647c1"
Mar 20 15:54:41 crc kubenswrapper[4779]: I0320 15:54:41.592436 4779 scope.go:117] "RemoveContainer" containerID="752adf4cc33c5cf8f247ab3a7b196dbcf69f41be57776bf574f6b2db924d31fe"
Mar 20 15:54:41 crc kubenswrapper[4779]: I0320 15:54:41.636905 4779 scope.go:117] "RemoveContainer" containerID="aad5ee9afcaca2ed8fc734ebf3ad22218d0376cef5b35e3c1191715887982e71"
Mar 20 15:54:41 crc kubenswrapper[4779]: I0320 15:54:41.681892 4779 scope.go:117] "RemoveContainer" containerID="a5d4406fc1a0b7baef16968b2c9c31f567afa6d0ab1979413fd1a429a2a4f9f5"
Mar 20 15:54:41 crc kubenswrapper[4779]: I0320 15:54:41.729985 4779 scope.go:117] "RemoveContainer" containerID="bc6b113c92083e1c01f30284fff16690d424220807531272f7e6efdcdd3d445f"
Mar 20 15:54:41 crc kubenswrapper[4779]: I0320 15:54:41.785849 4779 scope.go:117] "RemoveContainer" containerID="b1c5e24756628eae2850f3fcd18bbb4aa44322d860bbcfa746337a7efdaae919"
Mar 20 15:54:41 crc kubenswrapper[4779]: I0320 15:54:41.807020 4779 scope.go:117] "RemoveContainer" containerID="d8bd85d7a12fc6efa263fdfde508100ec19852d584988eb60e2807bbfd9dc50d"
Mar 20 15:54:41 crc kubenswrapper[4779]: I0320 15:54:41.838028 4779 scope.go:117] "RemoveContainer" containerID="f35a9d9d10c366308d30779b5b2fecbb7218a713e4d946d4670f6eb98a512a62"
Mar 20 15:54:41 crc kubenswrapper[4779]: I0320 15:54:41.858651 4779 scope.go:117] "RemoveContainer" containerID="62e7826c3ec39d2b90569426cf09df175d1630bd7138e093b85f3cb9b9c814ea"
Mar 20 15:54:41 crc kubenswrapper[4779]: I0320 15:54:41.878290 4779 scope.go:117] "RemoveContainer" containerID="7d8cf03e162fb94ddfac7f060b641bb685784c40f3d4d99d58f7aba8e0cbd6e6"
Mar 20 15:54:41 crc kubenswrapper[4779]: I0320 15:54:41.898420 4779 scope.go:117] "RemoveContainer" containerID="cac13aed11abd8b93611559eb0adcd3b3c0eeb8027c45a771a7596810263d283"
Mar 20 15:54:41 crc kubenswrapper[4779]: I0320 15:54:41.921105 4779 scope.go:117] "RemoveContainer" containerID="241be15361d72400eb521684e1b9f80a0ae6daa87c1b1052b4f55e30968532e6"
Mar 20 15:54:41 crc kubenswrapper[4779]: I0320 15:54:41.942957 4779 scope.go:117] "RemoveContainer" containerID="e6afca2124db1facd084e1f9c742aa482a672c0acc52b5110edf62421d80d853"
Mar 20 15:54:41 crc kubenswrapper[4779]: I0320 15:54:41.968802 4779 scope.go:117] "RemoveContainer" containerID="e10fdb1ae97f49627d87219cc2d4d55509955185030eebc461451bfba3b1de77"
Mar 20 15:54:41 crc kubenswrapper[4779]: I0320 15:54:41.987739 4779 scope.go:117] "RemoveContainer" containerID="8b6ee097cd8d183960d8372e42c44c5d0bf52f5f4df507889a27a1fdbcc44c8a"
Mar 20 15:54:42 crc kubenswrapper[4779]: I0320 15:54:42.006247 4779 scope.go:117] "RemoveContainer" containerID="97cffcfd633899bd083a26cf432dcca771a912284fd2b178eaa78d5cd12e41cd"
Mar 20 15:54:44 crc kubenswrapper[4779]: I0320 15:54:44.047282 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-vkj28"]
Mar 20 15:54:44 crc kubenswrapper[4779]: I0320 15:54:44.062466 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-vkj28"]
Mar 20 15:54:45 crc kubenswrapper[4779]: I0320 15:54:45.820469 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a288511-2254-442c-a931-bd59dd0a3b29" path="/var/lib/kubelet/pods/9a288511-2254-442c-a931-bd59dd0a3b29/volumes"
Mar 20 15:54:53 crc kubenswrapper[4779]: I0320 15:54:53.819716 4779 scope.go:117] "RemoveContainer" containerID="3e74c77d1c87375df708c9fdb3dfd7572992fba6df21987d4baed94ab0fddd8f"
Mar 20 15:54:53 crc kubenswrapper[4779]: E0320 15:54:53.820590 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc"
Mar 20 15:55:06 crc kubenswrapper[4779]: I0320 15:55:06.808958 4779 scope.go:117] "RemoveContainer" containerID="3e74c77d1c87375df708c9fdb3dfd7572992fba6df21987d4baed94ab0fddd8f"
Mar 20 15:55:06 crc kubenswrapper[4779]: E0320 15:55:06.809718 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc"
Mar 20 15:55:09 crc kubenswrapper[4779]: I0320 15:55:09.831714 4779 generic.go:334] "Generic (PLEG): container finished" podID="14a567a6-af89-4bcb-84ae-630e3546007f" containerID="b34634df0f8d2ece05479f2e181004bddc7cc298546606aed942594774c4e5ff" exitCode=0
Mar 20 15:55:09 crc kubenswrapper[4779]: I0320 15:55:09.831802 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkttv" event={"ID":"14a567a6-af89-4bcb-84ae-630e3546007f","Type":"ContainerDied","Data":"b34634df0f8d2ece05479f2e181004bddc7cc298546606aed942594774c4e5ff"}
Mar 20 15:55:11 crc kubenswrapper[4779]: I0320 15:55:11.294970 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkttv"
Mar 20 15:55:11 crc kubenswrapper[4779]: I0320 15:55:11.453035 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14a567a6-af89-4bcb-84ae-630e3546007f-inventory\") pod \"14a567a6-af89-4bcb-84ae-630e3546007f\" (UID: \"14a567a6-af89-4bcb-84ae-630e3546007f\") "
Mar 20 15:55:11 crc kubenswrapper[4779]: I0320 15:55:11.453851 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sm56m\" (UniqueName: \"kubernetes.io/projected/14a567a6-af89-4bcb-84ae-630e3546007f-kube-api-access-sm56m\") pod \"14a567a6-af89-4bcb-84ae-630e3546007f\" (UID: \"14a567a6-af89-4bcb-84ae-630e3546007f\") "
Mar 20 15:55:11 crc kubenswrapper[4779]: I0320 15:55:11.454093 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/14a567a6-af89-4bcb-84ae-630e3546007f-ssh-key-openstack-edpm-ipam\") pod \"14a567a6-af89-4bcb-84ae-630e3546007f\" (UID: \"14a567a6-af89-4bcb-84ae-630e3546007f\") "
Mar 20 15:55:11 crc kubenswrapper[4779]: I0320 15:55:11.459181 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14a567a6-af89-4bcb-84ae-630e3546007f-kube-api-access-sm56m" (OuterVolumeSpecName: "kube-api-access-sm56m") pod "14a567a6-af89-4bcb-84ae-630e3546007f" (UID: "14a567a6-af89-4bcb-84ae-630e3546007f"). InnerVolumeSpecName "kube-api-access-sm56m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:55:11 crc kubenswrapper[4779]: I0320 15:55:11.479774 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a567a6-af89-4bcb-84ae-630e3546007f-inventory" (OuterVolumeSpecName: "inventory") pod "14a567a6-af89-4bcb-84ae-630e3546007f" (UID: "14a567a6-af89-4bcb-84ae-630e3546007f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:55:11 crc kubenswrapper[4779]: I0320 15:55:11.482285 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a567a6-af89-4bcb-84ae-630e3546007f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "14a567a6-af89-4bcb-84ae-630e3546007f" (UID: "14a567a6-af89-4bcb-84ae-630e3546007f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:55:11 crc kubenswrapper[4779]: I0320 15:55:11.560326 4779 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14a567a6-af89-4bcb-84ae-630e3546007f-inventory\") on node \"crc\" DevicePath \"\""
Mar 20 15:55:11 crc kubenswrapper[4779]: I0320 15:55:11.560364 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sm56m\" (UniqueName: \"kubernetes.io/projected/14a567a6-af89-4bcb-84ae-630e3546007f-kube-api-access-sm56m\") on node \"crc\" DevicePath \"\""
Mar 20 15:55:11 crc kubenswrapper[4779]: I0320 15:55:11.560378 4779 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/14a567a6-af89-4bcb-84ae-630e3546007f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 20 15:55:11 crc kubenswrapper[4779]: I0320 15:55:11.853934 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkttv" event={"ID":"14a567a6-af89-4bcb-84ae-630e3546007f","Type":"ContainerDied","Data":"b4ee0c17b67cb435a5f559df6758619e8a91556b0989c68f1d19193a02e3d4cb"}
Mar 20 15:55:11 crc kubenswrapper[4779]: I0320 15:55:11.853976 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4ee0c17b67cb435a5f559df6758619e8a91556b0989c68f1d19193a02e3d4cb"
Mar 20 15:55:11 crc kubenswrapper[4779]: I0320 15:55:11.854017 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkttv"
Mar 20 15:55:11 crc kubenswrapper[4779]: I0320 15:55:11.944601 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76p96"]
Mar 20 15:55:11 crc kubenswrapper[4779]: E0320 15:55:11.945006 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a066553c-0cc4-4dd8-bd44-1d903896201e" containerName="oc"
Mar 20 15:55:11 crc kubenswrapper[4779]: I0320 15:55:11.945020 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="a066553c-0cc4-4dd8-bd44-1d903896201e" containerName="oc"
Mar 20 15:55:11 crc kubenswrapper[4779]: E0320 15:55:11.945043 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14a567a6-af89-4bcb-84ae-630e3546007f" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Mar 20 15:55:11 crc kubenswrapper[4779]: I0320 15:55:11.945051 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="14a567a6-af89-4bcb-84ae-630e3546007f" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Mar 20 15:55:11 crc kubenswrapper[4779]: I0320 15:55:11.945266 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="a066553c-0cc4-4dd8-bd44-1d903896201e" containerName="oc"
Mar 20 15:55:11 crc kubenswrapper[4779]: I0320 15:55:11.945306 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="14a567a6-af89-4bcb-84ae-630e3546007f" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Mar 20 15:55:11 crc kubenswrapper[4779]: I0320 15:55:11.946626 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76p96"
Mar 20 15:55:11 crc kubenswrapper[4779]: I0320 15:55:11.953769 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r7sbf"
Mar 20 15:55:11 crc kubenswrapper[4779]: I0320 15:55:11.954046 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 15:55:11 crc kubenswrapper[4779]: I0320 15:55:11.968530 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-76p96\" (UID: \"0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76p96"
Mar 20 15:55:11 crc kubenswrapper[4779]: I0320 15:55:11.968619 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-76p96\" (UID: \"0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76p96"
Mar 20 15:55:11 crc kubenswrapper[4779]: I0320 15:55:11.968707 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx5hq\" (UniqueName: \"kubernetes.io/projected/0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb-kube-api-access-qx5hq\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-76p96\" (UID: \"0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76p96"
Mar 20 15:55:11 crc kubenswrapper[4779]: I0320 15:55:11.970399 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 20 15:55:11 crc kubenswrapper[4779]: I0320 15:55:11.970695 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 20 15:55:11 crc kubenswrapper[4779]: I0320 15:55:11.973729 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76p96"]
Mar 20 15:55:12 crc kubenswrapper[4779]: I0320 15:55:12.070636 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx5hq\" (UniqueName: \"kubernetes.io/projected/0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb-kube-api-access-qx5hq\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-76p96\" (UID: \"0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76p96"
Mar 20 15:55:12 crc kubenswrapper[4779]: I0320 15:55:12.070707 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-76p96\" (UID: \"0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76p96"
Mar 20 15:55:12 crc kubenswrapper[4779]: I0320 15:55:12.070781 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-76p96\" (UID: \"0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76p96"
Mar 20 15:55:12 crc kubenswrapper[4779]: I0320 15:55:12.077069 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-76p96\" (UID: \"0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76p96"
Mar 20 15:55:12 crc kubenswrapper[4779]: I0320 15:55:12.077367 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-76p96\" (UID: \"0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76p96"
Mar 20 15:55:12 crc kubenswrapper[4779]: I0320 15:55:12.090736 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx5hq\" (UniqueName: \"kubernetes.io/projected/0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb-kube-api-access-qx5hq\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-76p96\" (UID: \"0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76p96"
Mar 20 15:55:12 crc kubenswrapper[4779]: I0320 15:55:12.268229 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76p96"
Mar 20 15:55:12 crc kubenswrapper[4779]: I0320 15:55:12.793090 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76p96"]
Mar 20 15:55:12 crc kubenswrapper[4779]: I0320 15:55:12.862668 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76p96" event={"ID":"0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb","Type":"ContainerStarted","Data":"a9be95a235c1d6e948e131ae87e21a3738992017b28b28cae01678330268e988"}
Mar 20 15:55:13 crc kubenswrapper[4779]: I0320 15:55:13.870849 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76p96" event={"ID":"0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb","Type":"ContainerStarted","Data":"3634ad5b6e8a04bf4f56852d6064c64a6e2e1ce46ebb110a881d2c81f5af9f40"}
Mar 20 15:55:13 crc kubenswrapper[4779]: I0320 15:55:13.897494 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76p96" podStartSLOduration=2.434156088 podStartE2EDuration="2.897468921s" podCreationTimestamp="2026-03-20 15:55:11 +0000 UTC" firstStartedPulling="2026-03-20 15:55:12.786161073 +0000 UTC m=+1929.748676863" lastFinishedPulling="2026-03-20 15:55:13.249473896 +0000 UTC m=+1930.211989696" observedRunningTime="2026-03-20 15:55:13.886860628 +0000 UTC m=+1930.849376418" watchObservedRunningTime="2026-03-20 15:55:13.897468921 +0000 UTC m=+1930.859984721"
Mar 20 15:55:15 crc kubenswrapper[4779]: I0320 15:55:15.038084 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-gwmgm"]
Mar 20 15:55:15 crc kubenswrapper[4779]: I0320 15:55:15.046660 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-gwmgm"]
Mar 20 15:55:15 crc kubenswrapper[4779]: I0320 15:55:15.819787 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d35f1456-b579-484e-bbb4-75ced186fdda" path="/var/lib/kubelet/pods/d35f1456-b579-484e-bbb4-75ced186fdda/volumes"
Mar 20 15:55:19 crc kubenswrapper[4779]: I0320 15:55:19.809568 4779 scope.go:117] "RemoveContainer" containerID="3e74c77d1c87375df708c9fdb3dfd7572992fba6df21987d4baed94ab0fddd8f"
Mar 20 15:55:19 crc kubenswrapper[4779]: E0320 15:55:19.809825 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc"
Mar 20 15:55:28 crc kubenswrapper[4779]: I0320 15:55:28.051815 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-rh9gx"]
Mar 20 15:55:28 crc kubenswrapper[4779]: I0320 15:55:28.064155 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-mx4bn"]
Mar 20 15:55:28 crc kubenswrapper[4779]: I0320 15:55:28.072206 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-sync-4qv4b"]
Mar 20 15:55:28 crc kubenswrapper[4779]: I0320 15:55:28.080055 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-4qv4b"]
Mar 20 15:55:28 crc kubenswrapper[4779]: I0320 15:55:28.087568 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-rh9gx"]
Mar 20 15:55:28 crc kubenswrapper[4779]: I0320 15:55:28.095251 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-mx4bn"]
Mar 20 15:55:29 crc kubenswrapper[4779]: I0320 15:55:29.820946 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85c5c01b-85a0-4714-81fe-31192c87b2fa" path="/var/lib/kubelet/pods/85c5c01b-85a0-4714-81fe-31192c87b2fa/volumes"
Mar 20 15:55:29 crc kubenswrapper[4779]: I0320 15:55:29.822351 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d89bfb9-3cdc-4e93-899a-361f5d5cf408" path="/var/lib/kubelet/pods/8d89bfb9-3cdc-4e93-899a-361f5d5cf408/volumes"
Mar 20 15:55:29 crc kubenswrapper[4779]: I0320 15:55:29.823045 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e815c919-fbc1-4307-912e-13c74f996abb" path="/var/lib/kubelet/pods/e815c919-fbc1-4307-912e-13c74f996abb/volumes"
Mar 20 15:55:34 crc kubenswrapper[4779]: I0320 15:55:34.808604 4779 scope.go:117] "RemoveContainer" containerID="3e74c77d1c87375df708c9fdb3dfd7572992fba6df21987d4baed94ab0fddd8f"
Mar 20 15:55:34 crc kubenswrapper[4779]: E0320 15:55:34.809328 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc"
Mar 20 15:55:39 crc kubenswrapper[4779]: I0320 15:55:39.030551 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-xjstn"]
Mar 20 15:55:39 crc kubenswrapper[4779]: I0320 15:55:39.041344 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-xjstn"]
Mar 20 15:55:39 crc kubenswrapper[4779]: I0320 15:55:39.821734 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b67fdca5-13d1-4e83-8834-03856ca956b5" path="/var/lib/kubelet/pods/b67fdca5-13d1-4e83-8834-03856ca956b5/volumes"
Mar 20 15:55:42 crc kubenswrapper[4779]: I0320 15:55:42.280789 4779 scope.go:117] "RemoveContainer" containerID="fcd9f636bf9ef211a785824ec24e9742f165fa40de9f5c35b3b32d8950658468"
Mar 20 15:55:42 crc kubenswrapper[4779]: I0320 15:55:42.329158 4779 scope.go:117] "RemoveContainer" containerID="7a94844cceb07beea29a43faeb9b773344756eeb64c078c6cc8b1e72db3107e8"
Mar 20 15:55:42 crc kubenswrapper[4779]: I0320 15:55:42.371722 4779 scope.go:117] "RemoveContainer" containerID="3816792d7f4a520eb194df5b57f19724f9da971e22f64b3a69c80ac94940dc7d"
Mar 20 15:55:42 crc kubenswrapper[4779]: I0320 15:55:42.416220 4779 scope.go:117] "RemoveContainer" containerID="4f7bd0a4b4334182e687bc8ec8bf581d58b9f2a63b8cd261ed653ee89e3f6ba3"
Mar 20 15:55:42 crc kubenswrapper[4779]: I0320 15:55:42.472972 4779 scope.go:117] "RemoveContainer" containerID="423c6d89080e00ce6b78d4e124a45d4b8c633d650eaea6de28b851a550f00412"
Mar 20 15:55:42 crc kubenswrapper[4779]: I0320 15:55:42.509627 4779 scope.go:117] "RemoveContainer" containerID="1d66475bf2a6f89bf954aca5541d1b52ab4aebd06b56aea9ec4f52acf4b5260e"
Mar 20 15:55:46 crc kubenswrapper[4779]: I0320 15:55:46.808420 4779 scope.go:117] "RemoveContainer" containerID="3e74c77d1c87375df708c9fdb3dfd7572992fba6df21987d4baed94ab0fddd8f"
Mar 20 15:55:46 crc kubenswrapper[4779]: E0320 15:55:46.809196 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc"
Mar 20 15:55:51 crc kubenswrapper[4779]: I0320 15:55:51.038354 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-wjcm6"]
Mar 20 15:55:51 crc kubenswrapper[4779]: I0320 15:55:51.052515 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-wjcm6"]
Mar 20 15:55:51 crc kubenswrapper[4779]:
I0320 15:55:51.820091 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="130a37da-c17e-48dd-8712-f87c67f01852" path="/var/lib/kubelet/pods/130a37da-c17e-48dd-8712-f87c67f01852/volumes" Mar 20 15:55:58 crc kubenswrapper[4779]: I0320 15:55:58.810562 4779 scope.go:117] "RemoveContainer" containerID="3e74c77d1c87375df708c9fdb3dfd7572992fba6df21987d4baed94ab0fddd8f" Mar 20 15:55:59 crc kubenswrapper[4779]: I0320 15:55:59.231798 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" event={"ID":"451fc579-db57-4b36-a775-6d2986de3efc","Type":"ContainerStarted","Data":"56c8e1388500b86731bed3431c69ab3383dde78a7cf6321fabe80d4cca0e9447"} Mar 20 15:56:00 crc kubenswrapper[4779]: I0320 15:56:00.142421 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567036-m54tp"] Mar 20 15:56:00 crc kubenswrapper[4779]: I0320 15:56:00.144545 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567036-m54tp" Mar 20 15:56:00 crc kubenswrapper[4779]: I0320 15:56:00.147566 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-92t9d" Mar 20 15:56:00 crc kubenswrapper[4779]: I0320 15:56:00.147963 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:56:00 crc kubenswrapper[4779]: I0320 15:56:00.148036 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:56:00 crc kubenswrapper[4779]: I0320 15:56:00.153345 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567036-m54tp"] Mar 20 15:56:00 crc kubenswrapper[4779]: I0320 15:56:00.261480 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp6lg\" (UniqueName: \"kubernetes.io/projected/855d7064-da55-4450-8e6d-aced2cc5f7b2-kube-api-access-kp6lg\") pod \"auto-csr-approver-29567036-m54tp\" (UID: \"855d7064-da55-4450-8e6d-aced2cc5f7b2\") " pod="openshift-infra/auto-csr-approver-29567036-m54tp" Mar 20 15:56:00 crc kubenswrapper[4779]: I0320 15:56:00.363266 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp6lg\" (UniqueName: \"kubernetes.io/projected/855d7064-da55-4450-8e6d-aced2cc5f7b2-kube-api-access-kp6lg\") pod \"auto-csr-approver-29567036-m54tp\" (UID: \"855d7064-da55-4450-8e6d-aced2cc5f7b2\") " pod="openshift-infra/auto-csr-approver-29567036-m54tp" Mar 20 15:56:00 crc kubenswrapper[4779]: I0320 15:56:00.381915 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp6lg\" (UniqueName: \"kubernetes.io/projected/855d7064-da55-4450-8e6d-aced2cc5f7b2-kube-api-access-kp6lg\") pod \"auto-csr-approver-29567036-m54tp\" (UID: \"855d7064-da55-4450-8e6d-aced2cc5f7b2\") " 
pod="openshift-infra/auto-csr-approver-29567036-m54tp" Mar 20 15:56:00 crc kubenswrapper[4779]: I0320 15:56:00.464135 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567036-m54tp" Mar 20 15:56:00 crc kubenswrapper[4779]: I0320 15:56:00.915208 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567036-m54tp"] Mar 20 15:56:01 crc kubenswrapper[4779]: I0320 15:56:01.253436 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567036-m54tp" event={"ID":"855d7064-da55-4450-8e6d-aced2cc5f7b2","Type":"ContainerStarted","Data":"48337036c5da84886aadaa0182de1971bf683dcb7c12be3a5d6985e26299c060"} Mar 20 15:56:03 crc kubenswrapper[4779]: I0320 15:56:03.271815 4779 generic.go:334] "Generic (PLEG): container finished" podID="855d7064-da55-4450-8e6d-aced2cc5f7b2" containerID="0120c6bb676edc98ec75f49a23ea355ef6a971b7bb06d5c0e9e8a79cf92f232e" exitCode=0 Mar 20 15:56:03 crc kubenswrapper[4779]: I0320 15:56:03.272024 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567036-m54tp" event={"ID":"855d7064-da55-4450-8e6d-aced2cc5f7b2","Type":"ContainerDied","Data":"0120c6bb676edc98ec75f49a23ea355ef6a971b7bb06d5c0e9e8a79cf92f232e"} Mar 20 15:56:04 crc kubenswrapper[4779]: I0320 15:56:04.596832 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567036-m54tp" Mar 20 15:56:04 crc kubenswrapper[4779]: I0320 15:56:04.784154 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp6lg\" (UniqueName: \"kubernetes.io/projected/855d7064-da55-4450-8e6d-aced2cc5f7b2-kube-api-access-kp6lg\") pod \"855d7064-da55-4450-8e6d-aced2cc5f7b2\" (UID: \"855d7064-da55-4450-8e6d-aced2cc5f7b2\") " Mar 20 15:56:04 crc kubenswrapper[4779]: I0320 15:56:04.793103 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/855d7064-da55-4450-8e6d-aced2cc5f7b2-kube-api-access-kp6lg" (OuterVolumeSpecName: "kube-api-access-kp6lg") pod "855d7064-da55-4450-8e6d-aced2cc5f7b2" (UID: "855d7064-da55-4450-8e6d-aced2cc5f7b2"). InnerVolumeSpecName "kube-api-access-kp6lg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:56:04 crc kubenswrapper[4779]: I0320 15:56:04.886002 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kp6lg\" (UniqueName: \"kubernetes.io/projected/855d7064-da55-4450-8e6d-aced2cc5f7b2-kube-api-access-kp6lg\") on node \"crc\" DevicePath \"\"" Mar 20 15:56:05 crc kubenswrapper[4779]: I0320 15:56:05.291163 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567036-m54tp" event={"ID":"855d7064-da55-4450-8e6d-aced2cc5f7b2","Type":"ContainerDied","Data":"48337036c5da84886aadaa0182de1971bf683dcb7c12be3a5d6985e26299c060"} Mar 20 15:56:05 crc kubenswrapper[4779]: I0320 15:56:05.291523 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48337036c5da84886aadaa0182de1971bf683dcb7c12be3a5d6985e26299c060" Mar 20 15:56:05 crc kubenswrapper[4779]: I0320 15:56:05.291215 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567036-m54tp" Mar 20 15:56:05 crc kubenswrapper[4779]: I0320 15:56:05.659872 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567030-vbjsm"] Mar 20 15:56:05 crc kubenswrapper[4779]: I0320 15:56:05.667906 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567030-vbjsm"] Mar 20 15:56:05 crc kubenswrapper[4779]: I0320 15:56:05.829007 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="153ea542-9d02-48ac-a53a-2fb66716e9a8" path="/var/lib/kubelet/pods/153ea542-9d02-48ac-a53a-2fb66716e9a8/volumes" Mar 20 15:56:23 crc kubenswrapper[4779]: I0320 15:56:23.453381 4779 generic.go:334] "Generic (PLEG): container finished" podID="0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb" containerID="3634ad5b6e8a04bf4f56852d6064c64a6e2e1ce46ebb110a881d2c81f5af9f40" exitCode=0 Mar 20 15:56:23 crc kubenswrapper[4779]: I0320 15:56:23.453466 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76p96" event={"ID":"0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb","Type":"ContainerDied","Data":"3634ad5b6e8a04bf4f56852d6064c64a6e2e1ce46ebb110a881d2c81f5af9f40"} Mar 20 15:56:24 crc kubenswrapper[4779]: I0320 15:56:24.850634 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76p96" Mar 20 15:56:24 crc kubenswrapper[4779]: I0320 15:56:24.876081 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb-inventory\") pod \"0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb\" (UID: \"0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb\") " Mar 20 15:56:24 crc kubenswrapper[4779]: I0320 15:56:24.876569 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qx5hq\" (UniqueName: \"kubernetes.io/projected/0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb-kube-api-access-qx5hq\") pod \"0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb\" (UID: \"0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb\") " Mar 20 15:56:24 crc kubenswrapper[4779]: I0320 15:56:24.876792 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb-ssh-key-openstack-edpm-ipam\") pod \"0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb\" (UID: \"0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb\") " Mar 20 15:56:24 crc kubenswrapper[4779]: I0320 15:56:24.887942 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb-kube-api-access-qx5hq" (OuterVolumeSpecName: "kube-api-access-qx5hq") pod "0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb" (UID: "0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb"). InnerVolumeSpecName "kube-api-access-qx5hq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:56:24 crc kubenswrapper[4779]: I0320 15:56:24.909171 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb-inventory" (OuterVolumeSpecName: "inventory") pod "0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb" (UID: "0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:56:24 crc kubenswrapper[4779]: I0320 15:56:24.911395 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb" (UID: "0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:56:24 crc kubenswrapper[4779]: I0320 15:56:24.979405 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qx5hq\" (UniqueName: \"kubernetes.io/projected/0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb-kube-api-access-qx5hq\") on node \"crc\" DevicePath \"\"" Mar 20 15:56:24 crc kubenswrapper[4779]: I0320 15:56:24.979444 4779 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 15:56:24 crc kubenswrapper[4779]: I0320 15:56:24.979454 4779 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 15:56:25 crc kubenswrapper[4779]: I0320 15:56:25.471603 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76p96" event={"ID":"0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb","Type":"ContainerDied","Data":"a9be95a235c1d6e948e131ae87e21a3738992017b28b28cae01678330268e988"} Mar 20 15:56:25 crc kubenswrapper[4779]: I0320 15:56:25.471922 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9be95a235c1d6e948e131ae87e21a3738992017b28b28cae01678330268e988" Mar 20 15:56:25 crc kubenswrapper[4779]: I0320 
15:56:25.471674 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76p96" Mar 20 15:56:25 crc kubenswrapper[4779]: I0320 15:56:25.585936 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7jplj"] Mar 20 15:56:25 crc kubenswrapper[4779]: E0320 15:56:25.586416 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 20 15:56:25 crc kubenswrapper[4779]: I0320 15:56:25.586435 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 20 15:56:25 crc kubenswrapper[4779]: E0320 15:56:25.586451 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="855d7064-da55-4450-8e6d-aced2cc5f7b2" containerName="oc" Mar 20 15:56:25 crc kubenswrapper[4779]: I0320 15:56:25.586459 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="855d7064-da55-4450-8e6d-aced2cc5f7b2" containerName="oc" Mar 20 15:56:25 crc kubenswrapper[4779]: I0320 15:56:25.586637 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="855d7064-da55-4450-8e6d-aced2cc5f7b2" containerName="oc" Mar 20 15:56:25 crc kubenswrapper[4779]: I0320 15:56:25.586655 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 20 15:56:25 crc kubenswrapper[4779]: I0320 15:56:25.587369 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7jplj" Mar 20 15:56:25 crc kubenswrapper[4779]: I0320 15:56:25.592394 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 15:56:25 crc kubenswrapper[4779]: I0320 15:56:25.592536 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 15:56:25 crc kubenswrapper[4779]: I0320 15:56:25.593039 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 15:56:25 crc kubenswrapper[4779]: I0320 15:56:25.593321 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r7sbf" Mar 20 15:56:25 crc kubenswrapper[4779]: I0320 15:56:25.595138 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7jplj"] Mar 20 15:56:25 crc kubenswrapper[4779]: I0320 15:56:25.691671 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/277792af-615f-44a7-b90a-3b5041ce1aa2-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7jplj\" (UID: \"277792af-615f-44a7-b90a-3b5041ce1aa2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7jplj" Mar 20 15:56:25 crc kubenswrapper[4779]: I0320 15:56:25.691827 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mpbl\" (UniqueName: \"kubernetes.io/projected/277792af-615f-44a7-b90a-3b5041ce1aa2-kube-api-access-6mpbl\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7jplj\" (UID: \"277792af-615f-44a7-b90a-3b5041ce1aa2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7jplj" Mar 20 
15:56:25 crc kubenswrapper[4779]: I0320 15:56:25.691860 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/277792af-615f-44a7-b90a-3b5041ce1aa2-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7jplj\" (UID: \"277792af-615f-44a7-b90a-3b5041ce1aa2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7jplj" Mar 20 15:56:25 crc kubenswrapper[4779]: I0320 15:56:25.793641 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mpbl\" (UniqueName: \"kubernetes.io/projected/277792af-615f-44a7-b90a-3b5041ce1aa2-kube-api-access-6mpbl\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7jplj\" (UID: \"277792af-615f-44a7-b90a-3b5041ce1aa2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7jplj" Mar 20 15:56:25 crc kubenswrapper[4779]: I0320 15:56:25.793693 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/277792af-615f-44a7-b90a-3b5041ce1aa2-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7jplj\" (UID: \"277792af-615f-44a7-b90a-3b5041ce1aa2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7jplj" Mar 20 15:56:25 crc kubenswrapper[4779]: I0320 15:56:25.793795 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/277792af-615f-44a7-b90a-3b5041ce1aa2-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7jplj\" (UID: \"277792af-615f-44a7-b90a-3b5041ce1aa2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7jplj" Mar 20 15:56:25 crc kubenswrapper[4779]: I0320 15:56:25.800579 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/277792af-615f-44a7-b90a-3b5041ce1aa2-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7jplj\" (UID: \"277792af-615f-44a7-b90a-3b5041ce1aa2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7jplj" Mar 20 15:56:25 crc kubenswrapper[4779]: I0320 15:56:25.801684 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/277792af-615f-44a7-b90a-3b5041ce1aa2-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7jplj\" (UID: \"277792af-615f-44a7-b90a-3b5041ce1aa2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7jplj" Mar 20 15:56:25 crc kubenswrapper[4779]: I0320 15:56:25.812360 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mpbl\" (UniqueName: \"kubernetes.io/projected/277792af-615f-44a7-b90a-3b5041ce1aa2-kube-api-access-6mpbl\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7jplj\" (UID: \"277792af-615f-44a7-b90a-3b5041ce1aa2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7jplj" Mar 20 15:56:25 crc kubenswrapper[4779]: I0320 15:56:25.912469 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7jplj" Mar 20 15:56:26 crc kubenswrapper[4779]: I0320 15:56:26.416478 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7jplj"] Mar 20 15:56:26 crc kubenswrapper[4779]: W0320 15:56:26.418606 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod277792af_615f_44a7_b90a_3b5041ce1aa2.slice/crio-fe69a0027157298265250f8dbc3640353b41a830434a292463262f126443dfb8 WatchSource:0}: Error finding container fe69a0027157298265250f8dbc3640353b41a830434a292463262f126443dfb8: Status 404 returned error can't find the container with id fe69a0027157298265250f8dbc3640353b41a830434a292463262f126443dfb8 Mar 20 15:56:26 crc kubenswrapper[4779]: I0320 15:56:26.482155 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7jplj" event={"ID":"277792af-615f-44a7-b90a-3b5041ce1aa2","Type":"ContainerStarted","Data":"fe69a0027157298265250f8dbc3640353b41a830434a292463262f126443dfb8"} Mar 20 15:56:27 crc kubenswrapper[4779]: I0320 15:56:27.491541 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7jplj" event={"ID":"277792af-615f-44a7-b90a-3b5041ce1aa2","Type":"ContainerStarted","Data":"ecd3a6f6aa8df355604e89c87a0e6adc2e663fea7211988bcd216c0b7ca225a0"} Mar 20 15:56:27 crc kubenswrapper[4779]: I0320 15:56:27.509918 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7jplj" podStartSLOduration=1.7899133790000001 podStartE2EDuration="2.50989861s" podCreationTimestamp="2026-03-20 15:56:25 +0000 UTC" firstStartedPulling="2026-03-20 15:56:26.420845484 +0000 UTC m=+2003.383361284" lastFinishedPulling="2026-03-20 15:56:27.140830715 +0000 
UTC m=+2004.103346515" observedRunningTime="2026-03-20 15:56:27.507016029 +0000 UTC m=+2004.469531829" watchObservedRunningTime="2026-03-20 15:56:27.50989861 +0000 UTC m=+2004.472414410" Mar 20 15:56:32 crc kubenswrapper[4779]: I0320 15:56:32.533192 4779 generic.go:334] "Generic (PLEG): container finished" podID="277792af-615f-44a7-b90a-3b5041ce1aa2" containerID="ecd3a6f6aa8df355604e89c87a0e6adc2e663fea7211988bcd216c0b7ca225a0" exitCode=0 Mar 20 15:56:32 crc kubenswrapper[4779]: I0320 15:56:32.533282 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7jplj" event={"ID":"277792af-615f-44a7-b90a-3b5041ce1aa2","Type":"ContainerDied","Data":"ecd3a6f6aa8df355604e89c87a0e6adc2e663fea7211988bcd216c0b7ca225a0"} Mar 20 15:56:33 crc kubenswrapper[4779]: I0320 15:56:33.945012 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7jplj" Mar 20 15:56:34 crc kubenswrapper[4779]: I0320 15:56:34.059809 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mpbl\" (UniqueName: \"kubernetes.io/projected/277792af-615f-44a7-b90a-3b5041ce1aa2-kube-api-access-6mpbl\") pod \"277792af-615f-44a7-b90a-3b5041ce1aa2\" (UID: \"277792af-615f-44a7-b90a-3b5041ce1aa2\") " Mar 20 15:56:34 crc kubenswrapper[4779]: I0320 15:56:34.059861 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/277792af-615f-44a7-b90a-3b5041ce1aa2-inventory\") pod \"277792af-615f-44a7-b90a-3b5041ce1aa2\" (UID: \"277792af-615f-44a7-b90a-3b5041ce1aa2\") " Mar 20 15:56:34 crc kubenswrapper[4779]: I0320 15:56:34.059958 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/277792af-615f-44a7-b90a-3b5041ce1aa2-ssh-key-openstack-edpm-ipam\") pod 
\"277792af-615f-44a7-b90a-3b5041ce1aa2\" (UID: \"277792af-615f-44a7-b90a-3b5041ce1aa2\") " Mar 20 15:56:34 crc kubenswrapper[4779]: I0320 15:56:34.065335 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/277792af-615f-44a7-b90a-3b5041ce1aa2-kube-api-access-6mpbl" (OuterVolumeSpecName: "kube-api-access-6mpbl") pod "277792af-615f-44a7-b90a-3b5041ce1aa2" (UID: "277792af-615f-44a7-b90a-3b5041ce1aa2"). InnerVolumeSpecName "kube-api-access-6mpbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:56:34 crc kubenswrapper[4779]: I0320 15:56:34.086212 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/277792af-615f-44a7-b90a-3b5041ce1aa2-inventory" (OuterVolumeSpecName: "inventory") pod "277792af-615f-44a7-b90a-3b5041ce1aa2" (UID: "277792af-615f-44a7-b90a-3b5041ce1aa2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:56:34 crc kubenswrapper[4779]: I0320 15:56:34.091552 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/277792af-615f-44a7-b90a-3b5041ce1aa2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "277792af-615f-44a7-b90a-3b5041ce1aa2" (UID: "277792af-615f-44a7-b90a-3b5041ce1aa2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:56:34 crc kubenswrapper[4779]: I0320 15:56:34.162248 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mpbl\" (UniqueName: \"kubernetes.io/projected/277792af-615f-44a7-b90a-3b5041ce1aa2-kube-api-access-6mpbl\") on node \"crc\" DevicePath \"\"" Mar 20 15:56:34 crc kubenswrapper[4779]: I0320 15:56:34.162286 4779 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/277792af-615f-44a7-b90a-3b5041ce1aa2-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 15:56:34 crc kubenswrapper[4779]: I0320 15:56:34.162296 4779 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/277792af-615f-44a7-b90a-3b5041ce1aa2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 15:56:34 crc kubenswrapper[4779]: I0320 15:56:34.551127 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7jplj" event={"ID":"277792af-615f-44a7-b90a-3b5041ce1aa2","Type":"ContainerDied","Data":"fe69a0027157298265250f8dbc3640353b41a830434a292463262f126443dfb8"} Mar 20 15:56:34 crc kubenswrapper[4779]: I0320 15:56:34.551172 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe69a0027157298265250f8dbc3640353b41a830434a292463262f126443dfb8" Mar 20 15:56:34 crc kubenswrapper[4779]: I0320 15:56:34.551175 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7jplj" Mar 20 15:56:34 crc kubenswrapper[4779]: I0320 15:56:34.625367 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-p6d88"] Mar 20 15:56:34 crc kubenswrapper[4779]: E0320 15:56:34.625879 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="277792af-615f-44a7-b90a-3b5041ce1aa2" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 20 15:56:34 crc kubenswrapper[4779]: I0320 15:56:34.625901 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="277792af-615f-44a7-b90a-3b5041ce1aa2" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 20 15:56:34 crc kubenswrapper[4779]: I0320 15:56:34.626163 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="277792af-615f-44a7-b90a-3b5041ce1aa2" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 20 15:56:34 crc kubenswrapper[4779]: I0320 15:56:34.626974 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p6d88" Mar 20 15:56:34 crc kubenswrapper[4779]: I0320 15:56:34.632191 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 15:56:34 crc kubenswrapper[4779]: I0320 15:56:34.632401 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 15:56:34 crc kubenswrapper[4779]: I0320 15:56:34.632498 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 15:56:34 crc kubenswrapper[4779]: I0320 15:56:34.632518 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r7sbf" Mar 20 15:56:34 crc kubenswrapper[4779]: I0320 15:56:34.647712 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-p6d88"] Mar 20 15:56:34 crc kubenswrapper[4779]: I0320 15:56:34.672322 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9z7q\" (UniqueName: \"kubernetes.io/projected/b412995c-cebc-47ec-8ba2-9644a0f65c18-kube-api-access-v9z7q\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-p6d88\" (UID: \"b412995c-cebc-47ec-8ba2-9644a0f65c18\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p6d88" Mar 20 15:56:34 crc kubenswrapper[4779]: I0320 15:56:34.672459 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b412995c-cebc-47ec-8ba2-9644a0f65c18-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-p6d88\" (UID: \"b412995c-cebc-47ec-8ba2-9644a0f65c18\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p6d88" Mar 20 15:56:34 crc kubenswrapper[4779]: I0320 15:56:34.672536 4779 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b412995c-cebc-47ec-8ba2-9644a0f65c18-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-p6d88\" (UID: \"b412995c-cebc-47ec-8ba2-9644a0f65c18\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p6d88" Mar 20 15:56:34 crc kubenswrapper[4779]: I0320 15:56:34.773627 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b412995c-cebc-47ec-8ba2-9644a0f65c18-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-p6d88\" (UID: \"b412995c-cebc-47ec-8ba2-9644a0f65c18\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p6d88" Mar 20 15:56:34 crc kubenswrapper[4779]: I0320 15:56:34.773718 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b412995c-cebc-47ec-8ba2-9644a0f65c18-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-p6d88\" (UID: \"b412995c-cebc-47ec-8ba2-9644a0f65c18\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p6d88" Mar 20 15:56:34 crc kubenswrapper[4779]: I0320 15:56:34.773805 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9z7q\" (UniqueName: \"kubernetes.io/projected/b412995c-cebc-47ec-8ba2-9644a0f65c18-kube-api-access-v9z7q\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-p6d88\" (UID: \"b412995c-cebc-47ec-8ba2-9644a0f65c18\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p6d88" Mar 20 15:56:34 crc kubenswrapper[4779]: I0320 15:56:34.778248 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/b412995c-cebc-47ec-8ba2-9644a0f65c18-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-p6d88\" (UID: \"b412995c-cebc-47ec-8ba2-9644a0f65c18\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p6d88" Mar 20 15:56:34 crc kubenswrapper[4779]: I0320 15:56:34.780872 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b412995c-cebc-47ec-8ba2-9644a0f65c18-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-p6d88\" (UID: \"b412995c-cebc-47ec-8ba2-9644a0f65c18\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p6d88" Mar 20 15:56:34 crc kubenswrapper[4779]: I0320 15:56:34.798858 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9z7q\" (UniqueName: \"kubernetes.io/projected/b412995c-cebc-47ec-8ba2-9644a0f65c18-kube-api-access-v9z7q\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-p6d88\" (UID: \"b412995c-cebc-47ec-8ba2-9644a0f65c18\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p6d88" Mar 20 15:56:34 crc kubenswrapper[4779]: I0320 15:56:34.958414 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p6d88" Mar 20 15:56:35 crc kubenswrapper[4779]: I0320 15:56:35.440643 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-p6d88"] Mar 20 15:56:35 crc kubenswrapper[4779]: I0320 15:56:35.560687 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p6d88" event={"ID":"b412995c-cebc-47ec-8ba2-9644a0f65c18","Type":"ContainerStarted","Data":"ca13e7e00237bd2fa606862378e841d5527ce3718bde43c3f9398c76ffb0cdde"} Mar 20 15:56:36 crc kubenswrapper[4779]: I0320 15:56:36.571267 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p6d88" event={"ID":"b412995c-cebc-47ec-8ba2-9644a0f65c18","Type":"ContainerStarted","Data":"4097e08ddb4f5400d8a1f15c53ef338c6f2ab6690a5354fd8797c698ed2a96a8"} Mar 20 15:56:36 crc kubenswrapper[4779]: I0320 15:56:36.601241 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p6d88" podStartSLOduration=2.139488508 podStartE2EDuration="2.601223175s" podCreationTimestamp="2026-03-20 15:56:34 +0000 UTC" firstStartedPulling="2026-03-20 15:56:35.445638028 +0000 UTC m=+2012.408153828" lastFinishedPulling="2026-03-20 15:56:35.907372695 +0000 UTC m=+2012.869888495" observedRunningTime="2026-03-20 15:56:36.591193627 +0000 UTC m=+2013.553709437" watchObservedRunningTime="2026-03-20 15:56:36.601223175 +0000 UTC m=+2013.563738975" Mar 20 15:56:39 crc kubenswrapper[4779]: I0320 15:56:39.038141 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0bdf-account-create-update-58t5l"] Mar 20 15:56:39 crc kubenswrapper[4779]: I0320 15:56:39.048512 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0bdf-account-create-update-58t5l"] Mar 20 15:56:39 crc 
kubenswrapper[4779]: I0320 15:56:39.819776 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fcb9b6b-ec11-4073-aef1-2ff8cf21545f" path="/var/lib/kubelet/pods/6fcb9b6b-ec11-4073-aef1-2ff8cf21545f/volumes" Mar 20 15:56:40 crc kubenswrapper[4779]: I0320 15:56:40.034207 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-016f-account-create-update-qpvhj"] Mar 20 15:56:40 crc kubenswrapper[4779]: I0320 15:56:40.044086 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-d9cf-account-create-update-4rn7w"] Mar 20 15:56:40 crc kubenswrapper[4779]: I0320 15:56:40.053215 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-016f-account-create-update-qpvhj"] Mar 20 15:56:40 crc kubenswrapper[4779]: I0320 15:56:40.064618 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-rlrzh"] Mar 20 15:56:40 crc kubenswrapper[4779]: I0320 15:56:40.077329 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-rx87f"] Mar 20 15:56:40 crc kubenswrapper[4779]: I0320 15:56:40.085966 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-d9cf-account-create-update-4rn7w"] Mar 20 15:56:40 crc kubenswrapper[4779]: I0320 15:56:40.093753 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-22qhc"] Mar 20 15:56:40 crc kubenswrapper[4779]: I0320 15:56:40.101845 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-rlrzh"] Mar 20 15:56:40 crc kubenswrapper[4779]: I0320 15:56:40.109358 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-rx87f"] Mar 20 15:56:40 crc kubenswrapper[4779]: I0320 15:56:40.118210 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-22qhc"] Mar 20 15:56:41 crc kubenswrapper[4779]: I0320 15:56:41.819324 4779 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0331a592-114a-4ab4-8c73-f01b650f50f4" path="/var/lib/kubelet/pods/0331a592-114a-4ab4-8c73-f01b650f50f4/volumes" Mar 20 15:56:41 crc kubenswrapper[4779]: I0320 15:56:41.820437 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84fff0c4-e9cf-4459-a31e-c6dbb2f58096" path="/var/lib/kubelet/pods/84fff0c4-e9cf-4459-a31e-c6dbb2f58096/volumes" Mar 20 15:56:41 crc kubenswrapper[4779]: I0320 15:56:41.821197 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d8ef970-d33e-4918-9b19-a0dc472f63f1" path="/var/lib/kubelet/pods/8d8ef970-d33e-4918-9b19-a0dc472f63f1/volumes" Mar 20 15:56:41 crc kubenswrapper[4779]: I0320 15:56:41.821861 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95cc2ea8-9931-4f65-9691-71bf32261ebb" path="/var/lib/kubelet/pods/95cc2ea8-9931-4f65-9691-71bf32261ebb/volumes" Mar 20 15:56:41 crc kubenswrapper[4779]: I0320 15:56:41.823174 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7d867d7-f83c-43e1-ac12-4ebd4814b2b7" path="/var/lib/kubelet/pods/a7d867d7-f83c-43e1-ac12-4ebd4814b2b7/volumes" Mar 20 15:56:42 crc kubenswrapper[4779]: I0320 15:56:42.659537 4779 scope.go:117] "RemoveContainer" containerID="f1b4871a28aa275fa3995a1a419e178a6ee58b083aa0f9d572b3bb2150f489d7" Mar 20 15:56:42 crc kubenswrapper[4779]: I0320 15:56:42.700712 4779 scope.go:117] "RemoveContainer" containerID="6eb9c7558338a2a8f9403b1cadb9a0568c465b9179578393cd5b53ac4c53cf15" Mar 20 15:56:42 crc kubenswrapper[4779]: I0320 15:56:42.734011 4779 scope.go:117] "RemoveContainer" containerID="601bd00bd22831031142ba0f3e1204a8158fd9291e561783b6b5e1dda5b875ba" Mar 20 15:56:42 crc kubenswrapper[4779]: I0320 15:56:42.776706 4779 scope.go:117] "RemoveContainer" containerID="63e890f90cfcb8a844f2bbb13f038b655127b921c80b0676eb0cf362946ce15c" Mar 20 15:56:42 crc kubenswrapper[4779]: I0320 15:56:42.821076 4779 scope.go:117] "RemoveContainer" 
containerID="013d676dfc533185f0a66460e192ae1fb4c5d234181ba36c0fd2bae06aeb6c59" Mar 20 15:56:42 crc kubenswrapper[4779]: I0320 15:56:42.862281 4779 scope.go:117] "RemoveContainer" containerID="cf4939e2333b84deca27326e90ca65eb7d4187f312440e079e535607e21cdb19" Mar 20 15:56:42 crc kubenswrapper[4779]: I0320 15:56:42.906120 4779 scope.go:117] "RemoveContainer" containerID="c5930c8216183733a9a5dcfbda76408f7a45077793a538dd9b4f5acbedaf830b" Mar 20 15:56:42 crc kubenswrapper[4779]: I0320 15:56:42.924301 4779 scope.go:117] "RemoveContainer" containerID="5e89d3a4cbd0626117310c3f7cacc52a9b5a84cc4f4a764f17de36df44024385" Mar 20 15:57:09 crc kubenswrapper[4779]: I0320 15:57:09.040757 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5gc7l"] Mar 20 15:57:09 crc kubenswrapper[4779]: I0320 15:57:09.057390 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5gc7l"] Mar 20 15:57:09 crc kubenswrapper[4779]: I0320 15:57:09.820508 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="003b9556-88db-4e20-8371-3863a2dd44b8" path="/var/lib/kubelet/pods/003b9556-88db-4e20-8371-3863a2dd44b8/volumes" Mar 20 15:57:13 crc kubenswrapper[4779]: I0320 15:57:13.873698 4779 generic.go:334] "Generic (PLEG): container finished" podID="b412995c-cebc-47ec-8ba2-9644a0f65c18" containerID="4097e08ddb4f5400d8a1f15c53ef338c6f2ab6690a5354fd8797c698ed2a96a8" exitCode=0 Mar 20 15:57:13 crc kubenswrapper[4779]: I0320 15:57:13.873778 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p6d88" event={"ID":"b412995c-cebc-47ec-8ba2-9644a0f65c18","Type":"ContainerDied","Data":"4097e08ddb4f5400d8a1f15c53ef338c6f2ab6690a5354fd8797c698ed2a96a8"} Mar 20 15:57:15 crc kubenswrapper[4779]: I0320 15:57:15.393402 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p6d88" Mar 20 15:57:15 crc kubenswrapper[4779]: I0320 15:57:15.517281 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b412995c-cebc-47ec-8ba2-9644a0f65c18-inventory\") pod \"b412995c-cebc-47ec-8ba2-9644a0f65c18\" (UID: \"b412995c-cebc-47ec-8ba2-9644a0f65c18\") " Mar 20 15:57:15 crc kubenswrapper[4779]: I0320 15:57:15.517507 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9z7q\" (UniqueName: \"kubernetes.io/projected/b412995c-cebc-47ec-8ba2-9644a0f65c18-kube-api-access-v9z7q\") pod \"b412995c-cebc-47ec-8ba2-9644a0f65c18\" (UID: \"b412995c-cebc-47ec-8ba2-9644a0f65c18\") " Mar 20 15:57:15 crc kubenswrapper[4779]: I0320 15:57:15.517568 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b412995c-cebc-47ec-8ba2-9644a0f65c18-ssh-key-openstack-edpm-ipam\") pod \"b412995c-cebc-47ec-8ba2-9644a0f65c18\" (UID: \"b412995c-cebc-47ec-8ba2-9644a0f65c18\") " Mar 20 15:57:15 crc kubenswrapper[4779]: I0320 15:57:15.523368 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b412995c-cebc-47ec-8ba2-9644a0f65c18-kube-api-access-v9z7q" (OuterVolumeSpecName: "kube-api-access-v9z7q") pod "b412995c-cebc-47ec-8ba2-9644a0f65c18" (UID: "b412995c-cebc-47ec-8ba2-9644a0f65c18"). InnerVolumeSpecName "kube-api-access-v9z7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:57:15 crc kubenswrapper[4779]: I0320 15:57:15.546097 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b412995c-cebc-47ec-8ba2-9644a0f65c18-inventory" (OuterVolumeSpecName: "inventory") pod "b412995c-cebc-47ec-8ba2-9644a0f65c18" (UID: "b412995c-cebc-47ec-8ba2-9644a0f65c18"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:57:15 crc kubenswrapper[4779]: I0320 15:57:15.562470 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b412995c-cebc-47ec-8ba2-9644a0f65c18-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b412995c-cebc-47ec-8ba2-9644a0f65c18" (UID: "b412995c-cebc-47ec-8ba2-9644a0f65c18"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:57:15 crc kubenswrapper[4779]: I0320 15:57:15.619530 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9z7q\" (UniqueName: \"kubernetes.io/projected/b412995c-cebc-47ec-8ba2-9644a0f65c18-kube-api-access-v9z7q\") on node \"crc\" DevicePath \"\"" Mar 20 15:57:15 crc kubenswrapper[4779]: I0320 15:57:15.619562 4779 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b412995c-cebc-47ec-8ba2-9644a0f65c18-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 15:57:15 crc kubenswrapper[4779]: I0320 15:57:15.619572 4779 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b412995c-cebc-47ec-8ba2-9644a0f65c18-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 15:57:15 crc kubenswrapper[4779]: I0320 15:57:15.894501 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p6d88" event={"ID":"b412995c-cebc-47ec-8ba2-9644a0f65c18","Type":"ContainerDied","Data":"ca13e7e00237bd2fa606862378e841d5527ce3718bde43c3f9398c76ffb0cdde"} Mar 20 15:57:15 crc kubenswrapper[4779]: I0320 15:57:15.894547 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca13e7e00237bd2fa606862378e841d5527ce3718bde43c3f9398c76ffb0cdde" Mar 20 15:57:15 crc kubenswrapper[4779]: I0320 
15:57:15.894550 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p6d88" Mar 20 15:57:15 crc kubenswrapper[4779]: I0320 15:57:15.992284 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6ckgf"] Mar 20 15:57:15 crc kubenswrapper[4779]: E0320 15:57:15.992767 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b412995c-cebc-47ec-8ba2-9644a0f65c18" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 20 15:57:15 crc kubenswrapper[4779]: I0320 15:57:15.992792 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="b412995c-cebc-47ec-8ba2-9644a0f65c18" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 20 15:57:15 crc kubenswrapper[4779]: I0320 15:57:15.993147 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="b412995c-cebc-47ec-8ba2-9644a0f65c18" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 20 15:57:15 crc kubenswrapper[4779]: I0320 15:57:15.994011 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6ckgf" Mar 20 15:57:16 crc kubenswrapper[4779]: I0320 15:57:16.005689 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6ckgf"] Mar 20 15:57:16 crc kubenswrapper[4779]: I0320 15:57:16.006825 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 15:57:16 crc kubenswrapper[4779]: I0320 15:57:16.006891 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 15:57:16 crc kubenswrapper[4779]: I0320 15:57:16.007176 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r7sbf" Mar 20 15:57:16 crc kubenswrapper[4779]: I0320 15:57:16.007486 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 15:57:16 crc kubenswrapper[4779]: I0320 15:57:16.129868 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5181a11a-08d8-4440-acc3-c8e6c65edded-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6ckgf\" (UID: \"5181a11a-08d8-4440-acc3-c8e6c65edded\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6ckgf" Mar 20 15:57:16 crc kubenswrapper[4779]: I0320 15:57:16.130041 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx82f\" (UniqueName: \"kubernetes.io/projected/5181a11a-08d8-4440-acc3-c8e6c65edded-kube-api-access-bx82f\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6ckgf\" (UID: \"5181a11a-08d8-4440-acc3-c8e6c65edded\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6ckgf" Mar 20 15:57:16 crc 
kubenswrapper[4779]: I0320 15:57:16.130237 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5181a11a-08d8-4440-acc3-c8e6c65edded-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6ckgf\" (UID: \"5181a11a-08d8-4440-acc3-c8e6c65edded\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6ckgf" Mar 20 15:57:16 crc kubenswrapper[4779]: I0320 15:57:16.232028 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx82f\" (UniqueName: \"kubernetes.io/projected/5181a11a-08d8-4440-acc3-c8e6c65edded-kube-api-access-bx82f\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6ckgf\" (UID: \"5181a11a-08d8-4440-acc3-c8e6c65edded\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6ckgf" Mar 20 15:57:16 crc kubenswrapper[4779]: I0320 15:57:16.232179 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5181a11a-08d8-4440-acc3-c8e6c65edded-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6ckgf\" (UID: \"5181a11a-08d8-4440-acc3-c8e6c65edded\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6ckgf" Mar 20 15:57:16 crc kubenswrapper[4779]: I0320 15:57:16.232278 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5181a11a-08d8-4440-acc3-c8e6c65edded-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6ckgf\" (UID: \"5181a11a-08d8-4440-acc3-c8e6c65edded\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6ckgf" Mar 20 15:57:16 crc kubenswrapper[4779]: I0320 15:57:16.237798 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/5181a11a-08d8-4440-acc3-c8e6c65edded-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6ckgf\" (UID: \"5181a11a-08d8-4440-acc3-c8e6c65edded\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6ckgf" Mar 20 15:57:16 crc kubenswrapper[4779]: I0320 15:57:16.237838 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5181a11a-08d8-4440-acc3-c8e6c65edded-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6ckgf\" (UID: \"5181a11a-08d8-4440-acc3-c8e6c65edded\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6ckgf" Mar 20 15:57:16 crc kubenswrapper[4779]: I0320 15:57:16.253080 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx82f\" (UniqueName: \"kubernetes.io/projected/5181a11a-08d8-4440-acc3-c8e6c65edded-kube-api-access-bx82f\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6ckgf\" (UID: \"5181a11a-08d8-4440-acc3-c8e6c65edded\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6ckgf" Mar 20 15:57:16 crc kubenswrapper[4779]: I0320 15:57:16.341962 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6ckgf" Mar 20 15:57:16 crc kubenswrapper[4779]: I0320 15:57:16.852794 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6ckgf"] Mar 20 15:57:16 crc kubenswrapper[4779]: I0320 15:57:16.903587 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6ckgf" event={"ID":"5181a11a-08d8-4440-acc3-c8e6c65edded","Type":"ContainerStarted","Data":"7612c10e459af1e4c94587d60613c2e9fe6a41f2fdbd6871ec6fddd947a218ca"} Mar 20 15:57:17 crc kubenswrapper[4779]: I0320 15:57:17.913844 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6ckgf" event={"ID":"5181a11a-08d8-4440-acc3-c8e6c65edded","Type":"ContainerStarted","Data":"a27df803ef8941036ac39b52c1e6ee8acb01a234b2e513e13267ba913aca12a7"} Mar 20 15:57:33 crc kubenswrapper[4779]: I0320 15:57:33.049973 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6ckgf" podStartSLOduration=17.578292174 podStartE2EDuration="18.049947655s" podCreationTimestamp="2026-03-20 15:57:15 +0000 UTC" firstStartedPulling="2026-03-20 15:57:16.857930301 +0000 UTC m=+2053.820446101" lastFinishedPulling="2026-03-20 15:57:17.329585782 +0000 UTC m=+2054.292101582" observedRunningTime="2026-03-20 15:57:17.935454913 +0000 UTC m=+2054.897970713" watchObservedRunningTime="2026-03-20 15:57:33.049947655 +0000 UTC m=+2070.012463455" Mar 20 15:57:33 crc kubenswrapper[4779]: I0320 15:57:33.051908 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-wrqkc"] Mar 20 15:57:33 crc kubenswrapper[4779]: I0320 15:57:33.063038 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-wrqkc"] Mar 20 15:57:33 crc kubenswrapper[4779]: I0320 
15:57:33.819512 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e379766f-391c-4e7f-8a69-84d1b624d88b" path="/var/lib/kubelet/pods/e379766f-391c-4e7f-8a69-84d1b624d88b/volumes" Mar 20 15:57:34 crc kubenswrapper[4779]: I0320 15:57:34.030167 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dzzmf"] Mar 20 15:57:34 crc kubenswrapper[4779]: I0320 15:57:34.036139 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dzzmf"] Mar 20 15:57:35 crc kubenswrapper[4779]: I0320 15:57:35.830016 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26c50037-e945-4f6a-8e49-20b4a969a77e" path="/var/lib/kubelet/pods/26c50037-e945-4f6a-8e49-20b4a969a77e/volumes" Mar 20 15:57:43 crc kubenswrapper[4779]: I0320 15:57:43.085755 4779 scope.go:117] "RemoveContainer" containerID="6d5eb0a2ade93dca914e2a3209548039f10656ad01e8f3b622f6f1d36865e42c" Mar 20 15:57:43 crc kubenswrapper[4779]: I0320 15:57:43.134467 4779 scope.go:117] "RemoveContainer" containerID="08368ff48bb95bff64cc68c56cc8e79d1b18031a02cc9749742ca2395f65d492" Mar 20 15:57:43 crc kubenswrapper[4779]: I0320 15:57:43.187772 4779 scope.go:117] "RemoveContainer" containerID="c6f752d508e988eaab31e57b3e593cfcccece950051706320499670d5c9c74d2" Mar 20 15:57:49 crc kubenswrapper[4779]: I0320 15:57:49.262419 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ck6tj"] Mar 20 15:57:49 crc kubenswrapper[4779]: I0320 15:57:49.266103 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ck6tj" Mar 20 15:57:49 crc kubenswrapper[4779]: I0320 15:57:49.271887 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ck6tj"] Mar 20 15:57:49 crc kubenswrapper[4779]: I0320 15:57:49.394299 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cbnj\" (UniqueName: \"kubernetes.io/projected/9d564261-3696-4ee0-8945-e28c8082c1e5-kube-api-access-8cbnj\") pod \"redhat-operators-ck6tj\" (UID: \"9d564261-3696-4ee0-8945-e28c8082c1e5\") " pod="openshift-marketplace/redhat-operators-ck6tj" Mar 20 15:57:49 crc kubenswrapper[4779]: I0320 15:57:49.394384 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d564261-3696-4ee0-8945-e28c8082c1e5-utilities\") pod \"redhat-operators-ck6tj\" (UID: \"9d564261-3696-4ee0-8945-e28c8082c1e5\") " pod="openshift-marketplace/redhat-operators-ck6tj" Mar 20 15:57:49 crc kubenswrapper[4779]: I0320 15:57:49.394620 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d564261-3696-4ee0-8945-e28c8082c1e5-catalog-content\") pod \"redhat-operators-ck6tj\" (UID: \"9d564261-3696-4ee0-8945-e28c8082c1e5\") " pod="openshift-marketplace/redhat-operators-ck6tj" Mar 20 15:57:49 crc kubenswrapper[4779]: I0320 15:57:49.496555 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cbnj\" (UniqueName: \"kubernetes.io/projected/9d564261-3696-4ee0-8945-e28c8082c1e5-kube-api-access-8cbnj\") pod \"redhat-operators-ck6tj\" (UID: \"9d564261-3696-4ee0-8945-e28c8082c1e5\") " pod="openshift-marketplace/redhat-operators-ck6tj" Mar 20 15:57:49 crc kubenswrapper[4779]: I0320 15:57:49.496670 4779 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d564261-3696-4ee0-8945-e28c8082c1e5-utilities\") pod \"redhat-operators-ck6tj\" (UID: \"9d564261-3696-4ee0-8945-e28c8082c1e5\") " pod="openshift-marketplace/redhat-operators-ck6tj" Mar 20 15:57:49 crc kubenswrapper[4779]: I0320 15:57:49.496731 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d564261-3696-4ee0-8945-e28c8082c1e5-catalog-content\") pod \"redhat-operators-ck6tj\" (UID: \"9d564261-3696-4ee0-8945-e28c8082c1e5\") " pod="openshift-marketplace/redhat-operators-ck6tj" Mar 20 15:57:49 crc kubenswrapper[4779]: I0320 15:57:49.497334 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d564261-3696-4ee0-8945-e28c8082c1e5-catalog-content\") pod \"redhat-operators-ck6tj\" (UID: \"9d564261-3696-4ee0-8945-e28c8082c1e5\") " pod="openshift-marketplace/redhat-operators-ck6tj" Mar 20 15:57:49 crc kubenswrapper[4779]: I0320 15:57:49.497333 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d564261-3696-4ee0-8945-e28c8082c1e5-utilities\") pod \"redhat-operators-ck6tj\" (UID: \"9d564261-3696-4ee0-8945-e28c8082c1e5\") " pod="openshift-marketplace/redhat-operators-ck6tj" Mar 20 15:57:49 crc kubenswrapper[4779]: I0320 15:57:49.530613 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cbnj\" (UniqueName: \"kubernetes.io/projected/9d564261-3696-4ee0-8945-e28c8082c1e5-kube-api-access-8cbnj\") pod \"redhat-operators-ck6tj\" (UID: \"9d564261-3696-4ee0-8945-e28c8082c1e5\") " pod="openshift-marketplace/redhat-operators-ck6tj" Mar 20 15:57:49 crc kubenswrapper[4779]: I0320 15:57:49.594516 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ck6tj" Mar 20 15:57:50 crc kubenswrapper[4779]: I0320 15:57:50.083652 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ck6tj"] Mar 20 15:57:50 crc kubenswrapper[4779]: I0320 15:57:50.174659 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ck6tj" event={"ID":"9d564261-3696-4ee0-8945-e28c8082c1e5","Type":"ContainerStarted","Data":"d11f4042f7adb5eebdacb1ff926d94b7c8451dd797343a2ce9c17cdac3bc9e5d"} Mar 20 15:57:51 crc kubenswrapper[4779]: I0320 15:57:51.183943 4779 generic.go:334] "Generic (PLEG): container finished" podID="9d564261-3696-4ee0-8945-e28c8082c1e5" containerID="1a54cd0b9bec33eb4b6b4670eaa0c86437ab246f4aee15ade349b65236ac41a6" exitCode=0 Mar 20 15:57:51 crc kubenswrapper[4779]: I0320 15:57:51.184017 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ck6tj" event={"ID":"9d564261-3696-4ee0-8945-e28c8082c1e5","Type":"ContainerDied","Data":"1a54cd0b9bec33eb4b6b4670eaa0c86437ab246f4aee15ade349b65236ac41a6"} Mar 20 15:57:52 crc kubenswrapper[4779]: I0320 15:57:52.193586 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ck6tj" event={"ID":"9d564261-3696-4ee0-8945-e28c8082c1e5","Type":"ContainerStarted","Data":"e2e8766992b1084dae814eeb716871ecaad90ca508743bec8cd72012b5bd0605"} Mar 20 15:57:57 crc kubenswrapper[4779]: I0320 15:57:57.230289 4779 generic.go:334] "Generic (PLEG): container finished" podID="9d564261-3696-4ee0-8945-e28c8082c1e5" containerID="e2e8766992b1084dae814eeb716871ecaad90ca508743bec8cd72012b5bd0605" exitCode=0 Mar 20 15:57:57 crc kubenswrapper[4779]: I0320 15:57:57.230363 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ck6tj" 
event={"ID":"9d564261-3696-4ee0-8945-e28c8082c1e5","Type":"ContainerDied","Data":"e2e8766992b1084dae814eeb716871ecaad90ca508743bec8cd72012b5bd0605"} Mar 20 15:57:58 crc kubenswrapper[4779]: I0320 15:57:58.242644 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ck6tj" event={"ID":"9d564261-3696-4ee0-8945-e28c8082c1e5","Type":"ContainerStarted","Data":"2473067d913c3e241f5c4dc9036bce13d1fcc4f0a475db440342b4f903104bd6"} Mar 20 15:57:58 crc kubenswrapper[4779]: I0320 15:57:58.264494 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ck6tj" podStartSLOduration=2.818805892 podStartE2EDuration="9.264473372s" podCreationTimestamp="2026-03-20 15:57:49 +0000 UTC" firstStartedPulling="2026-03-20 15:57:51.186426878 +0000 UTC m=+2088.148942678" lastFinishedPulling="2026-03-20 15:57:57.632094358 +0000 UTC m=+2094.594610158" observedRunningTime="2026-03-20 15:57:58.261019257 +0000 UTC m=+2095.223535057" watchObservedRunningTime="2026-03-20 15:57:58.264473372 +0000 UTC m=+2095.226989182" Mar 20 15:57:59 crc kubenswrapper[4779]: I0320 15:57:59.595191 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ck6tj" Mar 20 15:57:59 crc kubenswrapper[4779]: I0320 15:57:59.595269 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ck6tj" Mar 20 15:58:00 crc kubenswrapper[4779]: I0320 15:58:00.150769 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567038-hpjgt"] Mar 20 15:58:00 crc kubenswrapper[4779]: I0320 15:58:00.152815 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567038-hpjgt" Mar 20 15:58:00 crc kubenswrapper[4779]: I0320 15:58:00.154822 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:58:00 crc kubenswrapper[4779]: I0320 15:58:00.155185 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:58:00 crc kubenswrapper[4779]: I0320 15:58:00.156092 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-92t9d" Mar 20 15:58:00 crc kubenswrapper[4779]: I0320 15:58:00.160767 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567038-hpjgt"] Mar 20 15:58:00 crc kubenswrapper[4779]: I0320 15:58:00.307500 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcm72\" (UniqueName: \"kubernetes.io/projected/467d5500-9158-43cd-a096-986a0eb3319a-kube-api-access-gcm72\") pod \"auto-csr-approver-29567038-hpjgt\" (UID: \"467d5500-9158-43cd-a096-986a0eb3319a\") " pod="openshift-infra/auto-csr-approver-29567038-hpjgt" Mar 20 15:58:00 crc kubenswrapper[4779]: I0320 15:58:00.409814 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcm72\" (UniqueName: \"kubernetes.io/projected/467d5500-9158-43cd-a096-986a0eb3319a-kube-api-access-gcm72\") pod \"auto-csr-approver-29567038-hpjgt\" (UID: \"467d5500-9158-43cd-a096-986a0eb3319a\") " pod="openshift-infra/auto-csr-approver-29567038-hpjgt" Mar 20 15:58:00 crc kubenswrapper[4779]: I0320 15:58:00.437740 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcm72\" (UniqueName: \"kubernetes.io/projected/467d5500-9158-43cd-a096-986a0eb3319a-kube-api-access-gcm72\") pod \"auto-csr-approver-29567038-hpjgt\" (UID: \"467d5500-9158-43cd-a096-986a0eb3319a\") " 
pod="openshift-infra/auto-csr-approver-29567038-hpjgt" Mar 20 15:58:00 crc kubenswrapper[4779]: I0320 15:58:00.472421 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567038-hpjgt" Mar 20 15:58:00 crc kubenswrapper[4779]: I0320 15:58:00.671315 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ck6tj" podUID="9d564261-3696-4ee0-8945-e28c8082c1e5" containerName="registry-server" probeResult="failure" output=< Mar 20 15:58:00 crc kubenswrapper[4779]: timeout: failed to connect service ":50051" within 1s Mar 20 15:58:00 crc kubenswrapper[4779]: > Mar 20 15:58:00 crc kubenswrapper[4779]: I0320 15:58:00.954741 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567038-hpjgt"] Mar 20 15:58:01 crc kubenswrapper[4779]: I0320 15:58:01.286377 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567038-hpjgt" event={"ID":"467d5500-9158-43cd-a096-986a0eb3319a","Type":"ContainerStarted","Data":"758cbfeab7846d1ef5ca4ed5ceddc9b98efa6558fe1a1dbd8c74181793488071"} Mar 20 15:58:02 crc kubenswrapper[4779]: I0320 15:58:02.297330 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567038-hpjgt" event={"ID":"467d5500-9158-43cd-a096-986a0eb3319a","Type":"ContainerStarted","Data":"cd409a584cc4406249ba8623c15bbfee8fafd2141218e43845b7fa65ee7b789f"} Mar 20 15:58:02 crc kubenswrapper[4779]: I0320 15:58:02.314753 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567038-hpjgt" podStartSLOduration=1.4460674 podStartE2EDuration="2.314734351s" podCreationTimestamp="2026-03-20 15:58:00 +0000 UTC" firstStartedPulling="2026-03-20 15:58:00.963204723 +0000 UTC m=+2097.925720523" lastFinishedPulling="2026-03-20 15:58:01.831871674 +0000 UTC m=+2098.794387474" observedRunningTime="2026-03-20 
15:58:02.311199184 +0000 UTC m=+2099.273714984" watchObservedRunningTime="2026-03-20 15:58:02.314734351 +0000 UTC m=+2099.277250151" Mar 20 15:58:03 crc kubenswrapper[4779]: I0320 15:58:03.306372 4779 generic.go:334] "Generic (PLEG): container finished" podID="467d5500-9158-43cd-a096-986a0eb3319a" containerID="cd409a584cc4406249ba8623c15bbfee8fafd2141218e43845b7fa65ee7b789f" exitCode=0 Mar 20 15:58:03 crc kubenswrapper[4779]: I0320 15:58:03.306468 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567038-hpjgt" event={"ID":"467d5500-9158-43cd-a096-986a0eb3319a","Type":"ContainerDied","Data":"cd409a584cc4406249ba8623c15bbfee8fafd2141218e43845b7fa65ee7b789f"} Mar 20 15:58:04 crc kubenswrapper[4779]: I0320 15:58:04.636728 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567038-hpjgt" Mar 20 15:58:04 crc kubenswrapper[4779]: I0320 15:58:04.792247 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcm72\" (UniqueName: \"kubernetes.io/projected/467d5500-9158-43cd-a096-986a0eb3319a-kube-api-access-gcm72\") pod \"467d5500-9158-43cd-a096-986a0eb3319a\" (UID: \"467d5500-9158-43cd-a096-986a0eb3319a\") " Mar 20 15:58:04 crc kubenswrapper[4779]: I0320 15:58:04.798564 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/467d5500-9158-43cd-a096-986a0eb3319a-kube-api-access-gcm72" (OuterVolumeSpecName: "kube-api-access-gcm72") pod "467d5500-9158-43cd-a096-986a0eb3319a" (UID: "467d5500-9158-43cd-a096-986a0eb3319a"). InnerVolumeSpecName "kube-api-access-gcm72". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:58:04 crc kubenswrapper[4779]: I0320 15:58:04.895035 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcm72\" (UniqueName: \"kubernetes.io/projected/467d5500-9158-43cd-a096-986a0eb3319a-kube-api-access-gcm72\") on node \"crc\" DevicePath \"\"" Mar 20 15:58:05 crc kubenswrapper[4779]: I0320 15:58:05.325995 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567038-hpjgt" event={"ID":"467d5500-9158-43cd-a096-986a0eb3319a","Type":"ContainerDied","Data":"758cbfeab7846d1ef5ca4ed5ceddc9b98efa6558fe1a1dbd8c74181793488071"} Mar 20 15:58:05 crc kubenswrapper[4779]: I0320 15:58:05.326378 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="758cbfeab7846d1ef5ca4ed5ceddc9b98efa6558fe1a1dbd8c74181793488071" Mar 20 15:58:05 crc kubenswrapper[4779]: I0320 15:58:05.326050 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567038-hpjgt" Mar 20 15:58:05 crc kubenswrapper[4779]: I0320 15:58:05.328628 4779 generic.go:334] "Generic (PLEG): container finished" podID="5181a11a-08d8-4440-acc3-c8e6c65edded" containerID="a27df803ef8941036ac39b52c1e6ee8acb01a234b2e513e13267ba913aca12a7" exitCode=0 Mar 20 15:58:05 crc kubenswrapper[4779]: I0320 15:58:05.328677 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6ckgf" event={"ID":"5181a11a-08d8-4440-acc3-c8e6c65edded","Type":"ContainerDied","Data":"a27df803ef8941036ac39b52c1e6ee8acb01a234b2e513e13267ba913aca12a7"} Mar 20 15:58:05 crc kubenswrapper[4779]: I0320 15:58:05.387071 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567032-md578"] Mar 20 15:58:05 crc kubenswrapper[4779]: I0320 15:58:05.394905 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-infra/auto-csr-approver-29567032-md578"] Mar 20 15:58:05 crc kubenswrapper[4779]: I0320 15:58:05.824167 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eeef9a4-9be4-4067-a699-d2503ce4acc7" path="/var/lib/kubelet/pods/4eeef9a4-9be4-4067-a699-d2503ce4acc7/volumes" Mar 20 15:58:06 crc kubenswrapper[4779]: I0320 15:58:06.902360 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6ckgf" Mar 20 15:58:07 crc kubenswrapper[4779]: I0320 15:58:07.040640 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5181a11a-08d8-4440-acc3-c8e6c65edded-inventory\") pod \"5181a11a-08d8-4440-acc3-c8e6c65edded\" (UID: \"5181a11a-08d8-4440-acc3-c8e6c65edded\") " Mar 20 15:58:07 crc kubenswrapper[4779]: I0320 15:58:07.040859 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5181a11a-08d8-4440-acc3-c8e6c65edded-ssh-key-openstack-edpm-ipam\") pod \"5181a11a-08d8-4440-acc3-c8e6c65edded\" (UID: \"5181a11a-08d8-4440-acc3-c8e6c65edded\") " Mar 20 15:58:07 crc kubenswrapper[4779]: I0320 15:58:07.040901 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx82f\" (UniqueName: \"kubernetes.io/projected/5181a11a-08d8-4440-acc3-c8e6c65edded-kube-api-access-bx82f\") pod \"5181a11a-08d8-4440-acc3-c8e6c65edded\" (UID: \"5181a11a-08d8-4440-acc3-c8e6c65edded\") " Mar 20 15:58:07 crc kubenswrapper[4779]: I0320 15:58:07.045709 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5181a11a-08d8-4440-acc3-c8e6c65edded-kube-api-access-bx82f" (OuterVolumeSpecName: "kube-api-access-bx82f") pod "5181a11a-08d8-4440-acc3-c8e6c65edded" (UID: "5181a11a-08d8-4440-acc3-c8e6c65edded"). 
InnerVolumeSpecName "kube-api-access-bx82f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:58:07 crc kubenswrapper[4779]: I0320 15:58:07.073632 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5181a11a-08d8-4440-acc3-c8e6c65edded-inventory" (OuterVolumeSpecName: "inventory") pod "5181a11a-08d8-4440-acc3-c8e6c65edded" (UID: "5181a11a-08d8-4440-acc3-c8e6c65edded"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:58:07 crc kubenswrapper[4779]: I0320 15:58:07.073957 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5181a11a-08d8-4440-acc3-c8e6c65edded-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5181a11a-08d8-4440-acc3-c8e6c65edded" (UID: "5181a11a-08d8-4440-acc3-c8e6c65edded"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:58:07 crc kubenswrapper[4779]: I0320 15:58:07.143686 4779 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5181a11a-08d8-4440-acc3-c8e6c65edded-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 15:58:07 crc kubenswrapper[4779]: I0320 15:58:07.143736 4779 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5181a11a-08d8-4440-acc3-c8e6c65edded-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 15:58:07 crc kubenswrapper[4779]: I0320 15:58:07.143751 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx82f\" (UniqueName: \"kubernetes.io/projected/5181a11a-08d8-4440-acc3-c8e6c65edded-kube-api-access-bx82f\") on node \"crc\" DevicePath \"\"" Mar 20 15:58:07 crc kubenswrapper[4779]: I0320 15:58:07.346031 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6ckgf" event={"ID":"5181a11a-08d8-4440-acc3-c8e6c65edded","Type":"ContainerDied","Data":"7612c10e459af1e4c94587d60613c2e9fe6a41f2fdbd6871ec6fddd947a218ca"} Mar 20 15:58:07 crc kubenswrapper[4779]: I0320 15:58:07.346078 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7612c10e459af1e4c94587d60613c2e9fe6a41f2fdbd6871ec6fddd947a218ca" Mar 20 15:58:07 crc kubenswrapper[4779]: I0320 15:58:07.346191 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6ckgf" Mar 20 15:58:07 crc kubenswrapper[4779]: I0320 15:58:07.422577 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-zggqb"] Mar 20 15:58:07 crc kubenswrapper[4779]: E0320 15:58:07.423039 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5181a11a-08d8-4440-acc3-c8e6c65edded" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 20 15:58:07 crc kubenswrapper[4779]: I0320 15:58:07.423055 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="5181a11a-08d8-4440-acc3-c8e6c65edded" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 20 15:58:07 crc kubenswrapper[4779]: E0320 15:58:07.423065 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="467d5500-9158-43cd-a096-986a0eb3319a" containerName="oc" Mar 20 15:58:07 crc kubenswrapper[4779]: I0320 15:58:07.423071 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="467d5500-9158-43cd-a096-986a0eb3319a" containerName="oc" Mar 20 15:58:07 crc kubenswrapper[4779]: I0320 15:58:07.423275 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="5181a11a-08d8-4440-acc3-c8e6c65edded" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 20 15:58:07 crc kubenswrapper[4779]: I0320 15:58:07.423290 4779 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="467d5500-9158-43cd-a096-986a0eb3319a" containerName="oc" Mar 20 15:58:07 crc kubenswrapper[4779]: I0320 15:58:07.424174 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-zggqb" Mar 20 15:58:07 crc kubenswrapper[4779]: I0320 15:58:07.427650 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 15:58:07 crc kubenswrapper[4779]: I0320 15:58:07.427865 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 15:58:07 crc kubenswrapper[4779]: I0320 15:58:07.428084 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r7sbf" Mar 20 15:58:07 crc kubenswrapper[4779]: I0320 15:58:07.428352 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 15:58:07 crc kubenswrapper[4779]: I0320 15:58:07.439310 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-zggqb"] Mar 20 15:58:07 crc kubenswrapper[4779]: I0320 15:58:07.559789 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cfgt\" (UniqueName: \"kubernetes.io/projected/a81a28a7-b38a-4491-ba2a-7de988a7e02f-kube-api-access-9cfgt\") pod \"ssh-known-hosts-edpm-deployment-zggqb\" (UID: \"a81a28a7-b38a-4491-ba2a-7de988a7e02f\") " pod="openstack/ssh-known-hosts-edpm-deployment-zggqb" Mar 20 15:58:07 crc kubenswrapper[4779]: I0320 15:58:07.559848 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a81a28a7-b38a-4491-ba2a-7de988a7e02f-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-zggqb\" (UID: \"a81a28a7-b38a-4491-ba2a-7de988a7e02f\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-zggqb" Mar 20 15:58:07 crc kubenswrapper[4779]: I0320 15:58:07.559971 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a81a28a7-b38a-4491-ba2a-7de988a7e02f-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-zggqb\" (UID: \"a81a28a7-b38a-4491-ba2a-7de988a7e02f\") " pod="openstack/ssh-known-hosts-edpm-deployment-zggqb" Mar 20 15:58:07 crc kubenswrapper[4779]: I0320 15:58:07.661205 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cfgt\" (UniqueName: \"kubernetes.io/projected/a81a28a7-b38a-4491-ba2a-7de988a7e02f-kube-api-access-9cfgt\") pod \"ssh-known-hosts-edpm-deployment-zggqb\" (UID: \"a81a28a7-b38a-4491-ba2a-7de988a7e02f\") " pod="openstack/ssh-known-hosts-edpm-deployment-zggqb" Mar 20 15:58:07 crc kubenswrapper[4779]: I0320 15:58:07.661259 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a81a28a7-b38a-4491-ba2a-7de988a7e02f-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-zggqb\" (UID: \"a81a28a7-b38a-4491-ba2a-7de988a7e02f\") " pod="openstack/ssh-known-hosts-edpm-deployment-zggqb" Mar 20 15:58:07 crc kubenswrapper[4779]: I0320 15:58:07.661359 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a81a28a7-b38a-4491-ba2a-7de988a7e02f-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-zggqb\" (UID: \"a81a28a7-b38a-4491-ba2a-7de988a7e02f\") " pod="openstack/ssh-known-hosts-edpm-deployment-zggqb" Mar 20 15:58:07 crc kubenswrapper[4779]: I0320 15:58:07.665412 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/a81a28a7-b38a-4491-ba2a-7de988a7e02f-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-zggqb\" (UID: \"a81a28a7-b38a-4491-ba2a-7de988a7e02f\") " pod="openstack/ssh-known-hosts-edpm-deployment-zggqb" Mar 20 15:58:07 crc kubenswrapper[4779]: I0320 15:58:07.666664 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a81a28a7-b38a-4491-ba2a-7de988a7e02f-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-zggqb\" (UID: \"a81a28a7-b38a-4491-ba2a-7de988a7e02f\") " pod="openstack/ssh-known-hosts-edpm-deployment-zggqb" Mar 20 15:58:07 crc kubenswrapper[4779]: I0320 15:58:07.694902 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cfgt\" (UniqueName: \"kubernetes.io/projected/a81a28a7-b38a-4491-ba2a-7de988a7e02f-kube-api-access-9cfgt\") pod \"ssh-known-hosts-edpm-deployment-zggqb\" (UID: \"a81a28a7-b38a-4491-ba2a-7de988a7e02f\") " pod="openstack/ssh-known-hosts-edpm-deployment-zggqb" Mar 20 15:58:07 crc kubenswrapper[4779]: I0320 15:58:07.739722 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-zggqb" Mar 20 15:58:08 crc kubenswrapper[4779]: I0320 15:58:08.097205 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-zggqb"] Mar 20 15:58:08 crc kubenswrapper[4779]: I0320 15:58:08.354680 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-zggqb" event={"ID":"a81a28a7-b38a-4491-ba2a-7de988a7e02f","Type":"ContainerStarted","Data":"bae15ad9add47ef1dbd62957ca520e7d56d4372d617fd4d1b9c7c84ab0149415"} Mar 20 15:58:09 crc kubenswrapper[4779]: I0320 15:58:09.376990 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-zggqb" event={"ID":"a81a28a7-b38a-4491-ba2a-7de988a7e02f","Type":"ContainerStarted","Data":"114634745a66c7210dc0c263f631a43d3482704a7217daa2ced9da13e8fe48a8"} Mar 20 15:58:09 crc kubenswrapper[4779]: I0320 15:58:09.394220 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-zggqb" podStartSLOduration=1.879039396 podStartE2EDuration="2.39420476s" podCreationTimestamp="2026-03-20 15:58:07 +0000 UTC" firstStartedPulling="2026-03-20 15:58:08.092023278 +0000 UTC m=+2105.054539078" lastFinishedPulling="2026-03-20 15:58:08.607188632 +0000 UTC m=+2105.569704442" observedRunningTime="2026-03-20 15:58:09.392357835 +0000 UTC m=+2106.354873635" watchObservedRunningTime="2026-03-20 15:58:09.39420476 +0000 UTC m=+2106.356720560" Mar 20 15:58:09 crc kubenswrapper[4779]: I0320 15:58:09.647315 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ck6tj" Mar 20 15:58:09 crc kubenswrapper[4779]: I0320 15:58:09.690362 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ck6tj" Mar 20 15:58:09 crc kubenswrapper[4779]: I0320 15:58:09.878892 4779 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ck6tj"] Mar 20 15:58:11 crc kubenswrapper[4779]: I0320 15:58:11.391720 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ck6tj" podUID="9d564261-3696-4ee0-8945-e28c8082c1e5" containerName="registry-server" containerID="cri-o://2473067d913c3e241f5c4dc9036bce13d1fcc4f0a475db440342b4f903104bd6" gracePeriod=2 Mar 20 15:58:11 crc kubenswrapper[4779]: I0320 15:58:11.842367 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ck6tj" Mar 20 15:58:11 crc kubenswrapper[4779]: I0320 15:58:11.953296 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d564261-3696-4ee0-8945-e28c8082c1e5-catalog-content\") pod \"9d564261-3696-4ee0-8945-e28c8082c1e5\" (UID: \"9d564261-3696-4ee0-8945-e28c8082c1e5\") " Mar 20 15:58:11 crc kubenswrapper[4779]: I0320 15:58:11.953412 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d564261-3696-4ee0-8945-e28c8082c1e5-utilities\") pod \"9d564261-3696-4ee0-8945-e28c8082c1e5\" (UID: \"9d564261-3696-4ee0-8945-e28c8082c1e5\") " Mar 20 15:58:11 crc kubenswrapper[4779]: I0320 15:58:11.953522 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cbnj\" (UniqueName: \"kubernetes.io/projected/9d564261-3696-4ee0-8945-e28c8082c1e5-kube-api-access-8cbnj\") pod \"9d564261-3696-4ee0-8945-e28c8082c1e5\" (UID: \"9d564261-3696-4ee0-8945-e28c8082c1e5\") " Mar 20 15:58:11 crc kubenswrapper[4779]: I0320 15:58:11.954919 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d564261-3696-4ee0-8945-e28c8082c1e5-utilities" (OuterVolumeSpecName: "utilities") pod 
"9d564261-3696-4ee0-8945-e28c8082c1e5" (UID: "9d564261-3696-4ee0-8945-e28c8082c1e5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:58:11 crc kubenswrapper[4779]: I0320 15:58:11.968388 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d564261-3696-4ee0-8945-e28c8082c1e5-kube-api-access-8cbnj" (OuterVolumeSpecName: "kube-api-access-8cbnj") pod "9d564261-3696-4ee0-8945-e28c8082c1e5" (UID: "9d564261-3696-4ee0-8945-e28c8082c1e5"). InnerVolumeSpecName "kube-api-access-8cbnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:58:12 crc kubenswrapper[4779]: I0320 15:58:12.056613 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cbnj\" (UniqueName: \"kubernetes.io/projected/9d564261-3696-4ee0-8945-e28c8082c1e5-kube-api-access-8cbnj\") on node \"crc\" DevicePath \"\"" Mar 20 15:58:12 crc kubenswrapper[4779]: I0320 15:58:12.056658 4779 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d564261-3696-4ee0-8945-e28c8082c1e5-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:58:12 crc kubenswrapper[4779]: I0320 15:58:12.083708 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d564261-3696-4ee0-8945-e28c8082c1e5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9d564261-3696-4ee0-8945-e28c8082c1e5" (UID: "9d564261-3696-4ee0-8945-e28c8082c1e5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:58:12 crc kubenswrapper[4779]: I0320 15:58:12.158383 4779 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d564261-3696-4ee0-8945-e28c8082c1e5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 15:58:12 crc kubenswrapper[4779]: I0320 15:58:12.406356 4779 generic.go:334] "Generic (PLEG): container finished" podID="9d564261-3696-4ee0-8945-e28c8082c1e5" containerID="2473067d913c3e241f5c4dc9036bce13d1fcc4f0a475db440342b4f903104bd6" exitCode=0 Mar 20 15:58:12 crc kubenswrapper[4779]: I0320 15:58:12.406515 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ck6tj" Mar 20 15:58:12 crc kubenswrapper[4779]: I0320 15:58:12.406560 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ck6tj" event={"ID":"9d564261-3696-4ee0-8945-e28c8082c1e5","Type":"ContainerDied","Data":"2473067d913c3e241f5c4dc9036bce13d1fcc4f0a475db440342b4f903104bd6"} Mar 20 15:58:12 crc kubenswrapper[4779]: I0320 15:58:12.407032 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ck6tj" event={"ID":"9d564261-3696-4ee0-8945-e28c8082c1e5","Type":"ContainerDied","Data":"d11f4042f7adb5eebdacb1ff926d94b7c8451dd797343a2ce9c17cdac3bc9e5d"} Mar 20 15:58:12 crc kubenswrapper[4779]: I0320 15:58:12.407067 4779 scope.go:117] "RemoveContainer" containerID="2473067d913c3e241f5c4dc9036bce13d1fcc4f0a475db440342b4f903104bd6" Mar 20 15:58:12 crc kubenswrapper[4779]: I0320 15:58:12.431474 4779 scope.go:117] "RemoveContainer" containerID="e2e8766992b1084dae814eeb716871ecaad90ca508743bec8cd72012b5bd0605" Mar 20 15:58:12 crc kubenswrapper[4779]: I0320 15:58:12.450854 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ck6tj"] Mar 20 15:58:12 crc kubenswrapper[4779]: I0320 
15:58:12.476607 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ck6tj"] Mar 20 15:58:12 crc kubenswrapper[4779]: I0320 15:58:12.476973 4779 scope.go:117] "RemoveContainer" containerID="1a54cd0b9bec33eb4b6b4670eaa0c86437ab246f4aee15ade349b65236ac41a6" Mar 20 15:58:12 crc kubenswrapper[4779]: I0320 15:58:12.538895 4779 scope.go:117] "RemoveContainer" containerID="2473067d913c3e241f5c4dc9036bce13d1fcc4f0a475db440342b4f903104bd6" Mar 20 15:58:12 crc kubenswrapper[4779]: E0320 15:58:12.539426 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2473067d913c3e241f5c4dc9036bce13d1fcc4f0a475db440342b4f903104bd6\": container with ID starting with 2473067d913c3e241f5c4dc9036bce13d1fcc4f0a475db440342b4f903104bd6 not found: ID does not exist" containerID="2473067d913c3e241f5c4dc9036bce13d1fcc4f0a475db440342b4f903104bd6" Mar 20 15:58:12 crc kubenswrapper[4779]: I0320 15:58:12.539503 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2473067d913c3e241f5c4dc9036bce13d1fcc4f0a475db440342b4f903104bd6"} err="failed to get container status \"2473067d913c3e241f5c4dc9036bce13d1fcc4f0a475db440342b4f903104bd6\": rpc error: code = NotFound desc = could not find container \"2473067d913c3e241f5c4dc9036bce13d1fcc4f0a475db440342b4f903104bd6\": container with ID starting with 2473067d913c3e241f5c4dc9036bce13d1fcc4f0a475db440342b4f903104bd6 not found: ID does not exist" Mar 20 15:58:12 crc kubenswrapper[4779]: I0320 15:58:12.539552 4779 scope.go:117] "RemoveContainer" containerID="e2e8766992b1084dae814eeb716871ecaad90ca508743bec8cd72012b5bd0605" Mar 20 15:58:12 crc kubenswrapper[4779]: E0320 15:58:12.540671 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2e8766992b1084dae814eeb716871ecaad90ca508743bec8cd72012b5bd0605\": container with ID 
starting with e2e8766992b1084dae814eeb716871ecaad90ca508743bec8cd72012b5bd0605 not found: ID does not exist" containerID="e2e8766992b1084dae814eeb716871ecaad90ca508743bec8cd72012b5bd0605" Mar 20 15:58:12 crc kubenswrapper[4779]: I0320 15:58:12.540708 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2e8766992b1084dae814eeb716871ecaad90ca508743bec8cd72012b5bd0605"} err="failed to get container status \"e2e8766992b1084dae814eeb716871ecaad90ca508743bec8cd72012b5bd0605\": rpc error: code = NotFound desc = could not find container \"e2e8766992b1084dae814eeb716871ecaad90ca508743bec8cd72012b5bd0605\": container with ID starting with e2e8766992b1084dae814eeb716871ecaad90ca508743bec8cd72012b5bd0605 not found: ID does not exist" Mar 20 15:58:12 crc kubenswrapper[4779]: I0320 15:58:12.540735 4779 scope.go:117] "RemoveContainer" containerID="1a54cd0b9bec33eb4b6b4670eaa0c86437ab246f4aee15ade349b65236ac41a6" Mar 20 15:58:12 crc kubenswrapper[4779]: E0320 15:58:12.541099 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a54cd0b9bec33eb4b6b4670eaa0c86437ab246f4aee15ade349b65236ac41a6\": container with ID starting with 1a54cd0b9bec33eb4b6b4670eaa0c86437ab246f4aee15ade349b65236ac41a6 not found: ID does not exist" containerID="1a54cd0b9bec33eb4b6b4670eaa0c86437ab246f4aee15ade349b65236ac41a6" Mar 20 15:58:12 crc kubenswrapper[4779]: I0320 15:58:12.541187 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a54cd0b9bec33eb4b6b4670eaa0c86437ab246f4aee15ade349b65236ac41a6"} err="failed to get container status \"1a54cd0b9bec33eb4b6b4670eaa0c86437ab246f4aee15ade349b65236ac41a6\": rpc error: code = NotFound desc = could not find container \"1a54cd0b9bec33eb4b6b4670eaa0c86437ab246f4aee15ade349b65236ac41a6\": container with ID starting with 1a54cd0b9bec33eb4b6b4670eaa0c86437ab246f4aee15ade349b65236ac41a6 not found: 
ID does not exist" Mar 20 15:58:13 crc kubenswrapper[4779]: I0320 15:58:13.821669 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d564261-3696-4ee0-8945-e28c8082c1e5" path="/var/lib/kubelet/pods/9d564261-3696-4ee0-8945-e28c8082c1e5/volumes" Mar 20 15:58:15 crc kubenswrapper[4779]: I0320 15:58:15.433994 4779 generic.go:334] "Generic (PLEG): container finished" podID="a81a28a7-b38a-4491-ba2a-7de988a7e02f" containerID="114634745a66c7210dc0c263f631a43d3482704a7217daa2ced9da13e8fe48a8" exitCode=0 Mar 20 15:58:15 crc kubenswrapper[4779]: I0320 15:58:15.434042 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-zggqb" event={"ID":"a81a28a7-b38a-4491-ba2a-7de988a7e02f","Type":"ContainerDied","Data":"114634745a66c7210dc0c263f631a43d3482704a7217daa2ced9da13e8fe48a8"} Mar 20 15:58:16 crc kubenswrapper[4779]: I0320 15:58:16.870029 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-zggqb" Mar 20 15:58:16 crc kubenswrapper[4779]: I0320 15:58:16.966715 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a81a28a7-b38a-4491-ba2a-7de988a7e02f-inventory-0\") pod \"a81a28a7-b38a-4491-ba2a-7de988a7e02f\" (UID: \"a81a28a7-b38a-4491-ba2a-7de988a7e02f\") " Mar 20 15:58:16 crc kubenswrapper[4779]: I0320 15:58:16.966774 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cfgt\" (UniqueName: \"kubernetes.io/projected/a81a28a7-b38a-4491-ba2a-7de988a7e02f-kube-api-access-9cfgt\") pod \"a81a28a7-b38a-4491-ba2a-7de988a7e02f\" (UID: \"a81a28a7-b38a-4491-ba2a-7de988a7e02f\") " Mar 20 15:58:16 crc kubenswrapper[4779]: I0320 15:58:16.966848 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/a81a28a7-b38a-4491-ba2a-7de988a7e02f-ssh-key-openstack-edpm-ipam\") pod \"a81a28a7-b38a-4491-ba2a-7de988a7e02f\" (UID: \"a81a28a7-b38a-4491-ba2a-7de988a7e02f\") " Mar 20 15:58:16 crc kubenswrapper[4779]: I0320 15:58:16.973310 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a81a28a7-b38a-4491-ba2a-7de988a7e02f-kube-api-access-9cfgt" (OuterVolumeSpecName: "kube-api-access-9cfgt") pod "a81a28a7-b38a-4491-ba2a-7de988a7e02f" (UID: "a81a28a7-b38a-4491-ba2a-7de988a7e02f"). InnerVolumeSpecName "kube-api-access-9cfgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:58:16 crc kubenswrapper[4779]: I0320 15:58:16.995353 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a81a28a7-b38a-4491-ba2a-7de988a7e02f-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "a81a28a7-b38a-4491-ba2a-7de988a7e02f" (UID: "a81a28a7-b38a-4491-ba2a-7de988a7e02f"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:58:16 crc kubenswrapper[4779]: I0320 15:58:16.995718 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a81a28a7-b38a-4491-ba2a-7de988a7e02f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a81a28a7-b38a-4491-ba2a-7de988a7e02f" (UID: "a81a28a7-b38a-4491-ba2a-7de988a7e02f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:58:17 crc kubenswrapper[4779]: I0320 15:58:17.068912 4779 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a81a28a7-b38a-4491-ba2a-7de988a7e02f-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 20 15:58:17 crc kubenswrapper[4779]: I0320 15:58:17.068945 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cfgt\" (UniqueName: \"kubernetes.io/projected/a81a28a7-b38a-4491-ba2a-7de988a7e02f-kube-api-access-9cfgt\") on node \"crc\" DevicePath \"\"" Mar 20 15:58:17 crc kubenswrapper[4779]: I0320 15:58:17.068956 4779 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a81a28a7-b38a-4491-ba2a-7de988a7e02f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 15:58:17 crc kubenswrapper[4779]: I0320 15:58:17.459389 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-zggqb" event={"ID":"a81a28a7-b38a-4491-ba2a-7de988a7e02f","Type":"ContainerDied","Data":"bae15ad9add47ef1dbd62957ca520e7d56d4372d617fd4d1b9c7c84ab0149415"} Mar 20 15:58:17 crc kubenswrapper[4779]: I0320 15:58:17.459416 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-zggqb" Mar 20 15:58:17 crc kubenswrapper[4779]: I0320 15:58:17.459438 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bae15ad9add47ef1dbd62957ca520e7d56d4372d617fd4d1b9c7c84ab0149415" Mar 20 15:58:17 crc kubenswrapper[4779]: I0320 15:58:17.531795 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-px8s4"] Mar 20 15:58:17 crc kubenswrapper[4779]: E0320 15:58:17.532163 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d564261-3696-4ee0-8945-e28c8082c1e5" containerName="extract-content" Mar 20 15:58:17 crc kubenswrapper[4779]: I0320 15:58:17.532183 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d564261-3696-4ee0-8945-e28c8082c1e5" containerName="extract-content" Mar 20 15:58:17 crc kubenswrapper[4779]: E0320 15:58:17.532206 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d564261-3696-4ee0-8945-e28c8082c1e5" containerName="extract-utilities" Mar 20 15:58:17 crc kubenswrapper[4779]: I0320 15:58:17.532214 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d564261-3696-4ee0-8945-e28c8082c1e5" containerName="extract-utilities" Mar 20 15:58:17 crc kubenswrapper[4779]: E0320 15:58:17.532244 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a81a28a7-b38a-4491-ba2a-7de988a7e02f" containerName="ssh-known-hosts-edpm-deployment" Mar 20 15:58:17 crc kubenswrapper[4779]: I0320 15:58:17.532250 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="a81a28a7-b38a-4491-ba2a-7de988a7e02f" containerName="ssh-known-hosts-edpm-deployment" Mar 20 15:58:17 crc kubenswrapper[4779]: E0320 15:58:17.532271 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d564261-3696-4ee0-8945-e28c8082c1e5" containerName="registry-server" Mar 20 15:58:17 crc kubenswrapper[4779]: I0320 15:58:17.532277 4779 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="9d564261-3696-4ee0-8945-e28c8082c1e5" containerName="registry-server" Mar 20 15:58:17 crc kubenswrapper[4779]: I0320 15:58:17.532452 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d564261-3696-4ee0-8945-e28c8082c1e5" containerName="registry-server" Mar 20 15:58:17 crc kubenswrapper[4779]: I0320 15:58:17.532469 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="a81a28a7-b38a-4491-ba2a-7de988a7e02f" containerName="ssh-known-hosts-edpm-deployment" Mar 20 15:58:17 crc kubenswrapper[4779]: I0320 15:58:17.533619 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-px8s4" Mar 20 15:58:17 crc kubenswrapper[4779]: I0320 15:58:17.535867 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 15:58:17 crc kubenswrapper[4779]: I0320 15:58:17.536025 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 15:58:17 crc kubenswrapper[4779]: I0320 15:58:17.536227 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 15:58:17 crc kubenswrapper[4779]: I0320 15:58:17.537997 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r7sbf" Mar 20 15:58:17 crc kubenswrapper[4779]: I0320 15:58:17.542044 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-px8s4"] Mar 20 15:58:17 crc kubenswrapper[4779]: I0320 15:58:17.680205 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb8zr\" (UniqueName: \"kubernetes.io/projected/4448472a-99b1-4060-aab8-4301c9de37a2-kube-api-access-xb8zr\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-px8s4\" (UID: 
\"4448472a-99b1-4060-aab8-4301c9de37a2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-px8s4" Mar 20 15:58:17 crc kubenswrapper[4779]: I0320 15:58:17.680286 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4448472a-99b1-4060-aab8-4301c9de37a2-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-px8s4\" (UID: \"4448472a-99b1-4060-aab8-4301c9de37a2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-px8s4" Mar 20 15:58:17 crc kubenswrapper[4779]: I0320 15:58:17.680450 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4448472a-99b1-4060-aab8-4301c9de37a2-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-px8s4\" (UID: \"4448472a-99b1-4060-aab8-4301c9de37a2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-px8s4" Mar 20 15:58:17 crc kubenswrapper[4779]: I0320 15:58:17.781755 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4448472a-99b1-4060-aab8-4301c9de37a2-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-px8s4\" (UID: \"4448472a-99b1-4060-aab8-4301c9de37a2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-px8s4" Mar 20 15:58:17 crc kubenswrapper[4779]: I0320 15:58:17.781939 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4448472a-99b1-4060-aab8-4301c9de37a2-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-px8s4\" (UID: \"4448472a-99b1-4060-aab8-4301c9de37a2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-px8s4" Mar 20 15:58:17 crc kubenswrapper[4779]: I0320 15:58:17.782012 4779 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xb8zr\" (UniqueName: \"kubernetes.io/projected/4448472a-99b1-4060-aab8-4301c9de37a2-kube-api-access-xb8zr\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-px8s4\" (UID: \"4448472a-99b1-4060-aab8-4301c9de37a2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-px8s4" Mar 20 15:58:17 crc kubenswrapper[4779]: I0320 15:58:17.785882 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4448472a-99b1-4060-aab8-4301c9de37a2-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-px8s4\" (UID: \"4448472a-99b1-4060-aab8-4301c9de37a2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-px8s4" Mar 20 15:58:17 crc kubenswrapper[4779]: I0320 15:58:17.786565 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4448472a-99b1-4060-aab8-4301c9de37a2-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-px8s4\" (UID: \"4448472a-99b1-4060-aab8-4301c9de37a2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-px8s4" Mar 20 15:58:17 crc kubenswrapper[4779]: I0320 15:58:17.798750 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb8zr\" (UniqueName: \"kubernetes.io/projected/4448472a-99b1-4060-aab8-4301c9de37a2-kube-api-access-xb8zr\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-px8s4\" (UID: \"4448472a-99b1-4060-aab8-4301c9de37a2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-px8s4" Mar 20 15:58:17 crc kubenswrapper[4779]: I0320 15:58:17.849524 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-px8s4" Mar 20 15:58:18 crc kubenswrapper[4779]: I0320 15:58:18.034289 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-vjfs8"] Mar 20 15:58:18 crc kubenswrapper[4779]: I0320 15:58:18.044572 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-vjfs8"] Mar 20 15:58:18 crc kubenswrapper[4779]: I0320 15:58:18.400536 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-px8s4"] Mar 20 15:58:18 crc kubenswrapper[4779]: I0320 15:58:18.468980 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-px8s4" event={"ID":"4448472a-99b1-4060-aab8-4301c9de37a2","Type":"ContainerStarted","Data":"9bcecc314c5c96449edcad5233d90bc08cd1740b05bc68d240c8b457a56529be"} Mar 20 15:58:19 crc kubenswrapper[4779]: I0320 15:58:19.478635 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-px8s4" event={"ID":"4448472a-99b1-4060-aab8-4301c9de37a2","Type":"ContainerStarted","Data":"1fa0f9148fb3a5d5e8828453a1f5742bfec5d3e48810697aae1f89e32ce36fc3"} Mar 20 15:58:19 crc kubenswrapper[4779]: I0320 15:58:19.503779 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-px8s4" podStartSLOduration=1.959448509 podStartE2EDuration="2.503761282s" podCreationTimestamp="2026-03-20 15:58:17 +0000 UTC" firstStartedPulling="2026-03-20 15:58:18.404559505 +0000 UTC m=+2115.367075305" lastFinishedPulling="2026-03-20 15:58:18.948872278 +0000 UTC m=+2115.911388078" observedRunningTime="2026-03-20 15:58:19.494968924 +0000 UTC m=+2116.457484724" watchObservedRunningTime="2026-03-20 15:58:19.503761282 +0000 UTC m=+2116.466277082" Mar 20 15:58:19 crc kubenswrapper[4779]: I0320 15:58:19.819959 4779 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78be39d2-e84b-42b6-a4d4-7abaeaaf1a57" path="/var/lib/kubelet/pods/78be39d2-e84b-42b6-a4d4-7abaeaaf1a57/volumes" Mar 20 15:58:25 crc kubenswrapper[4779]: I0320 15:58:25.149501 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:58:25 crc kubenswrapper[4779]: I0320 15:58:25.150071 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:58:27 crc kubenswrapper[4779]: I0320 15:58:27.552381 4779 generic.go:334] "Generic (PLEG): container finished" podID="4448472a-99b1-4060-aab8-4301c9de37a2" containerID="1fa0f9148fb3a5d5e8828453a1f5742bfec5d3e48810697aae1f89e32ce36fc3" exitCode=0 Mar 20 15:58:27 crc kubenswrapper[4779]: I0320 15:58:27.552480 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-px8s4" event={"ID":"4448472a-99b1-4060-aab8-4301c9de37a2","Type":"ContainerDied","Data":"1fa0f9148fb3a5d5e8828453a1f5742bfec5d3e48810697aae1f89e32ce36fc3"} Mar 20 15:58:29 crc kubenswrapper[4779]: I0320 15:58:29.000382 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-px8s4" Mar 20 15:58:29 crc kubenswrapper[4779]: I0320 15:58:29.080806 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xb8zr\" (UniqueName: \"kubernetes.io/projected/4448472a-99b1-4060-aab8-4301c9de37a2-kube-api-access-xb8zr\") pod \"4448472a-99b1-4060-aab8-4301c9de37a2\" (UID: \"4448472a-99b1-4060-aab8-4301c9de37a2\") " Mar 20 15:58:29 crc kubenswrapper[4779]: I0320 15:58:29.080909 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4448472a-99b1-4060-aab8-4301c9de37a2-ssh-key-openstack-edpm-ipam\") pod \"4448472a-99b1-4060-aab8-4301c9de37a2\" (UID: \"4448472a-99b1-4060-aab8-4301c9de37a2\") " Mar 20 15:58:29 crc kubenswrapper[4779]: I0320 15:58:29.080955 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4448472a-99b1-4060-aab8-4301c9de37a2-inventory\") pod \"4448472a-99b1-4060-aab8-4301c9de37a2\" (UID: \"4448472a-99b1-4060-aab8-4301c9de37a2\") " Mar 20 15:58:29 crc kubenswrapper[4779]: I0320 15:58:29.088542 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4448472a-99b1-4060-aab8-4301c9de37a2-kube-api-access-xb8zr" (OuterVolumeSpecName: "kube-api-access-xb8zr") pod "4448472a-99b1-4060-aab8-4301c9de37a2" (UID: "4448472a-99b1-4060-aab8-4301c9de37a2"). InnerVolumeSpecName "kube-api-access-xb8zr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:58:29 crc kubenswrapper[4779]: I0320 15:58:29.110802 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4448472a-99b1-4060-aab8-4301c9de37a2-inventory" (OuterVolumeSpecName: "inventory") pod "4448472a-99b1-4060-aab8-4301c9de37a2" (UID: "4448472a-99b1-4060-aab8-4301c9de37a2"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:58:29 crc kubenswrapper[4779]: I0320 15:58:29.114754 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4448472a-99b1-4060-aab8-4301c9de37a2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4448472a-99b1-4060-aab8-4301c9de37a2" (UID: "4448472a-99b1-4060-aab8-4301c9de37a2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:58:29 crc kubenswrapper[4779]: I0320 15:58:29.183864 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xb8zr\" (UniqueName: \"kubernetes.io/projected/4448472a-99b1-4060-aab8-4301c9de37a2-kube-api-access-xb8zr\") on node \"crc\" DevicePath \"\"" Mar 20 15:58:29 crc kubenswrapper[4779]: I0320 15:58:29.183902 4779 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4448472a-99b1-4060-aab8-4301c9de37a2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 15:58:29 crc kubenswrapper[4779]: I0320 15:58:29.183913 4779 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4448472a-99b1-4060-aab8-4301c9de37a2-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 15:58:29 crc kubenswrapper[4779]: I0320 15:58:29.576718 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-px8s4" event={"ID":"4448472a-99b1-4060-aab8-4301c9de37a2","Type":"ContainerDied","Data":"9bcecc314c5c96449edcad5233d90bc08cd1740b05bc68d240c8b457a56529be"} Mar 20 15:58:29 crc kubenswrapper[4779]: I0320 15:58:29.576774 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bcecc314c5c96449edcad5233d90bc08cd1740b05bc68d240c8b457a56529be" Mar 20 15:58:29 crc kubenswrapper[4779]: I0320 
15:58:29.576785 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-px8s4" Mar 20 15:58:29 crc kubenswrapper[4779]: I0320 15:58:29.656310 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ct9xk"] Mar 20 15:58:29 crc kubenswrapper[4779]: E0320 15:58:29.657004 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4448472a-99b1-4060-aab8-4301c9de37a2" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 20 15:58:29 crc kubenswrapper[4779]: I0320 15:58:29.657022 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="4448472a-99b1-4060-aab8-4301c9de37a2" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 20 15:58:29 crc kubenswrapper[4779]: I0320 15:58:29.657240 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="4448472a-99b1-4060-aab8-4301c9de37a2" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 20 15:58:29 crc kubenswrapper[4779]: I0320 15:58:29.657987 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ct9xk" Mar 20 15:58:29 crc kubenswrapper[4779]: I0320 15:58:29.662499 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 15:58:29 crc kubenswrapper[4779]: I0320 15:58:29.662753 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r7sbf" Mar 20 15:58:29 crc kubenswrapper[4779]: I0320 15:58:29.662943 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 15:58:29 crc kubenswrapper[4779]: I0320 15:58:29.663517 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 15:58:29 crc kubenswrapper[4779]: I0320 15:58:29.671951 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ct9xk"] Mar 20 15:58:29 crc kubenswrapper[4779]: I0320 15:58:29.698718 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f592faf-9ec7-4ae5-8e95-3b45d7de27ce-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ct9xk\" (UID: \"6f592faf-9ec7-4ae5-8e95-3b45d7de27ce\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ct9xk" Mar 20 15:58:29 crc kubenswrapper[4779]: I0320 15:58:29.698827 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrcvf\" (UniqueName: \"kubernetes.io/projected/6f592faf-9ec7-4ae5-8e95-3b45d7de27ce-kube-api-access-rrcvf\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ct9xk\" (UID: \"6f592faf-9ec7-4ae5-8e95-3b45d7de27ce\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ct9xk" Mar 20 15:58:29 crc kubenswrapper[4779]: I0320 15:58:29.698909 4779 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6f592faf-9ec7-4ae5-8e95-3b45d7de27ce-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ct9xk\" (UID: \"6f592faf-9ec7-4ae5-8e95-3b45d7de27ce\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ct9xk" Mar 20 15:58:29 crc kubenswrapper[4779]: I0320 15:58:29.800351 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f592faf-9ec7-4ae5-8e95-3b45d7de27ce-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ct9xk\" (UID: \"6f592faf-9ec7-4ae5-8e95-3b45d7de27ce\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ct9xk" Mar 20 15:58:29 crc kubenswrapper[4779]: I0320 15:58:29.800437 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrcvf\" (UniqueName: \"kubernetes.io/projected/6f592faf-9ec7-4ae5-8e95-3b45d7de27ce-kube-api-access-rrcvf\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ct9xk\" (UID: \"6f592faf-9ec7-4ae5-8e95-3b45d7de27ce\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ct9xk" Mar 20 15:58:29 crc kubenswrapper[4779]: I0320 15:58:29.800498 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6f592faf-9ec7-4ae5-8e95-3b45d7de27ce-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ct9xk\" (UID: \"6f592faf-9ec7-4ae5-8e95-3b45d7de27ce\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ct9xk" Mar 20 15:58:29 crc kubenswrapper[4779]: I0320 15:58:29.805546 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/6f592faf-9ec7-4ae5-8e95-3b45d7de27ce-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ct9xk\" (UID: \"6f592faf-9ec7-4ae5-8e95-3b45d7de27ce\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ct9xk" Mar 20 15:58:29 crc kubenswrapper[4779]: I0320 15:58:29.818713 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f592faf-9ec7-4ae5-8e95-3b45d7de27ce-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ct9xk\" (UID: \"6f592faf-9ec7-4ae5-8e95-3b45d7de27ce\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ct9xk" Mar 20 15:58:29 crc kubenswrapper[4779]: I0320 15:58:29.823190 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrcvf\" (UniqueName: \"kubernetes.io/projected/6f592faf-9ec7-4ae5-8e95-3b45d7de27ce-kube-api-access-rrcvf\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ct9xk\" (UID: \"6f592faf-9ec7-4ae5-8e95-3b45d7de27ce\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ct9xk" Mar 20 15:58:29 crc kubenswrapper[4779]: I0320 15:58:29.977145 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ct9xk" Mar 20 15:58:30 crc kubenswrapper[4779]: I0320 15:58:30.546329 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ct9xk"] Mar 20 15:58:30 crc kubenswrapper[4779]: I0320 15:58:30.585511 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ct9xk" event={"ID":"6f592faf-9ec7-4ae5-8e95-3b45d7de27ce","Type":"ContainerStarted","Data":"3786652d0d77fc58b56c2286f0cc5837a58a41a9ea71235f06ac252d2b293912"} Mar 20 15:58:31 crc kubenswrapper[4779]: I0320 15:58:31.631734 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ct9xk" event={"ID":"6f592faf-9ec7-4ae5-8e95-3b45d7de27ce","Type":"ContainerStarted","Data":"493f126cf57ecd6731a7323ead63dc6423c451815c1ca558c8a73c59aabad645"} Mar 20 15:58:31 crc kubenswrapper[4779]: I0320 15:58:31.653876 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ct9xk" podStartSLOduration=1.881970527 podStartE2EDuration="2.653858262s" podCreationTimestamp="2026-03-20 15:58:29 +0000 UTC" firstStartedPulling="2026-03-20 15:58:30.549334855 +0000 UTC m=+2127.511850655" lastFinishedPulling="2026-03-20 15:58:31.3212226 +0000 UTC m=+2128.283738390" observedRunningTime="2026-03-20 15:58:31.646363637 +0000 UTC m=+2128.608879447" watchObservedRunningTime="2026-03-20 15:58:31.653858262 +0000 UTC m=+2128.616374052" Mar 20 15:58:40 crc kubenswrapper[4779]: I0320 15:58:40.705485 4779 generic.go:334] "Generic (PLEG): container finished" podID="6f592faf-9ec7-4ae5-8e95-3b45d7de27ce" containerID="493f126cf57ecd6731a7323ead63dc6423c451815c1ca558c8a73c59aabad645" exitCode=0 Mar 20 15:58:40 crc kubenswrapper[4779]: I0320 15:58:40.705583 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ct9xk" event={"ID":"6f592faf-9ec7-4ae5-8e95-3b45d7de27ce","Type":"ContainerDied","Data":"493f126cf57ecd6731a7323ead63dc6423c451815c1ca558c8a73c59aabad645"} Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.241445 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ct9xk" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.346243 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f592faf-9ec7-4ae5-8e95-3b45d7de27ce-inventory\") pod \"6f592faf-9ec7-4ae5-8e95-3b45d7de27ce\" (UID: \"6f592faf-9ec7-4ae5-8e95-3b45d7de27ce\") " Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.346465 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6f592faf-9ec7-4ae5-8e95-3b45d7de27ce-ssh-key-openstack-edpm-ipam\") pod \"6f592faf-9ec7-4ae5-8e95-3b45d7de27ce\" (UID: \"6f592faf-9ec7-4ae5-8e95-3b45d7de27ce\") " Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.346506 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrcvf\" (UniqueName: \"kubernetes.io/projected/6f592faf-9ec7-4ae5-8e95-3b45d7de27ce-kube-api-access-rrcvf\") pod \"6f592faf-9ec7-4ae5-8e95-3b45d7de27ce\" (UID: \"6f592faf-9ec7-4ae5-8e95-3b45d7de27ce\") " Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.351916 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f592faf-9ec7-4ae5-8e95-3b45d7de27ce-kube-api-access-rrcvf" (OuterVolumeSpecName: "kube-api-access-rrcvf") pod "6f592faf-9ec7-4ae5-8e95-3b45d7de27ce" (UID: "6f592faf-9ec7-4ae5-8e95-3b45d7de27ce"). InnerVolumeSpecName "kube-api-access-rrcvf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.373847 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f592faf-9ec7-4ae5-8e95-3b45d7de27ce-inventory" (OuterVolumeSpecName: "inventory") pod "6f592faf-9ec7-4ae5-8e95-3b45d7de27ce" (UID: "6f592faf-9ec7-4ae5-8e95-3b45d7de27ce"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.375357 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f592faf-9ec7-4ae5-8e95-3b45d7de27ce-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6f592faf-9ec7-4ae5-8e95-3b45d7de27ce" (UID: "6f592faf-9ec7-4ae5-8e95-3b45d7de27ce"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.448956 4779 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f592faf-9ec7-4ae5-8e95-3b45d7de27ce-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.449018 4779 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6f592faf-9ec7-4ae5-8e95-3b45d7de27ce-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.449033 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrcvf\" (UniqueName: \"kubernetes.io/projected/6f592faf-9ec7-4ae5-8e95-3b45d7de27ce-kube-api-access-rrcvf\") on node \"crc\" DevicePath \"\"" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.725197 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ct9xk" 
event={"ID":"6f592faf-9ec7-4ae5-8e95-3b45d7de27ce","Type":"ContainerDied","Data":"3786652d0d77fc58b56c2286f0cc5837a58a41a9ea71235f06ac252d2b293912"} Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.725484 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3786652d0d77fc58b56c2286f0cc5837a58a41a9ea71235f06ac252d2b293912" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.725300 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ct9xk" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.823388 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs"] Mar 20 15:58:42 crc kubenswrapper[4779]: E0320 15:58:42.824163 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f592faf-9ec7-4ae5-8e95-3b45d7de27ce" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.824182 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f592faf-9ec7-4ae5-8e95-3b45d7de27ce" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.824383 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f592faf-9ec7-4ae5-8e95-3b45d7de27ce" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.825062 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.834389 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.834433 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r7sbf" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.834434 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.834607 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.834829 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.834917 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.835902 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.836158 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.844808 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs"] Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.860396 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/3f3c88f1-f136-4908-bf87-9a114dc67cf7-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.860582 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.860641 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmgdg\" (UniqueName: \"kubernetes.io/projected/3f3c88f1-f136-4908-bf87-9a114dc67cf7-kube-api-access-lmgdg\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.860701 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3f3c88f1-f136-4908-bf87-9a114dc67cf7-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.860753 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.860807 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.860856 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.860969 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.861154 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.861238 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.861319 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.861407 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3f3c88f1-f136-4908-bf87-9a114dc67cf7-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.861474 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.861516 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3f3c88f1-f136-4908-bf87-9a114dc67cf7-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.963172 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.963229 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmgdg\" (UniqueName: \"kubernetes.io/projected/3f3c88f1-f136-4908-bf87-9a114dc67cf7-kube-api-access-lmgdg\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.963291 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/3f3c88f1-f136-4908-bf87-9a114dc67cf7-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.963336 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.963394 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.963424 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.963526 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-telemetry-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.963584 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.963615 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.963673 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.964248 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3f3c88f1-f136-4908-bf87-9a114dc67cf7-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.964414 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.964454 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3f3c88f1-f136-4908-bf87-9a114dc67cf7-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.964545 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3f3c88f1-f136-4908-bf87-9a114dc67cf7-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.969631 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc 
kubenswrapper[4779]: I0320 15:58:42.970286 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3f3c88f1-f136-4908-bf87-9a114dc67cf7-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.970745 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3f3c88f1-f136-4908-bf87-9a114dc67cf7-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.971788 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.972454 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.972467 4779 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3f3c88f1-f136-4908-bf87-9a114dc67cf7-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.976163 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.976278 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.976275 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.977486 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.977599 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3f3c88f1-f136-4908-bf87-9a114dc67cf7-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.978377 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.981707 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:42 crc kubenswrapper[4779]: I0320 15:58:42.983221 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmgdg\" (UniqueName: \"kubernetes.io/projected/3f3c88f1-f136-4908-bf87-9a114dc67cf7-kube-api-access-lmgdg\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:43 crc kubenswrapper[4779]: I0320 15:58:43.159841 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:58:43 crc kubenswrapper[4779]: I0320 15:58:43.317696 4779 scope.go:117] "RemoveContainer" containerID="b08b706ad2c830ce0fc817f6743fe7bb019e2bd157e4d82791a91a1279f9fb99" Mar 20 15:58:43 crc kubenswrapper[4779]: I0320 15:58:43.368025 4779 scope.go:117] "RemoveContainer" containerID="78b4899d7c894c3bee42d4192002ebcb53d21bb0825a660599d7ffafc76aa535" Mar 20 15:58:43 crc kubenswrapper[4779]: W0320 15:58:43.753669 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f3c88f1_f136_4908_bf87_9a114dc67cf7.slice/crio-9e10820abc91aacfe977043d9b003ab1ab708758abeb085d282e756215716ea6 WatchSource:0}: Error finding container 9e10820abc91aacfe977043d9b003ab1ab708758abeb085d282e756215716ea6: Status 404 returned error can't find the container with id 9e10820abc91aacfe977043d9b003ab1ab708758abeb085d282e756215716ea6 Mar 20 15:58:43 crc kubenswrapper[4779]: I0320 15:58:43.757204 4779 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 15:58:43 crc kubenswrapper[4779]: I0320 15:58:43.760200 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs"] Mar 20 15:58:44 crc kubenswrapper[4779]: I0320 15:58:44.775314 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" event={"ID":"3f3c88f1-f136-4908-bf87-9a114dc67cf7","Type":"ContainerStarted","Data":"14445ceda62381d53a81cbb59d9371f35b33a9a74504abe7c188a64bb177a571"} Mar 20 
15:58:44 crc kubenswrapper[4779]: I0320 15:58:44.776671 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" event={"ID":"3f3c88f1-f136-4908-bf87-9a114dc67cf7","Type":"ContainerStarted","Data":"9e10820abc91aacfe977043d9b003ab1ab708758abeb085d282e756215716ea6"} Mar 20 15:58:44 crc kubenswrapper[4779]: I0320 15:58:44.803453 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" podStartSLOduration=2.416639789 podStartE2EDuration="2.803434313s" podCreationTimestamp="2026-03-20 15:58:42 +0000 UTC" firstStartedPulling="2026-03-20 15:58:43.756839372 +0000 UTC m=+2140.719355172" lastFinishedPulling="2026-03-20 15:58:44.143633896 +0000 UTC m=+2141.106149696" observedRunningTime="2026-03-20 15:58:44.794611158 +0000 UTC m=+2141.757126958" watchObservedRunningTime="2026-03-20 15:58:44.803434313 +0000 UTC m=+2141.765950103" Mar 20 15:58:55 crc kubenswrapper[4779]: I0320 15:58:55.150183 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:58:55 crc kubenswrapper[4779]: I0320 15:58:55.150724 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:59:19 crc kubenswrapper[4779]: I0320 15:59:19.068745 4779 generic.go:334] "Generic (PLEG): container finished" podID="3f3c88f1-f136-4908-bf87-9a114dc67cf7" containerID="14445ceda62381d53a81cbb59d9371f35b33a9a74504abe7c188a64bb177a571" exitCode=0 Mar 20 
15:59:19 crc kubenswrapper[4779]: I0320 15:59:19.068852 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" event={"ID":"3f3c88f1-f136-4908-bf87-9a114dc67cf7","Type":"ContainerDied","Data":"14445ceda62381d53a81cbb59d9371f35b33a9a74504abe7c188a64bb177a571"} Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.478333 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.576862 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3f3c88f1-f136-4908-bf87-9a114dc67cf7-openstack-edpm-ipam-ovn-default-certs-0\") pod \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.576920 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmgdg\" (UniqueName: \"kubernetes.io/projected/3f3c88f1-f136-4908-bf87-9a114dc67cf7-kube-api-access-lmgdg\") pod \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.576972 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-ovn-combined-ca-bundle\") pod \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.577075 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-neutron-metadata-combined-ca-bundle\") pod 
\"3f3c88f1-f136-4908-bf87-9a114dc67cf7\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.577144 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-ssh-key-openstack-edpm-ipam\") pod \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.577178 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-libvirt-combined-ca-bundle\") pod \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.577228 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3f3c88f1-f136-4908-bf87-9a114dc67cf7-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.577279 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3f3c88f1-f136-4908-bf87-9a114dc67cf7-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.577331 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-bootstrap-combined-ca-bundle\") pod \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\" 
(UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.577351 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-nova-combined-ca-bundle\") pod \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.577570 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-telemetry-combined-ca-bundle\") pod \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.577599 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3f3c88f1-f136-4908-bf87-9a114dc67cf7-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.577621 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-repo-setup-combined-ca-bundle\") pod \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.577669 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-inventory\") pod \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 
15:59:20.583651 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "3f3c88f1-f136-4908-bf87-9a114dc67cf7" (UID: "3f3c88f1-f136-4908-bf87-9a114dc67cf7"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.583982 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f3c88f1-f136-4908-bf87-9a114dc67cf7-kube-api-access-lmgdg" (OuterVolumeSpecName: "kube-api-access-lmgdg") pod "3f3c88f1-f136-4908-bf87-9a114dc67cf7" (UID: "3f3c88f1-f136-4908-bf87-9a114dc67cf7"). InnerVolumeSpecName "kube-api-access-lmgdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.584666 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f3c88f1-f136-4908-bf87-9a114dc67cf7-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "3f3c88f1-f136-4908-bf87-9a114dc67cf7" (UID: "3f3c88f1-f136-4908-bf87-9a114dc67cf7"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.584718 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f3c88f1-f136-4908-bf87-9a114dc67cf7-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "3f3c88f1-f136-4908-bf87-9a114dc67cf7" (UID: "3f3c88f1-f136-4908-bf87-9a114dc67cf7"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.584854 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "3f3c88f1-f136-4908-bf87-9a114dc67cf7" (UID: "3f3c88f1-f136-4908-bf87-9a114dc67cf7"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.585366 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "3f3c88f1-f136-4908-bf87-9a114dc67cf7" (UID: "3f3c88f1-f136-4908-bf87-9a114dc67cf7"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.586405 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f3c88f1-f136-4908-bf87-9a114dc67cf7-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "3f3c88f1-f136-4908-bf87-9a114dc67cf7" (UID: "3f3c88f1-f136-4908-bf87-9a114dc67cf7"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.587183 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "3f3c88f1-f136-4908-bf87-9a114dc67cf7" (UID: "3f3c88f1-f136-4908-bf87-9a114dc67cf7"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.587333 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "3f3c88f1-f136-4908-bf87-9a114dc67cf7" (UID: "3f3c88f1-f136-4908-bf87-9a114dc67cf7"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.588352 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "3f3c88f1-f136-4908-bf87-9a114dc67cf7" (UID: "3f3c88f1-f136-4908-bf87-9a114dc67cf7"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.590937 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "3f3c88f1-f136-4908-bf87-9a114dc67cf7" (UID: "3f3c88f1-f136-4908-bf87-9a114dc67cf7"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.591901 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f3c88f1-f136-4908-bf87-9a114dc67cf7-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "3f3c88f1-f136-4908-bf87-9a114dc67cf7" (UID: "3f3c88f1-f136-4908-bf87-9a114dc67cf7"). 
InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:59:20 crc kubenswrapper[4779]: E0320 15:59:20.608818 4779 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-inventory podName:3f3c88f1-f136-4908-bf87-9a114dc67cf7 nodeName:}" failed. No retries permitted until 2026-03-20 15:59:21.10879078 +0000 UTC m=+2178.071306580 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-inventory") pod "3f3c88f1-f136-4908-bf87-9a114dc67cf7" (UID: "3f3c88f1-f136-4908-bf87-9a114dc67cf7") : error deleting /var/lib/kubelet/pods/3f3c88f1-f136-4908-bf87-9a114dc67cf7/volume-subpaths: remove /var/lib/kubelet/pods/3f3c88f1-f136-4908-bf87-9a114dc67cf7/volume-subpaths: no such file or directory Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.611387 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3f3c88f1-f136-4908-bf87-9a114dc67cf7" (UID: "3f3c88f1-f136-4908-bf87-9a114dc67cf7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.680404 4779 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.680435 4779 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.680445 4779 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.680455 4779 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3f3c88f1-f136-4908-bf87-9a114dc67cf7-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.680466 4779 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3f3c88f1-f136-4908-bf87-9a114dc67cf7-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.680476 4779 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.680485 4779 reconciler_common.go:293] "Volume detached 
for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.680494 4779 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.680503 4779 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.680511 4779 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3f3c88f1-f136-4908-bf87-9a114dc67cf7-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.680520 4779 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3f3c88f1-f136-4908-bf87-9a114dc67cf7-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.680529 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmgdg\" (UniqueName: \"kubernetes.io/projected/3f3c88f1-f136-4908-bf87-9a114dc67cf7-kube-api-access-lmgdg\") on node \"crc\" DevicePath \"\"" Mar 20 15:59:20 crc kubenswrapper[4779]: I0320 15:59:20.680537 4779 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 
15:59:21 crc kubenswrapper[4779]: I0320 15:59:21.087552 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" event={"ID":"3f3c88f1-f136-4908-bf87-9a114dc67cf7","Type":"ContainerDied","Data":"9e10820abc91aacfe977043d9b003ab1ab708758abeb085d282e756215716ea6"} Mar 20 15:59:21 crc kubenswrapper[4779]: I0320 15:59:21.087596 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e10820abc91aacfe977043d9b003ab1ab708758abeb085d282e756215716ea6" Mar 20 15:59:21 crc kubenswrapper[4779]: I0320 15:59:21.087685 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs" Mar 20 15:59:21 crc kubenswrapper[4779]: I0320 15:59:21.196728 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-inventory\") pod \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\" (UID: \"3f3c88f1-f136-4908-bf87-9a114dc67cf7\") " Mar 20 15:59:21 crc kubenswrapper[4779]: I0320 15:59:21.201650 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-inventory" (OuterVolumeSpecName: "inventory") pod "3f3c88f1-f136-4908-bf87-9a114dc67cf7" (UID: "3f3c88f1-f136-4908-bf87-9a114dc67cf7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:59:21 crc kubenswrapper[4779]: I0320 15:59:21.297560 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-vlwk8"] Mar 20 15:59:21 crc kubenswrapper[4779]: E0320 15:59:21.297960 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f3c88f1-f136-4908-bf87-9a114dc67cf7" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 20 15:59:21 crc kubenswrapper[4779]: I0320 15:59:21.297978 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f3c88f1-f136-4908-bf87-9a114dc67cf7" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 20 15:59:21 crc kubenswrapper[4779]: I0320 15:59:21.298194 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f3c88f1-f136-4908-bf87-9a114dc67cf7" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 20 15:59:21 crc kubenswrapper[4779]: I0320 15:59:21.298897 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vlwk8" Mar 20 15:59:21 crc kubenswrapper[4779]: I0320 15:59:21.300165 4779 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f3c88f1-f136-4908-bf87-9a114dc67cf7-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 15:59:21 crc kubenswrapper[4779]: I0320 15:59:21.305473 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 20 15:59:21 crc kubenswrapper[4779]: I0320 15:59:21.318125 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-vlwk8"] Mar 20 15:59:21 crc kubenswrapper[4779]: I0320 15:59:21.401805 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55a17de8-e79b-4949-8dfa-d5c4f1fd8917-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vlwk8\" (UID: \"55a17de8-e79b-4949-8dfa-d5c4f1fd8917\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vlwk8" Mar 20 15:59:21 crc kubenswrapper[4779]: I0320 15:59:21.401899 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmwfk\" (UniqueName: \"kubernetes.io/projected/55a17de8-e79b-4949-8dfa-d5c4f1fd8917-kube-api-access-lmwfk\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vlwk8\" (UID: \"55a17de8-e79b-4949-8dfa-d5c4f1fd8917\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vlwk8" Mar 20 15:59:21 crc kubenswrapper[4779]: I0320 15:59:21.401978 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/55a17de8-e79b-4949-8dfa-d5c4f1fd8917-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vlwk8\" (UID: \"55a17de8-e79b-4949-8dfa-d5c4f1fd8917\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vlwk8" Mar 20 15:59:21 crc kubenswrapper[4779]: I0320 15:59:21.402030 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/55a17de8-e79b-4949-8dfa-d5c4f1fd8917-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vlwk8\" (UID: \"55a17de8-e79b-4949-8dfa-d5c4f1fd8917\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vlwk8" Mar 20 15:59:21 crc kubenswrapper[4779]: I0320 15:59:21.402200 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a17de8-e79b-4949-8dfa-d5c4f1fd8917-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vlwk8\" (UID: \"55a17de8-e79b-4949-8dfa-d5c4f1fd8917\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vlwk8" Mar 20 15:59:21 crc kubenswrapper[4779]: I0320 15:59:21.505298 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a17de8-e79b-4949-8dfa-d5c4f1fd8917-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vlwk8\" (UID: \"55a17de8-e79b-4949-8dfa-d5c4f1fd8917\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vlwk8" Mar 20 15:59:21 crc kubenswrapper[4779]: I0320 15:59:21.505417 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55a17de8-e79b-4949-8dfa-d5c4f1fd8917-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vlwk8\" (UID: \"55a17de8-e79b-4949-8dfa-d5c4f1fd8917\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vlwk8" Mar 20 15:59:21 crc kubenswrapper[4779]: I0320 15:59:21.505474 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmwfk\" 
(UniqueName: \"kubernetes.io/projected/55a17de8-e79b-4949-8dfa-d5c4f1fd8917-kube-api-access-lmwfk\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vlwk8\" (UID: \"55a17de8-e79b-4949-8dfa-d5c4f1fd8917\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vlwk8" Mar 20 15:59:21 crc kubenswrapper[4779]: I0320 15:59:21.505524 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/55a17de8-e79b-4949-8dfa-d5c4f1fd8917-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vlwk8\" (UID: \"55a17de8-e79b-4949-8dfa-d5c4f1fd8917\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vlwk8" Mar 20 15:59:21 crc kubenswrapper[4779]: I0320 15:59:21.505568 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/55a17de8-e79b-4949-8dfa-d5c4f1fd8917-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vlwk8\" (UID: \"55a17de8-e79b-4949-8dfa-d5c4f1fd8917\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vlwk8" Mar 20 15:59:21 crc kubenswrapper[4779]: I0320 15:59:21.506424 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/55a17de8-e79b-4949-8dfa-d5c4f1fd8917-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vlwk8\" (UID: \"55a17de8-e79b-4949-8dfa-d5c4f1fd8917\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vlwk8" Mar 20 15:59:21 crc kubenswrapper[4779]: I0320 15:59:21.509299 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55a17de8-e79b-4949-8dfa-d5c4f1fd8917-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vlwk8\" (UID: \"55a17de8-e79b-4949-8dfa-d5c4f1fd8917\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vlwk8" Mar 20 
15:59:21 crc kubenswrapper[4779]: I0320 15:59:21.509388 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/55a17de8-e79b-4949-8dfa-d5c4f1fd8917-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vlwk8\" (UID: \"55a17de8-e79b-4949-8dfa-d5c4f1fd8917\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vlwk8" Mar 20 15:59:21 crc kubenswrapper[4779]: I0320 15:59:21.509681 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a17de8-e79b-4949-8dfa-d5c4f1fd8917-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vlwk8\" (UID: \"55a17de8-e79b-4949-8dfa-d5c4f1fd8917\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vlwk8" Mar 20 15:59:21 crc kubenswrapper[4779]: I0320 15:59:21.520436 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmwfk\" (UniqueName: \"kubernetes.io/projected/55a17de8-e79b-4949-8dfa-d5c4f1fd8917-kube-api-access-lmwfk\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vlwk8\" (UID: \"55a17de8-e79b-4949-8dfa-d5c4f1fd8917\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vlwk8" Mar 20 15:59:21 crc kubenswrapper[4779]: I0320 15:59:21.664458 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vlwk8" Mar 20 15:59:22 crc kubenswrapper[4779]: W0320 15:59:22.198246 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55a17de8_e79b_4949_8dfa_d5c4f1fd8917.slice/crio-06735c27d5e758578414439a219e1a54371e9742ef6965b36655036e9cf347f2 WatchSource:0}: Error finding container 06735c27d5e758578414439a219e1a54371e9742ef6965b36655036e9cf347f2: Status 404 returned error can't find the container with id 06735c27d5e758578414439a219e1a54371e9742ef6965b36655036e9cf347f2 Mar 20 15:59:22 crc kubenswrapper[4779]: I0320 15:59:22.208219 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-vlwk8"] Mar 20 15:59:23 crc kubenswrapper[4779]: I0320 15:59:23.108100 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vlwk8" event={"ID":"55a17de8-e79b-4949-8dfa-d5c4f1fd8917","Type":"ContainerStarted","Data":"0d564f926fef1cff8c3e7ef451fd383c6d09c68f8e9a0f11705977499f13ab82"} Mar 20 15:59:23 crc kubenswrapper[4779]: I0320 15:59:23.108466 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vlwk8" event={"ID":"55a17de8-e79b-4949-8dfa-d5c4f1fd8917","Type":"ContainerStarted","Data":"06735c27d5e758578414439a219e1a54371e9742ef6965b36655036e9cf347f2"} Mar 20 15:59:23 crc kubenswrapper[4779]: I0320 15:59:23.136349 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vlwk8" podStartSLOduration=1.622934602 podStartE2EDuration="2.136313721s" podCreationTimestamp="2026-03-20 15:59:21 +0000 UTC" firstStartedPulling="2026-03-20 15:59:22.201274335 +0000 UTC m=+2179.163790135" lastFinishedPulling="2026-03-20 15:59:22.714653454 +0000 UTC m=+2179.677169254" observedRunningTime="2026-03-20 
15:59:23.129209537 +0000 UTC m=+2180.091725337" watchObservedRunningTime="2026-03-20 15:59:23.136313721 +0000 UTC m=+2180.098829561" Mar 20 15:59:25 crc kubenswrapper[4779]: I0320 15:59:25.149603 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:59:25 crc kubenswrapper[4779]: I0320 15:59:25.149921 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:59:25 crc kubenswrapper[4779]: I0320 15:59:25.149960 4779 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" Mar 20 15:59:25 crc kubenswrapper[4779]: I0320 15:59:25.150866 4779 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"56c8e1388500b86731bed3431c69ab3383dde78a7cf6321fabe80d4cca0e9447"} pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 15:59:25 crc kubenswrapper[4779]: I0320 15:59:25.150922 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" containerID="cri-o://56c8e1388500b86731bed3431c69ab3383dde78a7cf6321fabe80d4cca0e9447" gracePeriod=600 Mar 20 15:59:26 crc kubenswrapper[4779]: I0320 15:59:26.134487 4779 generic.go:334] 
"Generic (PLEG): container finished" podID="451fc579-db57-4b36-a775-6d2986de3efc" containerID="56c8e1388500b86731bed3431c69ab3383dde78a7cf6321fabe80d4cca0e9447" exitCode=0 Mar 20 15:59:26 crc kubenswrapper[4779]: I0320 15:59:26.134550 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" event={"ID":"451fc579-db57-4b36-a775-6d2986de3efc","Type":"ContainerDied","Data":"56c8e1388500b86731bed3431c69ab3383dde78a7cf6321fabe80d4cca0e9447"} Mar 20 15:59:26 crc kubenswrapper[4779]: I0320 15:59:26.135042 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" event={"ID":"451fc579-db57-4b36-a775-6d2986de3efc","Type":"ContainerStarted","Data":"df4a15a4ae213e72d8c7a655c4b8439341ebe84532eaccd1945e22abb1a25311"} Mar 20 15:59:26 crc kubenswrapper[4779]: I0320 15:59:26.135062 4779 scope.go:117] "RemoveContainer" containerID="3e74c77d1c87375df708c9fdb3dfd7572992fba6df21987d4baed94ab0fddd8f" Mar 20 16:00:00 crc kubenswrapper[4779]: I0320 16:00:00.154655 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567040-99p54"] Mar 20 16:00:00 crc kubenswrapper[4779]: I0320 16:00:00.156778 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567040-99p54" Mar 20 16:00:00 crc kubenswrapper[4779]: I0320 16:00:00.160400 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:00:00 crc kubenswrapper[4779]: I0320 16:00:00.160853 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-92t9d" Mar 20 16:00:00 crc kubenswrapper[4779]: I0320 16:00:00.160960 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:00:00 crc kubenswrapper[4779]: I0320 16:00:00.166557 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567040-99p54"] Mar 20 16:00:00 crc kubenswrapper[4779]: I0320 16:00:00.187949 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567040-vfhjg"] Mar 20 16:00:00 crc kubenswrapper[4779]: I0320 16:00:00.189806 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-vfhjg" Mar 20 16:00:00 crc kubenswrapper[4779]: I0320 16:00:00.192299 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 16:00:00 crc kubenswrapper[4779]: I0320 16:00:00.193053 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 16:00:00 crc kubenswrapper[4779]: I0320 16:00:00.214241 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567040-vfhjg"] Mar 20 16:00:00 crc kubenswrapper[4779]: I0320 16:00:00.252639 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mcjw\" (UniqueName: \"kubernetes.io/projected/c70aa1a9-d9ff-4845-a8f9-26cbcadb893e-kube-api-access-6mcjw\") pod \"auto-csr-approver-29567040-99p54\" (UID: \"c70aa1a9-d9ff-4845-a8f9-26cbcadb893e\") " pod="openshift-infra/auto-csr-approver-29567040-99p54" Mar 20 16:00:00 crc kubenswrapper[4779]: I0320 16:00:00.354613 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a413327-15d1-4578-b601-759eb9cd73d2-secret-volume\") pod \"collect-profiles-29567040-vfhjg\" (UID: \"1a413327-15d1-4578-b601-759eb9cd73d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-vfhjg" Mar 20 16:00:00 crc kubenswrapper[4779]: I0320 16:00:00.354675 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzzr2\" (UniqueName: \"kubernetes.io/projected/1a413327-15d1-4578-b601-759eb9cd73d2-kube-api-access-zzzr2\") pod \"collect-profiles-29567040-vfhjg\" (UID: \"1a413327-15d1-4578-b601-759eb9cd73d2\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-vfhjg" Mar 20 16:00:00 crc kubenswrapper[4779]: I0320 16:00:00.354714 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mcjw\" (UniqueName: \"kubernetes.io/projected/c70aa1a9-d9ff-4845-a8f9-26cbcadb893e-kube-api-access-6mcjw\") pod \"auto-csr-approver-29567040-99p54\" (UID: \"c70aa1a9-d9ff-4845-a8f9-26cbcadb893e\") " pod="openshift-infra/auto-csr-approver-29567040-99p54" Mar 20 16:00:00 crc kubenswrapper[4779]: I0320 16:00:00.354832 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a413327-15d1-4578-b601-759eb9cd73d2-config-volume\") pod \"collect-profiles-29567040-vfhjg\" (UID: \"1a413327-15d1-4578-b601-759eb9cd73d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-vfhjg" Mar 20 16:00:00 crc kubenswrapper[4779]: I0320 16:00:00.374688 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mcjw\" (UniqueName: \"kubernetes.io/projected/c70aa1a9-d9ff-4845-a8f9-26cbcadb893e-kube-api-access-6mcjw\") pod \"auto-csr-approver-29567040-99p54\" (UID: \"c70aa1a9-d9ff-4845-a8f9-26cbcadb893e\") " pod="openshift-infra/auto-csr-approver-29567040-99p54" Mar 20 16:00:00 crc kubenswrapper[4779]: I0320 16:00:00.456640 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a413327-15d1-4578-b601-759eb9cd73d2-config-volume\") pod \"collect-profiles-29567040-vfhjg\" (UID: \"1a413327-15d1-4578-b601-759eb9cd73d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-vfhjg" Mar 20 16:00:00 crc kubenswrapper[4779]: I0320 16:00:00.456724 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/1a413327-15d1-4578-b601-759eb9cd73d2-secret-volume\") pod \"collect-profiles-29567040-vfhjg\" (UID: \"1a413327-15d1-4578-b601-759eb9cd73d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-vfhjg" Mar 20 16:00:00 crc kubenswrapper[4779]: I0320 16:00:00.456755 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzzr2\" (UniqueName: \"kubernetes.io/projected/1a413327-15d1-4578-b601-759eb9cd73d2-kube-api-access-zzzr2\") pod \"collect-profiles-29567040-vfhjg\" (UID: \"1a413327-15d1-4578-b601-759eb9cd73d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-vfhjg" Mar 20 16:00:00 crc kubenswrapper[4779]: I0320 16:00:00.457699 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a413327-15d1-4578-b601-759eb9cd73d2-config-volume\") pod \"collect-profiles-29567040-vfhjg\" (UID: \"1a413327-15d1-4578-b601-759eb9cd73d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-vfhjg" Mar 20 16:00:00 crc kubenswrapper[4779]: I0320 16:00:00.462587 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a413327-15d1-4578-b601-759eb9cd73d2-secret-volume\") pod \"collect-profiles-29567040-vfhjg\" (UID: \"1a413327-15d1-4578-b601-759eb9cd73d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-vfhjg" Mar 20 16:00:00 crc kubenswrapper[4779]: I0320 16:00:00.473883 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzzr2\" (UniqueName: \"kubernetes.io/projected/1a413327-15d1-4578-b601-759eb9cd73d2-kube-api-access-zzzr2\") pod \"collect-profiles-29567040-vfhjg\" (UID: \"1a413327-15d1-4578-b601-759eb9cd73d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-vfhjg" Mar 20 16:00:00 crc kubenswrapper[4779]: I0320 16:00:00.487746 4779 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567040-99p54" Mar 20 16:00:00 crc kubenswrapper[4779]: I0320 16:00:00.515571 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-vfhjg" Mar 20 16:00:00 crc kubenswrapper[4779]: I0320 16:00:00.954262 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567040-99p54"] Mar 20 16:00:01 crc kubenswrapper[4779]: I0320 16:00:01.030870 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567040-vfhjg"] Mar 20 16:00:01 crc kubenswrapper[4779]: I0320 16:00:01.478209 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567040-99p54" event={"ID":"c70aa1a9-d9ff-4845-a8f9-26cbcadb893e","Type":"ContainerStarted","Data":"ad979bbf1dba9ad73fd803c89d511ca05b8c5d38438c1948e67d1b034f53b2ea"} Mar 20 16:00:01 crc kubenswrapper[4779]: I0320 16:00:01.479868 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-vfhjg" event={"ID":"1a413327-15d1-4578-b601-759eb9cd73d2","Type":"ContainerStarted","Data":"cf67c4fd2844a9b14d48cd6b64456a73f08ff6664cffaed383d3f8a8b50df00d"} Mar 20 16:00:01 crc kubenswrapper[4779]: I0320 16:00:01.479917 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-vfhjg" event={"ID":"1a413327-15d1-4578-b601-759eb9cd73d2","Type":"ContainerStarted","Data":"b033cb284238f117416e3bbc9ffbc654f4d87f32b3589c8a1c885a306f7c03ee"} Mar 20 16:00:01 crc kubenswrapper[4779]: I0320 16:00:01.504727 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-vfhjg" podStartSLOduration=1.504706535 
podStartE2EDuration="1.504706535s" podCreationTimestamp="2026-03-20 16:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:00:01.495784767 +0000 UTC m=+2218.458300567" watchObservedRunningTime="2026-03-20 16:00:01.504706535 +0000 UTC m=+2218.467222335" Mar 20 16:00:02 crc kubenswrapper[4779]: I0320 16:00:02.489389 4779 generic.go:334] "Generic (PLEG): container finished" podID="1a413327-15d1-4578-b601-759eb9cd73d2" containerID="cf67c4fd2844a9b14d48cd6b64456a73f08ff6664cffaed383d3f8a8b50df00d" exitCode=0 Mar 20 16:00:02 crc kubenswrapper[4779]: I0320 16:00:02.489432 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-vfhjg" event={"ID":"1a413327-15d1-4578-b601-759eb9cd73d2","Type":"ContainerDied","Data":"cf67c4fd2844a9b14d48cd6b64456a73f08ff6664cffaed383d3f8a8b50df00d"} Mar 20 16:00:03 crc kubenswrapper[4779]: I0320 16:00:03.863034 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-vfhjg" Mar 20 16:00:04 crc kubenswrapper[4779]: I0320 16:00:04.031673 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a413327-15d1-4578-b601-759eb9cd73d2-config-volume\") pod \"1a413327-15d1-4578-b601-759eb9cd73d2\" (UID: \"1a413327-15d1-4578-b601-759eb9cd73d2\") " Mar 20 16:00:04 crc kubenswrapper[4779]: I0320 16:00:04.032079 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a413327-15d1-4578-b601-759eb9cd73d2-secret-volume\") pod \"1a413327-15d1-4578-b601-759eb9cd73d2\" (UID: \"1a413327-15d1-4578-b601-759eb9cd73d2\") " Mar 20 16:00:04 crc kubenswrapper[4779]: I0320 16:00:04.032126 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzzr2\" (UniqueName: \"kubernetes.io/projected/1a413327-15d1-4578-b601-759eb9cd73d2-kube-api-access-zzzr2\") pod \"1a413327-15d1-4578-b601-759eb9cd73d2\" (UID: \"1a413327-15d1-4578-b601-759eb9cd73d2\") " Mar 20 16:00:04 crc kubenswrapper[4779]: I0320 16:00:04.032807 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a413327-15d1-4578-b601-759eb9cd73d2-config-volume" (OuterVolumeSpecName: "config-volume") pod "1a413327-15d1-4578-b601-759eb9cd73d2" (UID: "1a413327-15d1-4578-b601-759eb9cd73d2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:00:04 crc kubenswrapper[4779]: I0320 16:00:04.038493 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a413327-15d1-4578-b601-759eb9cd73d2-kube-api-access-zzzr2" (OuterVolumeSpecName: "kube-api-access-zzzr2") pod "1a413327-15d1-4578-b601-759eb9cd73d2" (UID: "1a413327-15d1-4578-b601-759eb9cd73d2"). 
InnerVolumeSpecName "kube-api-access-zzzr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:00:04 crc kubenswrapper[4779]: I0320 16:00:04.038494 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a413327-15d1-4578-b601-759eb9cd73d2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1a413327-15d1-4578-b601-759eb9cd73d2" (UID: "1a413327-15d1-4578-b601-759eb9cd73d2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:00:04 crc kubenswrapper[4779]: I0320 16:00:04.134096 4779 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a413327-15d1-4578-b601-759eb9cd73d2-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 16:00:04 crc kubenswrapper[4779]: I0320 16:00:04.134148 4779 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a413327-15d1-4578-b601-759eb9cd73d2-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 16:00:04 crc kubenswrapper[4779]: I0320 16:00:04.134162 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzzr2\" (UniqueName: \"kubernetes.io/projected/1a413327-15d1-4578-b601-759eb9cd73d2-kube-api-access-zzzr2\") on node \"crc\" DevicePath \"\"" Mar 20 16:00:04 crc kubenswrapper[4779]: I0320 16:00:04.505983 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-vfhjg" event={"ID":"1a413327-15d1-4578-b601-759eb9cd73d2","Type":"ContainerDied","Data":"b033cb284238f117416e3bbc9ffbc654f4d87f32b3589c8a1c885a306f7c03ee"} Mar 20 16:00:04 crc kubenswrapper[4779]: I0320 16:00:04.506021 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b033cb284238f117416e3bbc9ffbc654f4d87f32b3589c8a1c885a306f7c03ee" Mar 20 16:00:04 crc kubenswrapper[4779]: I0320 16:00:04.506072 4779 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-vfhjg" Mar 20 16:00:04 crc kubenswrapper[4779]: I0320 16:00:04.577608 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566995-zptlb"] Mar 20 16:00:04 crc kubenswrapper[4779]: I0320 16:00:04.599166 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566995-zptlb"] Mar 20 16:00:05 crc kubenswrapper[4779]: I0320 16:00:05.515018 4779 generic.go:334] "Generic (PLEG): container finished" podID="c70aa1a9-d9ff-4845-a8f9-26cbcadb893e" containerID="31cfa481a64db61aa9d8823dd7f5424c7de78ca5d19d859a3e97e6c7f7adc8b7" exitCode=0 Mar 20 16:00:05 crc kubenswrapper[4779]: I0320 16:00:05.515220 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567040-99p54" event={"ID":"c70aa1a9-d9ff-4845-a8f9-26cbcadb893e","Type":"ContainerDied","Data":"31cfa481a64db61aa9d8823dd7f5424c7de78ca5d19d859a3e97e6c7f7adc8b7"} Mar 20 16:00:05 crc kubenswrapper[4779]: I0320 16:00:05.823698 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84ee4d33-4d04-496b-b95c-61db87d00cdc" path="/var/lib/kubelet/pods/84ee4d33-4d04-496b-b95c-61db87d00cdc/volumes" Mar 20 16:00:06 crc kubenswrapper[4779]: I0320 16:00:06.855320 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567040-99p54" Mar 20 16:00:06 crc kubenswrapper[4779]: I0320 16:00:06.987880 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mcjw\" (UniqueName: \"kubernetes.io/projected/c70aa1a9-d9ff-4845-a8f9-26cbcadb893e-kube-api-access-6mcjw\") pod \"c70aa1a9-d9ff-4845-a8f9-26cbcadb893e\" (UID: \"c70aa1a9-d9ff-4845-a8f9-26cbcadb893e\") " Mar 20 16:00:06 crc kubenswrapper[4779]: I0320 16:00:06.997667 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c70aa1a9-d9ff-4845-a8f9-26cbcadb893e-kube-api-access-6mcjw" (OuterVolumeSpecName: "kube-api-access-6mcjw") pod "c70aa1a9-d9ff-4845-a8f9-26cbcadb893e" (UID: "c70aa1a9-d9ff-4845-a8f9-26cbcadb893e"). InnerVolumeSpecName "kube-api-access-6mcjw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:00:07 crc kubenswrapper[4779]: I0320 16:00:07.090792 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mcjw\" (UniqueName: \"kubernetes.io/projected/c70aa1a9-d9ff-4845-a8f9-26cbcadb893e-kube-api-access-6mcjw\") on node \"crc\" DevicePath \"\"" Mar 20 16:00:07 crc kubenswrapper[4779]: I0320 16:00:07.536300 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567040-99p54" event={"ID":"c70aa1a9-d9ff-4845-a8f9-26cbcadb893e","Type":"ContainerDied","Data":"ad979bbf1dba9ad73fd803c89d511ca05b8c5d38438c1948e67d1b034f53b2ea"} Mar 20 16:00:07 crc kubenswrapper[4779]: I0320 16:00:07.536344 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad979bbf1dba9ad73fd803c89d511ca05b8c5d38438c1948e67d1b034f53b2ea" Mar 20 16:00:07 crc kubenswrapper[4779]: I0320 16:00:07.536685 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567040-99p54" Mar 20 16:00:07 crc kubenswrapper[4779]: I0320 16:00:07.919764 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567034-zqfhn"] Mar 20 16:00:07 crc kubenswrapper[4779]: I0320 16:00:07.929948 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567034-zqfhn"] Mar 20 16:00:09 crc kubenswrapper[4779]: I0320 16:00:09.820770 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a066553c-0cc4-4dd8-bd44-1d903896201e" path="/var/lib/kubelet/pods/a066553c-0cc4-4dd8-bd44-1d903896201e/volumes" Mar 20 16:00:20 crc kubenswrapper[4779]: I0320 16:00:20.650826 4779 generic.go:334] "Generic (PLEG): container finished" podID="55a17de8-e79b-4949-8dfa-d5c4f1fd8917" containerID="0d564f926fef1cff8c3e7ef451fd383c6d09c68f8e9a0f11705977499f13ab82" exitCode=0 Mar 20 16:00:20 crc kubenswrapper[4779]: I0320 16:00:20.650916 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vlwk8" event={"ID":"55a17de8-e79b-4949-8dfa-d5c4f1fd8917","Type":"ContainerDied","Data":"0d564f926fef1cff8c3e7ef451fd383c6d09c68f8e9a0f11705977499f13ab82"} Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.071719 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vlwk8" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.092041 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a17de8-e79b-4949-8dfa-d5c4f1fd8917-ovn-combined-ca-bundle\") pod \"55a17de8-e79b-4949-8dfa-d5c4f1fd8917\" (UID: \"55a17de8-e79b-4949-8dfa-d5c4f1fd8917\") " Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.092150 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/55a17de8-e79b-4949-8dfa-d5c4f1fd8917-ssh-key-openstack-edpm-ipam\") pod \"55a17de8-e79b-4949-8dfa-d5c4f1fd8917\" (UID: \"55a17de8-e79b-4949-8dfa-d5c4f1fd8917\") " Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.092193 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmwfk\" (UniqueName: \"kubernetes.io/projected/55a17de8-e79b-4949-8dfa-d5c4f1fd8917-kube-api-access-lmwfk\") pod \"55a17de8-e79b-4949-8dfa-d5c4f1fd8917\" (UID: \"55a17de8-e79b-4949-8dfa-d5c4f1fd8917\") " Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.092224 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/55a17de8-e79b-4949-8dfa-d5c4f1fd8917-ovncontroller-config-0\") pod \"55a17de8-e79b-4949-8dfa-d5c4f1fd8917\" (UID: \"55a17de8-e79b-4949-8dfa-d5c4f1fd8917\") " Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.092370 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55a17de8-e79b-4949-8dfa-d5c4f1fd8917-inventory\") pod \"55a17de8-e79b-4949-8dfa-d5c4f1fd8917\" (UID: \"55a17de8-e79b-4949-8dfa-d5c4f1fd8917\") " Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.098530 4779 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55a17de8-e79b-4949-8dfa-d5c4f1fd8917-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "55a17de8-e79b-4949-8dfa-d5c4f1fd8917" (UID: "55a17de8-e79b-4949-8dfa-d5c4f1fd8917"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.104337 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55a17de8-e79b-4949-8dfa-d5c4f1fd8917-kube-api-access-lmwfk" (OuterVolumeSpecName: "kube-api-access-lmwfk") pod "55a17de8-e79b-4949-8dfa-d5c4f1fd8917" (UID: "55a17de8-e79b-4949-8dfa-d5c4f1fd8917"). InnerVolumeSpecName "kube-api-access-lmwfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.124444 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55a17de8-e79b-4949-8dfa-d5c4f1fd8917-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "55a17de8-e79b-4949-8dfa-d5c4f1fd8917" (UID: "55a17de8-e79b-4949-8dfa-d5c4f1fd8917"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.132270 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55a17de8-e79b-4949-8dfa-d5c4f1fd8917-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "55a17de8-e79b-4949-8dfa-d5c4f1fd8917" (UID: "55a17de8-e79b-4949-8dfa-d5c4f1fd8917"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.133012 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55a17de8-e79b-4949-8dfa-d5c4f1fd8917-inventory" (OuterVolumeSpecName: "inventory") pod "55a17de8-e79b-4949-8dfa-d5c4f1fd8917" (UID: "55a17de8-e79b-4949-8dfa-d5c4f1fd8917"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.195483 4779 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55a17de8-e79b-4949-8dfa-d5c4f1fd8917-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.195521 4779 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a17de8-e79b-4949-8dfa-d5c4f1fd8917-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.195532 4779 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/55a17de8-e79b-4949-8dfa-d5c4f1fd8917-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.195541 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmwfk\" (UniqueName: \"kubernetes.io/projected/55a17de8-e79b-4949-8dfa-d5c4f1fd8917-kube-api-access-lmwfk\") on node \"crc\" DevicePath \"\"" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.195550 4779 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/55a17de8-e79b-4949-8dfa-d5c4f1fd8917-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.670218 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vlwk8" event={"ID":"55a17de8-e79b-4949-8dfa-d5c4f1fd8917","Type":"ContainerDied","Data":"06735c27d5e758578414439a219e1a54371e9742ef6965b36655036e9cf347f2"} Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.670271 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06735c27d5e758578414439a219e1a54371e9742ef6965b36655036e9cf347f2" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.670335 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vlwk8" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.748096 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt"] Mar 20 16:00:22 crc kubenswrapper[4779]: E0320 16:00:22.748514 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a17de8-e79b-4949-8dfa-d5c4f1fd8917" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.748531 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a17de8-e79b-4949-8dfa-d5c4f1fd8917" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 20 16:00:22 crc kubenswrapper[4779]: E0320 16:00:22.748554 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c70aa1a9-d9ff-4845-a8f9-26cbcadb893e" containerName="oc" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.748560 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="c70aa1a9-d9ff-4845-a8f9-26cbcadb893e" containerName="oc" Mar 20 16:00:22 crc kubenswrapper[4779]: E0320 16:00:22.748575 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a413327-15d1-4578-b601-759eb9cd73d2" containerName="collect-profiles" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.748581 4779 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1a413327-15d1-4578-b601-759eb9cd73d2" containerName="collect-profiles" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.748758 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="c70aa1a9-d9ff-4845-a8f9-26cbcadb893e" containerName="oc" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.748778 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="55a17de8-e79b-4949-8dfa-d5c4f1fd8917" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.748796 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a413327-15d1-4578-b601-759eb9cd73d2" containerName="collect-profiles" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.749837 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.752676 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.753137 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.753305 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.753507 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.753669 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r7sbf" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.753825 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 20 
16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.760583 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt"] Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.806426 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9bbeb643-62cf-4487-b133-c1c618fe49d7-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt\" (UID: \"9bbeb643-62cf-4487-b133-c1c618fe49d7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.806475 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9bbeb643-62cf-4487-b133-c1c618fe49d7-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt\" (UID: \"9bbeb643-62cf-4487-b133-c1c618fe49d7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.806502 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bbeb643-62cf-4487-b133-c1c618fe49d7-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt\" (UID: \"9bbeb643-62cf-4487-b133-c1c618fe49d7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.806521 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9bbeb643-62cf-4487-b133-c1c618fe49d7-neutron-ovn-metadata-agent-neutron-config-0\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt\" (UID: \"9bbeb643-62cf-4487-b133-c1c618fe49d7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.806703 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9bbeb643-62cf-4487-b133-c1c618fe49d7-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt\" (UID: \"9bbeb643-62cf-4487-b133-c1c618fe49d7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.806881 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpcxt\" (UniqueName: \"kubernetes.io/projected/9bbeb643-62cf-4487-b133-c1c618fe49d7-kube-api-access-zpcxt\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt\" (UID: \"9bbeb643-62cf-4487-b133-c1c618fe49d7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.909151 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9bbeb643-62cf-4487-b133-c1c618fe49d7-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt\" (UID: \"9bbeb643-62cf-4487-b133-c1c618fe49d7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.909211 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9bbeb643-62cf-4487-b133-c1c618fe49d7-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt\" (UID: \"9bbeb643-62cf-4487-b133-c1c618fe49d7\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.909242 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bbeb643-62cf-4487-b133-c1c618fe49d7-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt\" (UID: \"9bbeb643-62cf-4487-b133-c1c618fe49d7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.909278 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9bbeb643-62cf-4487-b133-c1c618fe49d7-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt\" (UID: \"9bbeb643-62cf-4487-b133-c1c618fe49d7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.909319 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9bbeb643-62cf-4487-b133-c1c618fe49d7-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt\" (UID: \"9bbeb643-62cf-4487-b133-c1c618fe49d7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.909391 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpcxt\" (UniqueName: \"kubernetes.io/projected/9bbeb643-62cf-4487-b133-c1c618fe49d7-kube-api-access-zpcxt\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt\" (UID: \"9bbeb643-62cf-4487-b133-c1c618fe49d7\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.913567 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9bbeb643-62cf-4487-b133-c1c618fe49d7-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt\" (UID: \"9bbeb643-62cf-4487-b133-c1c618fe49d7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.913638 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bbeb643-62cf-4487-b133-c1c618fe49d7-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt\" (UID: \"9bbeb643-62cf-4487-b133-c1c618fe49d7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.913926 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9bbeb643-62cf-4487-b133-c1c618fe49d7-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt\" (UID: \"9bbeb643-62cf-4487-b133-c1c618fe49d7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.913974 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9bbeb643-62cf-4487-b133-c1c618fe49d7-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt\" (UID: \"9bbeb643-62cf-4487-b133-c1c618fe49d7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt" Mar 20 
16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.914502 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9bbeb643-62cf-4487-b133-c1c618fe49d7-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt\" (UID: \"9bbeb643-62cf-4487-b133-c1c618fe49d7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt" Mar 20 16:00:22 crc kubenswrapper[4779]: I0320 16:00:22.929434 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpcxt\" (UniqueName: \"kubernetes.io/projected/9bbeb643-62cf-4487-b133-c1c618fe49d7-kube-api-access-zpcxt\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt\" (UID: \"9bbeb643-62cf-4487-b133-c1c618fe49d7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt" Mar 20 16:00:23 crc kubenswrapper[4779]: I0320 16:00:23.070296 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt" Mar 20 16:00:23 crc kubenswrapper[4779]: I0320 16:00:23.560045 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt"] Mar 20 16:00:23 crc kubenswrapper[4779]: I0320 16:00:23.679286 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt" event={"ID":"9bbeb643-62cf-4487-b133-c1c618fe49d7","Type":"ContainerStarted","Data":"51b4cc64d4064795da4d6908de1dc3c49ef5e362f3087cb77a3c31ee835b4bd8"} Mar 20 16:00:24 crc kubenswrapper[4779]: I0320 16:00:24.688553 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt" event={"ID":"9bbeb643-62cf-4487-b133-c1c618fe49d7","Type":"ContainerStarted","Data":"e85532eeda04d90918393acf00c4e58e812469daf0721444a662876b2cf317fc"} Mar 20 16:00:24 crc 
kubenswrapper[4779]: I0320 16:00:24.735688 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt" podStartSLOduration=2.019831041 podStartE2EDuration="2.735670839s" podCreationTimestamp="2026-03-20 16:00:22 +0000 UTC" firstStartedPulling="2026-03-20 16:00:23.565572408 +0000 UTC m=+2240.528088208" lastFinishedPulling="2026-03-20 16:00:24.281412206 +0000 UTC m=+2241.243928006" observedRunningTime="2026-03-20 16:00:24.713663911 +0000 UTC m=+2241.676179711" watchObservedRunningTime="2026-03-20 16:00:24.735670839 +0000 UTC m=+2241.698186639" Mar 20 16:00:43 crc kubenswrapper[4779]: I0320 16:00:43.495853 4779 scope.go:117] "RemoveContainer" containerID="9535441a36321bb8ad0abd81aaff16dab4c8d9811e6c564e93f5d48fd6300955" Mar 20 16:00:43 crc kubenswrapper[4779]: I0320 16:00:43.533573 4779 scope.go:117] "RemoveContainer" containerID="e8bfcd1b8c2a7f0a81258715645b51ff30c7d193f2b06ab847c178e334ff3a72" Mar 20 16:00:59 crc kubenswrapper[4779]: I0320 16:00:59.737183 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-q5lvz"] Mar 20 16:00:59 crc kubenswrapper[4779]: I0320 16:00:59.741456 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q5lvz" Mar 20 16:00:59 crc kubenswrapper[4779]: I0320 16:00:59.746547 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q5lvz"] Mar 20 16:00:59 crc kubenswrapper[4779]: I0320 16:00:59.872247 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da58aacb-8c26-4153-958f-7e841a281910-utilities\") pod \"community-operators-q5lvz\" (UID: \"da58aacb-8c26-4153-958f-7e841a281910\") " pod="openshift-marketplace/community-operators-q5lvz" Mar 20 16:00:59 crc kubenswrapper[4779]: I0320 16:00:59.872300 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da58aacb-8c26-4153-958f-7e841a281910-catalog-content\") pod \"community-operators-q5lvz\" (UID: \"da58aacb-8c26-4153-958f-7e841a281910\") " pod="openshift-marketplace/community-operators-q5lvz" Mar 20 16:00:59 crc kubenswrapper[4779]: I0320 16:00:59.872586 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kpql\" (UniqueName: \"kubernetes.io/projected/da58aacb-8c26-4153-958f-7e841a281910-kube-api-access-7kpql\") pod \"community-operators-q5lvz\" (UID: \"da58aacb-8c26-4153-958f-7e841a281910\") " pod="openshift-marketplace/community-operators-q5lvz" Mar 20 16:00:59 crc kubenswrapper[4779]: I0320 16:00:59.974677 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da58aacb-8c26-4153-958f-7e841a281910-utilities\") pod \"community-operators-q5lvz\" (UID: \"da58aacb-8c26-4153-958f-7e841a281910\") " pod="openshift-marketplace/community-operators-q5lvz" Mar 20 16:00:59 crc kubenswrapper[4779]: I0320 16:00:59.974728 4779 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da58aacb-8c26-4153-958f-7e841a281910-catalog-content\") pod \"community-operators-q5lvz\" (UID: \"da58aacb-8c26-4153-958f-7e841a281910\") " pod="openshift-marketplace/community-operators-q5lvz" Mar 20 16:00:59 crc kubenswrapper[4779]: I0320 16:00:59.974830 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kpql\" (UniqueName: \"kubernetes.io/projected/da58aacb-8c26-4153-958f-7e841a281910-kube-api-access-7kpql\") pod \"community-operators-q5lvz\" (UID: \"da58aacb-8c26-4153-958f-7e841a281910\") " pod="openshift-marketplace/community-operators-q5lvz" Mar 20 16:00:59 crc kubenswrapper[4779]: I0320 16:00:59.975463 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da58aacb-8c26-4153-958f-7e841a281910-catalog-content\") pod \"community-operators-q5lvz\" (UID: \"da58aacb-8c26-4153-958f-7e841a281910\") " pod="openshift-marketplace/community-operators-q5lvz" Mar 20 16:00:59 crc kubenswrapper[4779]: I0320 16:00:59.975954 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da58aacb-8c26-4153-958f-7e841a281910-utilities\") pod \"community-operators-q5lvz\" (UID: \"da58aacb-8c26-4153-958f-7e841a281910\") " pod="openshift-marketplace/community-operators-q5lvz" Mar 20 16:00:59 crc kubenswrapper[4779]: I0320 16:00:59.995672 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kpql\" (UniqueName: \"kubernetes.io/projected/da58aacb-8c26-4153-958f-7e841a281910-kube-api-access-7kpql\") pod \"community-operators-q5lvz\" (UID: \"da58aacb-8c26-4153-958f-7e841a281910\") " pod="openshift-marketplace/community-operators-q5lvz" Mar 20 16:01:00 crc kubenswrapper[4779]: I0320 16:01:00.060342 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q5lvz" Mar 20 16:01:00 crc kubenswrapper[4779]: I0320 16:01:00.175050 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29567041-mqqts"] Mar 20 16:01:00 crc kubenswrapper[4779]: I0320 16:01:00.187201 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29567041-mqqts" Mar 20 16:01:00 crc kubenswrapper[4779]: I0320 16:01:00.208269 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29567041-mqqts"] Mar 20 16:01:00 crc kubenswrapper[4779]: I0320 16:01:00.366326 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d9fcb2d-01f5-4d3b-8586-b0b700e68755-config-data\") pod \"keystone-cron-29567041-mqqts\" (UID: \"1d9fcb2d-01f5-4d3b-8586-b0b700e68755\") " pod="openstack/keystone-cron-29567041-mqqts" Mar 20 16:01:00 crc kubenswrapper[4779]: I0320 16:01:00.366693 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzq27\" (UniqueName: \"kubernetes.io/projected/1d9fcb2d-01f5-4d3b-8586-b0b700e68755-kube-api-access-rzq27\") pod \"keystone-cron-29567041-mqqts\" (UID: \"1d9fcb2d-01f5-4d3b-8586-b0b700e68755\") " pod="openstack/keystone-cron-29567041-mqqts" Mar 20 16:01:00 crc kubenswrapper[4779]: I0320 16:01:00.366774 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1d9fcb2d-01f5-4d3b-8586-b0b700e68755-fernet-keys\") pod \"keystone-cron-29567041-mqqts\" (UID: \"1d9fcb2d-01f5-4d3b-8586-b0b700e68755\") " pod="openstack/keystone-cron-29567041-mqqts" Mar 20 16:01:00 crc kubenswrapper[4779]: I0320 16:01:00.366824 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/1d9fcb2d-01f5-4d3b-8586-b0b700e68755-combined-ca-bundle\") pod \"keystone-cron-29567041-mqqts\" (UID: \"1d9fcb2d-01f5-4d3b-8586-b0b700e68755\") " pod="openstack/keystone-cron-29567041-mqqts" Mar 20 16:01:00 crc kubenswrapper[4779]: I0320 16:01:00.467997 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1d9fcb2d-01f5-4d3b-8586-b0b700e68755-fernet-keys\") pod \"keystone-cron-29567041-mqqts\" (UID: \"1d9fcb2d-01f5-4d3b-8586-b0b700e68755\") " pod="openstack/keystone-cron-29567041-mqqts" Mar 20 16:01:00 crc kubenswrapper[4779]: I0320 16:01:00.468074 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d9fcb2d-01f5-4d3b-8586-b0b700e68755-combined-ca-bundle\") pod \"keystone-cron-29567041-mqqts\" (UID: \"1d9fcb2d-01f5-4d3b-8586-b0b700e68755\") " pod="openstack/keystone-cron-29567041-mqqts" Mar 20 16:01:00 crc kubenswrapper[4779]: I0320 16:01:00.469423 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d9fcb2d-01f5-4d3b-8586-b0b700e68755-config-data\") pod \"keystone-cron-29567041-mqqts\" (UID: \"1d9fcb2d-01f5-4d3b-8586-b0b700e68755\") " pod="openstack/keystone-cron-29567041-mqqts" Mar 20 16:01:00 crc kubenswrapper[4779]: I0320 16:01:00.469472 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzq27\" (UniqueName: \"kubernetes.io/projected/1d9fcb2d-01f5-4d3b-8586-b0b700e68755-kube-api-access-rzq27\") pod \"keystone-cron-29567041-mqqts\" (UID: \"1d9fcb2d-01f5-4d3b-8586-b0b700e68755\") " pod="openstack/keystone-cron-29567041-mqqts" Mar 20 16:01:00 crc kubenswrapper[4779]: I0320 16:01:00.475403 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1d9fcb2d-01f5-4d3b-8586-b0b700e68755-combined-ca-bundle\") pod \"keystone-cron-29567041-mqqts\" (UID: \"1d9fcb2d-01f5-4d3b-8586-b0b700e68755\") " pod="openstack/keystone-cron-29567041-mqqts" Mar 20 16:01:00 crc kubenswrapper[4779]: I0320 16:01:00.483064 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d9fcb2d-01f5-4d3b-8586-b0b700e68755-config-data\") pod \"keystone-cron-29567041-mqqts\" (UID: \"1d9fcb2d-01f5-4d3b-8586-b0b700e68755\") " pod="openstack/keystone-cron-29567041-mqqts" Mar 20 16:01:00 crc kubenswrapper[4779]: I0320 16:01:00.485660 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1d9fcb2d-01f5-4d3b-8586-b0b700e68755-fernet-keys\") pod \"keystone-cron-29567041-mqqts\" (UID: \"1d9fcb2d-01f5-4d3b-8586-b0b700e68755\") " pod="openstack/keystone-cron-29567041-mqqts" Mar 20 16:01:00 crc kubenswrapper[4779]: I0320 16:01:00.489843 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzq27\" (UniqueName: \"kubernetes.io/projected/1d9fcb2d-01f5-4d3b-8586-b0b700e68755-kube-api-access-rzq27\") pod \"keystone-cron-29567041-mqqts\" (UID: \"1d9fcb2d-01f5-4d3b-8586-b0b700e68755\") " pod="openstack/keystone-cron-29567041-mqqts" Mar 20 16:01:00 crc kubenswrapper[4779]: I0320 16:01:00.584693 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29567041-mqqts" Mar 20 16:01:00 crc kubenswrapper[4779]: I0320 16:01:00.657269 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q5lvz"] Mar 20 16:01:01 crc kubenswrapper[4779]: I0320 16:01:01.047391 4779 generic.go:334] "Generic (PLEG): container finished" podID="da58aacb-8c26-4153-958f-7e841a281910" containerID="9cceea8c4c71f7dbed66e2f2665c4de3a67f40cfed93e49cb97b83dcf2f17c2c" exitCode=0 Mar 20 16:01:01 crc kubenswrapper[4779]: I0320 16:01:01.047514 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q5lvz" event={"ID":"da58aacb-8c26-4153-958f-7e841a281910","Type":"ContainerDied","Data":"9cceea8c4c71f7dbed66e2f2665c4de3a67f40cfed93e49cb97b83dcf2f17c2c"} Mar 20 16:01:01 crc kubenswrapper[4779]: I0320 16:01:01.047729 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q5lvz" event={"ID":"da58aacb-8c26-4153-958f-7e841a281910","Type":"ContainerStarted","Data":"f4b325f02e0526b990e2784ca2c48763bcbc612f1ac252bbab371a33f014ef18"} Mar 20 16:01:01 crc kubenswrapper[4779]: I0320 16:01:01.075381 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29567041-mqqts"] Mar 20 16:01:01 crc kubenswrapper[4779]: W0320 16:01:01.080246 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d9fcb2d_01f5_4d3b_8586_b0b700e68755.slice/crio-8cfd12ad29e0f0e7e939dbd31ad34656a377da7521aa8dc61660d62e7fe4f0ed WatchSource:0}: Error finding container 8cfd12ad29e0f0e7e939dbd31ad34656a377da7521aa8dc61660d62e7fe4f0ed: Status 404 returned error can't find the container with id 8cfd12ad29e0f0e7e939dbd31ad34656a377da7521aa8dc61660d62e7fe4f0ed Mar 20 16:01:02 crc kubenswrapper[4779]: I0320 16:01:02.060193 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-cron-29567041-mqqts" event={"ID":"1d9fcb2d-01f5-4d3b-8586-b0b700e68755","Type":"ContainerStarted","Data":"13d52e90d97bf4b3fb06830dac34d093b65085b790f8d57c7676b8e9bfb4dd8f"} Mar 20 16:01:02 crc kubenswrapper[4779]: I0320 16:01:02.060541 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567041-mqqts" event={"ID":"1d9fcb2d-01f5-4d3b-8586-b0b700e68755","Type":"ContainerStarted","Data":"8cfd12ad29e0f0e7e939dbd31ad34656a377da7521aa8dc61660d62e7fe4f0ed"} Mar 20 16:01:02 crc kubenswrapper[4779]: I0320 16:01:02.067169 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q5lvz" event={"ID":"da58aacb-8c26-4153-958f-7e841a281910","Type":"ContainerStarted","Data":"1bb85fad133a0119998c74a35b51a2d9816bef98d397e8ea40cbc4896b739197"} Mar 20 16:01:02 crc kubenswrapper[4779]: I0320 16:01:02.087776 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29567041-mqqts" podStartSLOduration=2.087760825 podStartE2EDuration="2.087760825s" podCreationTimestamp="2026-03-20 16:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:01:02.083252754 +0000 UTC m=+2279.045768554" watchObservedRunningTime="2026-03-20 16:01:02.087760825 +0000 UTC m=+2279.050276625" Mar 20 16:01:04 crc kubenswrapper[4779]: I0320 16:01:04.085270 4779 generic.go:334] "Generic (PLEG): container finished" podID="da58aacb-8c26-4153-958f-7e841a281910" containerID="1bb85fad133a0119998c74a35b51a2d9816bef98d397e8ea40cbc4896b739197" exitCode=0 Mar 20 16:01:04 crc kubenswrapper[4779]: I0320 16:01:04.085375 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q5lvz" event={"ID":"da58aacb-8c26-4153-958f-7e841a281910","Type":"ContainerDied","Data":"1bb85fad133a0119998c74a35b51a2d9816bef98d397e8ea40cbc4896b739197"} Mar 20 16:01:04 
crc kubenswrapper[4779]: I0320 16:01:04.091249 4779 generic.go:334] "Generic (PLEG): container finished" podID="1d9fcb2d-01f5-4d3b-8586-b0b700e68755" containerID="13d52e90d97bf4b3fb06830dac34d093b65085b790f8d57c7676b8e9bfb4dd8f" exitCode=0 Mar 20 16:01:04 crc kubenswrapper[4779]: I0320 16:01:04.091296 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567041-mqqts" event={"ID":"1d9fcb2d-01f5-4d3b-8586-b0b700e68755","Type":"ContainerDied","Data":"13d52e90d97bf4b3fb06830dac34d093b65085b790f8d57c7676b8e9bfb4dd8f"} Mar 20 16:01:05 crc kubenswrapper[4779]: I0320 16:01:05.464723 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29567041-mqqts" Mar 20 16:01:05 crc kubenswrapper[4779]: I0320 16:01:05.580247 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzq27\" (UniqueName: \"kubernetes.io/projected/1d9fcb2d-01f5-4d3b-8586-b0b700e68755-kube-api-access-rzq27\") pod \"1d9fcb2d-01f5-4d3b-8586-b0b700e68755\" (UID: \"1d9fcb2d-01f5-4d3b-8586-b0b700e68755\") " Mar 20 16:01:05 crc kubenswrapper[4779]: I0320 16:01:05.580696 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d9fcb2d-01f5-4d3b-8586-b0b700e68755-config-data\") pod \"1d9fcb2d-01f5-4d3b-8586-b0b700e68755\" (UID: \"1d9fcb2d-01f5-4d3b-8586-b0b700e68755\") " Mar 20 16:01:05 crc kubenswrapper[4779]: I0320 16:01:05.580874 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d9fcb2d-01f5-4d3b-8586-b0b700e68755-combined-ca-bundle\") pod \"1d9fcb2d-01f5-4d3b-8586-b0b700e68755\" (UID: \"1d9fcb2d-01f5-4d3b-8586-b0b700e68755\") " Mar 20 16:01:05 crc kubenswrapper[4779]: I0320 16:01:05.581501 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/1d9fcb2d-01f5-4d3b-8586-b0b700e68755-fernet-keys\") pod \"1d9fcb2d-01f5-4d3b-8586-b0b700e68755\" (UID: \"1d9fcb2d-01f5-4d3b-8586-b0b700e68755\") " Mar 20 16:01:05 crc kubenswrapper[4779]: I0320 16:01:05.586341 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d9fcb2d-01f5-4d3b-8586-b0b700e68755-kube-api-access-rzq27" (OuterVolumeSpecName: "kube-api-access-rzq27") pod "1d9fcb2d-01f5-4d3b-8586-b0b700e68755" (UID: "1d9fcb2d-01f5-4d3b-8586-b0b700e68755"). InnerVolumeSpecName "kube-api-access-rzq27". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:01:05 crc kubenswrapper[4779]: I0320 16:01:05.587229 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d9fcb2d-01f5-4d3b-8586-b0b700e68755-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1d9fcb2d-01f5-4d3b-8586-b0b700e68755" (UID: "1d9fcb2d-01f5-4d3b-8586-b0b700e68755"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:01:05 crc kubenswrapper[4779]: I0320 16:01:05.608198 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d9fcb2d-01f5-4d3b-8586-b0b700e68755-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d9fcb2d-01f5-4d3b-8586-b0b700e68755" (UID: "1d9fcb2d-01f5-4d3b-8586-b0b700e68755"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:01:05 crc kubenswrapper[4779]: I0320 16:01:05.633990 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d9fcb2d-01f5-4d3b-8586-b0b700e68755-config-data" (OuterVolumeSpecName: "config-data") pod "1d9fcb2d-01f5-4d3b-8586-b0b700e68755" (UID: "1d9fcb2d-01f5-4d3b-8586-b0b700e68755"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:01:05 crc kubenswrapper[4779]: I0320 16:01:05.684888 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzq27\" (UniqueName: \"kubernetes.io/projected/1d9fcb2d-01f5-4d3b-8586-b0b700e68755-kube-api-access-rzq27\") on node \"crc\" DevicePath \"\"" Mar 20 16:01:05 crc kubenswrapper[4779]: I0320 16:01:05.684925 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d9fcb2d-01f5-4d3b-8586-b0b700e68755-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:01:05 crc kubenswrapper[4779]: I0320 16:01:05.684940 4779 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d9fcb2d-01f5-4d3b-8586-b0b700e68755-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:01:05 crc kubenswrapper[4779]: I0320 16:01:05.684951 4779 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1d9fcb2d-01f5-4d3b-8586-b0b700e68755-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 16:01:06 crc kubenswrapper[4779]: I0320 16:01:06.111701 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567041-mqqts" event={"ID":"1d9fcb2d-01f5-4d3b-8586-b0b700e68755","Type":"ContainerDied","Data":"8cfd12ad29e0f0e7e939dbd31ad34656a377da7521aa8dc61660d62e7fe4f0ed"} Mar 20 16:01:06 crc kubenswrapper[4779]: I0320 16:01:06.111749 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cfd12ad29e0f0e7e939dbd31ad34656a377da7521aa8dc61660d62e7fe4f0ed" Mar 20 16:01:06 crc kubenswrapper[4779]: I0320 16:01:06.111720 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29567041-mqqts" Mar 20 16:01:06 crc kubenswrapper[4779]: I0320 16:01:06.113776 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q5lvz" event={"ID":"da58aacb-8c26-4153-958f-7e841a281910","Type":"ContainerStarted","Data":"88fc7fe439a8bbca8cda3d67875b84d0f6dec4bbc79cd971164ba51273f52817"} Mar 20 16:01:06 crc kubenswrapper[4779]: I0320 16:01:06.144038 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q5lvz" podStartSLOduration=2.757436919 podStartE2EDuration="7.144014973s" podCreationTimestamp="2026-03-20 16:00:59 +0000 UTC" firstStartedPulling="2026-03-20 16:01:01.050022947 +0000 UTC m=+2278.012538747" lastFinishedPulling="2026-03-20 16:01:05.436601001 +0000 UTC m=+2282.399116801" observedRunningTime="2026-03-20 16:01:06.129441175 +0000 UTC m=+2283.091956975" watchObservedRunningTime="2026-03-20 16:01:06.144014973 +0000 UTC m=+2283.106530763" Mar 20 16:01:10 crc kubenswrapper[4779]: I0320 16:01:10.061292 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-q5lvz" Mar 20 16:01:10 crc kubenswrapper[4779]: I0320 16:01:10.061881 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-q5lvz" Mar 20 16:01:10 crc kubenswrapper[4779]: I0320 16:01:10.129330 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q5lvz" Mar 20 16:01:10 crc kubenswrapper[4779]: I0320 16:01:10.190238 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q5lvz" Mar 20 16:01:10 crc kubenswrapper[4779]: I0320 16:01:10.370425 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q5lvz"] Mar 20 16:01:11 crc 
kubenswrapper[4779]: I0320 16:01:11.154615 4779 generic.go:334] "Generic (PLEG): container finished" podID="9bbeb643-62cf-4487-b133-c1c618fe49d7" containerID="e85532eeda04d90918393acf00c4e58e812469daf0721444a662876b2cf317fc" exitCode=0 Mar 20 16:01:11 crc kubenswrapper[4779]: I0320 16:01:11.154837 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt" event={"ID":"9bbeb643-62cf-4487-b133-c1c618fe49d7","Type":"ContainerDied","Data":"e85532eeda04d90918393acf00c4e58e812469daf0721444a662876b2cf317fc"} Mar 20 16:01:12 crc kubenswrapper[4779]: I0320 16:01:12.162242 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-q5lvz" podUID="da58aacb-8c26-4153-958f-7e841a281910" containerName="registry-server" containerID="cri-o://88fc7fe439a8bbca8cda3d67875b84d0f6dec4bbc79cd971164ba51273f52817" gracePeriod=2 Mar 20 16:01:12 crc kubenswrapper[4779]: I0320 16:01:12.765384 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt" Mar 20 16:01:12 crc kubenswrapper[4779]: I0320 16:01:12.771565 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q5lvz" Mar 20 16:01:12 crc kubenswrapper[4779]: I0320 16:01:12.934305 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bbeb643-62cf-4487-b133-c1c618fe49d7-neutron-metadata-combined-ca-bundle\") pod \"9bbeb643-62cf-4487-b133-c1c618fe49d7\" (UID: \"9bbeb643-62cf-4487-b133-c1c618fe49d7\") " Mar 20 16:01:12 crc kubenswrapper[4779]: I0320 16:01:12.934376 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9bbeb643-62cf-4487-b133-c1c618fe49d7-neutron-ovn-metadata-agent-neutron-config-0\") pod \"9bbeb643-62cf-4487-b133-c1c618fe49d7\" (UID: \"9bbeb643-62cf-4487-b133-c1c618fe49d7\") " Mar 20 16:01:12 crc kubenswrapper[4779]: I0320 16:01:12.934404 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9bbeb643-62cf-4487-b133-c1c618fe49d7-inventory\") pod \"9bbeb643-62cf-4487-b133-c1c618fe49d7\" (UID: \"9bbeb643-62cf-4487-b133-c1c618fe49d7\") " Mar 20 16:01:12 crc kubenswrapper[4779]: I0320 16:01:12.934430 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da58aacb-8c26-4153-958f-7e841a281910-catalog-content\") pod \"da58aacb-8c26-4153-958f-7e841a281910\" (UID: \"da58aacb-8c26-4153-958f-7e841a281910\") " Mar 20 16:01:12 crc kubenswrapper[4779]: I0320 16:01:12.934479 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da58aacb-8c26-4153-958f-7e841a281910-utilities\") pod \"da58aacb-8c26-4153-958f-7e841a281910\" (UID: \"da58aacb-8c26-4153-958f-7e841a281910\") " Mar 20 16:01:12 crc kubenswrapper[4779]: I0320 16:01:12.934504 4779 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9bbeb643-62cf-4487-b133-c1c618fe49d7-ssh-key-openstack-edpm-ipam\") pod \"9bbeb643-62cf-4487-b133-c1c618fe49d7\" (UID: \"9bbeb643-62cf-4487-b133-c1c618fe49d7\") " Mar 20 16:01:12 crc kubenswrapper[4779]: I0320 16:01:12.934552 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpcxt\" (UniqueName: \"kubernetes.io/projected/9bbeb643-62cf-4487-b133-c1c618fe49d7-kube-api-access-zpcxt\") pod \"9bbeb643-62cf-4487-b133-c1c618fe49d7\" (UID: \"9bbeb643-62cf-4487-b133-c1c618fe49d7\") " Mar 20 16:01:12 crc kubenswrapper[4779]: I0320 16:01:12.934569 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kpql\" (UniqueName: \"kubernetes.io/projected/da58aacb-8c26-4153-958f-7e841a281910-kube-api-access-7kpql\") pod \"da58aacb-8c26-4153-958f-7e841a281910\" (UID: \"da58aacb-8c26-4153-958f-7e841a281910\") " Mar 20 16:01:12 crc kubenswrapper[4779]: I0320 16:01:12.934664 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9bbeb643-62cf-4487-b133-c1c618fe49d7-nova-metadata-neutron-config-0\") pod \"9bbeb643-62cf-4487-b133-c1c618fe49d7\" (UID: \"9bbeb643-62cf-4487-b133-c1c618fe49d7\") " Mar 20 16:01:12 crc kubenswrapper[4779]: I0320 16:01:12.935789 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da58aacb-8c26-4153-958f-7e841a281910-utilities" (OuterVolumeSpecName: "utilities") pod "da58aacb-8c26-4153-958f-7e841a281910" (UID: "da58aacb-8c26-4153-958f-7e841a281910"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:01:12 crc kubenswrapper[4779]: I0320 16:01:12.941033 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bbeb643-62cf-4487-b133-c1c618fe49d7-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9bbeb643-62cf-4487-b133-c1c618fe49d7" (UID: "9bbeb643-62cf-4487-b133-c1c618fe49d7"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:01:12 crc kubenswrapper[4779]: I0320 16:01:12.942211 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bbeb643-62cf-4487-b133-c1c618fe49d7-kube-api-access-zpcxt" (OuterVolumeSpecName: "kube-api-access-zpcxt") pod "9bbeb643-62cf-4487-b133-c1c618fe49d7" (UID: "9bbeb643-62cf-4487-b133-c1c618fe49d7"). InnerVolumeSpecName "kube-api-access-zpcxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:01:12 crc kubenswrapper[4779]: I0320 16:01:12.942339 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da58aacb-8c26-4153-958f-7e841a281910-kube-api-access-7kpql" (OuterVolumeSpecName: "kube-api-access-7kpql") pod "da58aacb-8c26-4153-958f-7e841a281910" (UID: "da58aacb-8c26-4153-958f-7e841a281910"). InnerVolumeSpecName "kube-api-access-7kpql". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:01:12 crc kubenswrapper[4779]: I0320 16:01:12.974349 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bbeb643-62cf-4487-b133-c1c618fe49d7-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "9bbeb643-62cf-4487-b133-c1c618fe49d7" (UID: "9bbeb643-62cf-4487-b133-c1c618fe49d7"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:01:12 crc kubenswrapper[4779]: I0320 16:01:12.990934 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bbeb643-62cf-4487-b133-c1c618fe49d7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9bbeb643-62cf-4487-b133-c1c618fe49d7" (UID: "9bbeb643-62cf-4487-b133-c1c618fe49d7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:01:12 crc kubenswrapper[4779]: I0320 16:01:12.991878 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bbeb643-62cf-4487-b133-c1c618fe49d7-inventory" (OuterVolumeSpecName: "inventory") pod "9bbeb643-62cf-4487-b133-c1c618fe49d7" (UID: "9bbeb643-62cf-4487-b133-c1c618fe49d7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.000893 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bbeb643-62cf-4487-b133-c1c618fe49d7-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "9bbeb643-62cf-4487-b133-c1c618fe49d7" (UID: "9bbeb643-62cf-4487-b133-c1c618fe49d7"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.027138 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da58aacb-8c26-4153-958f-7e841a281910-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da58aacb-8c26-4153-958f-7e841a281910" (UID: "da58aacb-8c26-4153-958f-7e841a281910"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.036872 4779 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9bbeb643-62cf-4487-b133-c1c618fe49d7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.036909 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpcxt\" (UniqueName: \"kubernetes.io/projected/9bbeb643-62cf-4487-b133-c1c618fe49d7-kube-api-access-zpcxt\") on node \"crc\" DevicePath \"\"" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.036918 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kpql\" (UniqueName: \"kubernetes.io/projected/da58aacb-8c26-4153-958f-7e841a281910-kube-api-access-7kpql\") on node \"crc\" DevicePath \"\"" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.036928 4779 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9bbeb643-62cf-4487-b133-c1c618fe49d7-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.036939 4779 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bbeb643-62cf-4487-b133-c1c618fe49d7-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.036952 4779 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9bbeb643-62cf-4487-b133-c1c618fe49d7-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.036964 4779 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/9bbeb643-62cf-4487-b133-c1c618fe49d7-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.036974 4779 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da58aacb-8c26-4153-958f-7e841a281910-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.036983 4779 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da58aacb-8c26-4153-958f-7e841a281910-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.173627 4779 generic.go:334] "Generic (PLEG): container finished" podID="da58aacb-8c26-4153-958f-7e841a281910" containerID="88fc7fe439a8bbca8cda3d67875b84d0f6dec4bbc79cd971164ba51273f52817" exitCode=0 Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.173704 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q5lvz" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.173713 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q5lvz" event={"ID":"da58aacb-8c26-4153-958f-7e841a281910","Type":"ContainerDied","Data":"88fc7fe439a8bbca8cda3d67875b84d0f6dec4bbc79cd971164ba51273f52817"} Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.173775 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q5lvz" event={"ID":"da58aacb-8c26-4153-958f-7e841a281910","Type":"ContainerDied","Data":"f4b325f02e0526b990e2784ca2c48763bcbc612f1ac252bbab371a33f014ef18"} Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.173793 4779 scope.go:117] "RemoveContainer" containerID="88fc7fe439a8bbca8cda3d67875b84d0f6dec4bbc79cd971164ba51273f52817" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.175249 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt" event={"ID":"9bbeb643-62cf-4487-b133-c1c618fe49d7","Type":"ContainerDied","Data":"51b4cc64d4064795da4d6908de1dc3c49ef5e362f3087cb77a3c31ee835b4bd8"} Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.175282 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51b4cc64d4064795da4d6908de1dc3c49ef5e362f3087cb77a3c31ee835b4bd8" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.175346 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.195563 4779 scope.go:117] "RemoveContainer" containerID="1bb85fad133a0119998c74a35b51a2d9816bef98d397e8ea40cbc4896b739197" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.230989 4779 scope.go:117] "RemoveContainer" containerID="9cceea8c4c71f7dbed66e2f2665c4de3a67f40cfed93e49cb97b83dcf2f17c2c" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.251785 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q5lvz"] Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.261744 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-q5lvz"] Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.263213 4779 scope.go:117] "RemoveContainer" containerID="88fc7fe439a8bbca8cda3d67875b84d0f6dec4bbc79cd971164ba51273f52817" Mar 20 16:01:13 crc kubenswrapper[4779]: E0320 16:01:13.263641 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88fc7fe439a8bbca8cda3d67875b84d0f6dec4bbc79cd971164ba51273f52817\": container with ID starting with 88fc7fe439a8bbca8cda3d67875b84d0f6dec4bbc79cd971164ba51273f52817 not found: ID does not exist" containerID="88fc7fe439a8bbca8cda3d67875b84d0f6dec4bbc79cd971164ba51273f52817" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.263687 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88fc7fe439a8bbca8cda3d67875b84d0f6dec4bbc79cd971164ba51273f52817"} err="failed to get container status \"88fc7fe439a8bbca8cda3d67875b84d0f6dec4bbc79cd971164ba51273f52817\": rpc error: code = NotFound desc = could not find container \"88fc7fe439a8bbca8cda3d67875b84d0f6dec4bbc79cd971164ba51273f52817\": container with ID starting with 
88fc7fe439a8bbca8cda3d67875b84d0f6dec4bbc79cd971164ba51273f52817 not found: ID does not exist" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.263717 4779 scope.go:117] "RemoveContainer" containerID="1bb85fad133a0119998c74a35b51a2d9816bef98d397e8ea40cbc4896b739197" Mar 20 16:01:13 crc kubenswrapper[4779]: E0320 16:01:13.264041 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bb85fad133a0119998c74a35b51a2d9816bef98d397e8ea40cbc4896b739197\": container with ID starting with 1bb85fad133a0119998c74a35b51a2d9816bef98d397e8ea40cbc4896b739197 not found: ID does not exist" containerID="1bb85fad133a0119998c74a35b51a2d9816bef98d397e8ea40cbc4896b739197" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.264068 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bb85fad133a0119998c74a35b51a2d9816bef98d397e8ea40cbc4896b739197"} err="failed to get container status \"1bb85fad133a0119998c74a35b51a2d9816bef98d397e8ea40cbc4896b739197\": rpc error: code = NotFound desc = could not find container \"1bb85fad133a0119998c74a35b51a2d9816bef98d397e8ea40cbc4896b739197\": container with ID starting with 1bb85fad133a0119998c74a35b51a2d9816bef98d397e8ea40cbc4896b739197 not found: ID does not exist" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.264091 4779 scope.go:117] "RemoveContainer" containerID="9cceea8c4c71f7dbed66e2f2665c4de3a67f40cfed93e49cb97b83dcf2f17c2c" Mar 20 16:01:13 crc kubenswrapper[4779]: E0320 16:01:13.265371 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cceea8c4c71f7dbed66e2f2665c4de3a67f40cfed93e49cb97b83dcf2f17c2c\": container with ID starting with 9cceea8c4c71f7dbed66e2f2665c4de3a67f40cfed93e49cb97b83dcf2f17c2c not found: ID does not exist" containerID="9cceea8c4c71f7dbed66e2f2665c4de3a67f40cfed93e49cb97b83dcf2f17c2c" Mar 20 16:01:13 crc 
kubenswrapper[4779]: I0320 16:01:13.265415 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cceea8c4c71f7dbed66e2f2665c4de3a67f40cfed93e49cb97b83dcf2f17c2c"} err="failed to get container status \"9cceea8c4c71f7dbed66e2f2665c4de3a67f40cfed93e49cb97b83dcf2f17c2c\": rpc error: code = NotFound desc = could not find container \"9cceea8c4c71f7dbed66e2f2665c4de3a67f40cfed93e49cb97b83dcf2f17c2c\": container with ID starting with 9cceea8c4c71f7dbed66e2f2665c4de3a67f40cfed93e49cb97b83dcf2f17c2c not found: ID does not exist" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.273987 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-44j5q"] Mar 20 16:01:13 crc kubenswrapper[4779]: E0320 16:01:13.274460 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bbeb643-62cf-4487-b133-c1c618fe49d7" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.274474 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bbeb643-62cf-4487-b133-c1c618fe49d7" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 20 16:01:13 crc kubenswrapper[4779]: E0320 16:01:13.274499 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da58aacb-8c26-4153-958f-7e841a281910" containerName="extract-content" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.274505 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="da58aacb-8c26-4153-958f-7e841a281910" containerName="extract-content" Mar 20 16:01:13 crc kubenswrapper[4779]: E0320 16:01:13.274525 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da58aacb-8c26-4153-958f-7e841a281910" containerName="extract-utilities" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.274531 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="da58aacb-8c26-4153-958f-7e841a281910" 
containerName="extract-utilities" Mar 20 16:01:13 crc kubenswrapper[4779]: E0320 16:01:13.274542 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d9fcb2d-01f5-4d3b-8586-b0b700e68755" containerName="keystone-cron" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.274548 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d9fcb2d-01f5-4d3b-8586-b0b700e68755" containerName="keystone-cron" Mar 20 16:01:13 crc kubenswrapper[4779]: E0320 16:01:13.274555 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da58aacb-8c26-4153-958f-7e841a281910" containerName="registry-server" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.274560 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="da58aacb-8c26-4153-958f-7e841a281910" containerName="registry-server" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.274739 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bbeb643-62cf-4487-b133-c1c618fe49d7" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.274756 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d9fcb2d-01f5-4d3b-8586-b0b700e68755" containerName="keystone-cron" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.274770 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="da58aacb-8c26-4153-958f-7e841a281910" containerName="registry-server" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.275713 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-44j5q" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.282414 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.282762 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.283019 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.283228 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.283413 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r7sbf" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.285695 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-44j5q"] Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.443996 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdrlp\" (UniqueName: \"kubernetes.io/projected/6e8b60eb-d9f8-4956-91eb-9d0b760f4df1-kube-api-access-mdrlp\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-44j5q\" (UID: \"6e8b60eb-d9f8-4956-91eb-9d0b760f4df1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-44j5q" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.444154 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6e8b60eb-d9f8-4956-91eb-9d0b760f4df1-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-44j5q\" (UID: 
\"6e8b60eb-d9f8-4956-91eb-9d0b760f4df1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-44j5q" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.444181 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e8b60eb-d9f8-4956-91eb-9d0b760f4df1-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-44j5q\" (UID: \"6e8b60eb-d9f8-4956-91eb-9d0b760f4df1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-44j5q" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.444206 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6e8b60eb-d9f8-4956-91eb-9d0b760f4df1-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-44j5q\" (UID: \"6e8b60eb-d9f8-4956-91eb-9d0b760f4df1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-44j5q" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.444464 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e8b60eb-d9f8-4956-91eb-9d0b760f4df1-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-44j5q\" (UID: \"6e8b60eb-d9f8-4956-91eb-9d0b760f4df1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-44j5q" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.546944 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdrlp\" (UniqueName: \"kubernetes.io/projected/6e8b60eb-d9f8-4956-91eb-9d0b760f4df1-kube-api-access-mdrlp\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-44j5q\" (UID: \"6e8b60eb-d9f8-4956-91eb-9d0b760f4df1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-44j5q" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.547060 4779 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6e8b60eb-d9f8-4956-91eb-9d0b760f4df1-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-44j5q\" (UID: \"6e8b60eb-d9f8-4956-91eb-9d0b760f4df1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-44j5q" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.547086 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e8b60eb-d9f8-4956-91eb-9d0b760f4df1-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-44j5q\" (UID: \"6e8b60eb-d9f8-4956-91eb-9d0b760f4df1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-44j5q" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.547126 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6e8b60eb-d9f8-4956-91eb-9d0b760f4df1-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-44j5q\" (UID: \"6e8b60eb-d9f8-4956-91eb-9d0b760f4df1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-44j5q" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.547193 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e8b60eb-d9f8-4956-91eb-9d0b760f4df1-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-44j5q\" (UID: \"6e8b60eb-d9f8-4956-91eb-9d0b760f4df1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-44j5q" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.550984 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6e8b60eb-d9f8-4956-91eb-9d0b760f4df1-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-44j5q\" 
(UID: \"6e8b60eb-d9f8-4956-91eb-9d0b760f4df1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-44j5q" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.551170 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e8b60eb-d9f8-4956-91eb-9d0b760f4df1-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-44j5q\" (UID: \"6e8b60eb-d9f8-4956-91eb-9d0b760f4df1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-44j5q" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.552172 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e8b60eb-d9f8-4956-91eb-9d0b760f4df1-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-44j5q\" (UID: \"6e8b60eb-d9f8-4956-91eb-9d0b760f4df1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-44j5q" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.552897 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6e8b60eb-d9f8-4956-91eb-9d0b760f4df1-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-44j5q\" (UID: \"6e8b60eb-d9f8-4956-91eb-9d0b760f4df1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-44j5q" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.573268 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdrlp\" (UniqueName: \"kubernetes.io/projected/6e8b60eb-d9f8-4956-91eb-9d0b760f4df1-kube-api-access-mdrlp\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-44j5q\" (UID: \"6e8b60eb-d9f8-4956-91eb-9d0b760f4df1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-44j5q" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.643651 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-44j5q" Mar 20 16:01:13 crc kubenswrapper[4779]: I0320 16:01:13.831171 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da58aacb-8c26-4153-958f-7e841a281910" path="/var/lib/kubelet/pods/da58aacb-8c26-4153-958f-7e841a281910/volumes" Mar 20 16:01:14 crc kubenswrapper[4779]: I0320 16:01:14.147995 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-44j5q"] Mar 20 16:01:14 crc kubenswrapper[4779]: I0320 16:01:14.184375 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-44j5q" event={"ID":"6e8b60eb-d9f8-4956-91eb-9d0b760f4df1","Type":"ContainerStarted","Data":"243c29a75c707c1f482894086eb3270ebfd33a75f254b0a30a4e966fc733c0fa"} Mar 20 16:01:15 crc kubenswrapper[4779]: I0320 16:01:15.194974 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-44j5q" event={"ID":"6e8b60eb-d9f8-4956-91eb-9d0b760f4df1","Type":"ContainerStarted","Data":"be0be56802e95fe1a932b4c66847308fdc05a24826fd1e3977af13f39a8742bc"} Mar 20 16:01:15 crc kubenswrapper[4779]: I0320 16:01:15.220890 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-44j5q" podStartSLOduration=1.769489324 podStartE2EDuration="2.220845541s" podCreationTimestamp="2026-03-20 16:01:13 +0000 UTC" firstStartedPulling="2026-03-20 16:01:14.155517223 +0000 UTC m=+2291.118033023" lastFinishedPulling="2026-03-20 16:01:14.60687344 +0000 UTC m=+2291.569389240" observedRunningTime="2026-03-20 16:01:15.220020641 +0000 UTC m=+2292.182536441" watchObservedRunningTime="2026-03-20 16:01:15.220845541 +0000 UTC m=+2292.183361381" Mar 20 16:01:25 crc kubenswrapper[4779]: I0320 16:01:25.149997 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:01:25 crc kubenswrapper[4779]: I0320 16:01:25.150624 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:01:55 crc kubenswrapper[4779]: I0320 16:01:55.150166 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:01:55 crc kubenswrapper[4779]: I0320 16:01:55.150762 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:02:00 crc kubenswrapper[4779]: I0320 16:02:00.153058 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567042-c2x9d"] Mar 20 16:02:00 crc kubenswrapper[4779]: I0320 16:02:00.154849 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567042-c2x9d" Mar 20 16:02:00 crc kubenswrapper[4779]: I0320 16:02:00.158759 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:02:00 crc kubenswrapper[4779]: I0320 16:02:00.158856 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-92t9d" Mar 20 16:02:00 crc kubenswrapper[4779]: I0320 16:02:00.158930 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:02:00 crc kubenswrapper[4779]: I0320 16:02:00.163799 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567042-c2x9d"] Mar 20 16:02:00 crc kubenswrapper[4779]: I0320 16:02:00.265046 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pwvl\" (UniqueName: \"kubernetes.io/projected/36412fa2-e5a2-4a32-b747-696e33c68757-kube-api-access-4pwvl\") pod \"auto-csr-approver-29567042-c2x9d\" (UID: \"36412fa2-e5a2-4a32-b747-696e33c68757\") " pod="openshift-infra/auto-csr-approver-29567042-c2x9d" Mar 20 16:02:00 crc kubenswrapper[4779]: I0320 16:02:00.366747 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pwvl\" (UniqueName: \"kubernetes.io/projected/36412fa2-e5a2-4a32-b747-696e33c68757-kube-api-access-4pwvl\") pod \"auto-csr-approver-29567042-c2x9d\" (UID: \"36412fa2-e5a2-4a32-b747-696e33c68757\") " pod="openshift-infra/auto-csr-approver-29567042-c2x9d" Mar 20 16:02:00 crc kubenswrapper[4779]: I0320 16:02:00.385184 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pwvl\" (UniqueName: \"kubernetes.io/projected/36412fa2-e5a2-4a32-b747-696e33c68757-kube-api-access-4pwvl\") pod \"auto-csr-approver-29567042-c2x9d\" (UID: \"36412fa2-e5a2-4a32-b747-696e33c68757\") " 
pod="openshift-infra/auto-csr-approver-29567042-c2x9d" Mar 20 16:02:00 crc kubenswrapper[4779]: I0320 16:02:00.482885 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567042-c2x9d" Mar 20 16:02:00 crc kubenswrapper[4779]: I0320 16:02:00.906823 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567042-c2x9d"] Mar 20 16:02:01 crc kubenswrapper[4779]: I0320 16:02:01.589046 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567042-c2x9d" event={"ID":"36412fa2-e5a2-4a32-b747-696e33c68757","Type":"ContainerStarted","Data":"d8bf1c9eda3d5de381dd30da457d6c4b3f37ccab04b1f8a3cd46b93a09f14979"} Mar 20 16:02:02 crc kubenswrapper[4779]: I0320 16:02:02.355187 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gzrbj"] Mar 20 16:02:02 crc kubenswrapper[4779]: I0320 16:02:02.357596 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzrbj" Mar 20 16:02:02 crc kubenswrapper[4779]: I0320 16:02:02.369752 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzrbj"] Mar 20 16:02:02 crc kubenswrapper[4779]: I0320 16:02:02.404581 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f1bf311-0657-40c0-89d0-4c1e4051a3fc-catalog-content\") pod \"redhat-marketplace-gzrbj\" (UID: \"2f1bf311-0657-40c0-89d0-4c1e4051a3fc\") " pod="openshift-marketplace/redhat-marketplace-gzrbj" Mar 20 16:02:02 crc kubenswrapper[4779]: I0320 16:02:02.404693 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwmrh\" (UniqueName: \"kubernetes.io/projected/2f1bf311-0657-40c0-89d0-4c1e4051a3fc-kube-api-access-kwmrh\") pod \"redhat-marketplace-gzrbj\" (UID: \"2f1bf311-0657-40c0-89d0-4c1e4051a3fc\") " pod="openshift-marketplace/redhat-marketplace-gzrbj" Mar 20 16:02:02 crc kubenswrapper[4779]: I0320 16:02:02.404749 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f1bf311-0657-40c0-89d0-4c1e4051a3fc-utilities\") pod \"redhat-marketplace-gzrbj\" (UID: \"2f1bf311-0657-40c0-89d0-4c1e4051a3fc\") " pod="openshift-marketplace/redhat-marketplace-gzrbj" Mar 20 16:02:02 crc kubenswrapper[4779]: I0320 16:02:02.506740 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f1bf311-0657-40c0-89d0-4c1e4051a3fc-utilities\") pod \"redhat-marketplace-gzrbj\" (UID: \"2f1bf311-0657-40c0-89d0-4c1e4051a3fc\") " pod="openshift-marketplace/redhat-marketplace-gzrbj" Mar 20 16:02:02 crc kubenswrapper[4779]: I0320 16:02:02.506909 4779 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f1bf311-0657-40c0-89d0-4c1e4051a3fc-catalog-content\") pod \"redhat-marketplace-gzrbj\" (UID: \"2f1bf311-0657-40c0-89d0-4c1e4051a3fc\") " pod="openshift-marketplace/redhat-marketplace-gzrbj" Mar 20 16:02:02 crc kubenswrapper[4779]: I0320 16:02:02.507008 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwmrh\" (UniqueName: \"kubernetes.io/projected/2f1bf311-0657-40c0-89d0-4c1e4051a3fc-kube-api-access-kwmrh\") pod \"redhat-marketplace-gzrbj\" (UID: \"2f1bf311-0657-40c0-89d0-4c1e4051a3fc\") " pod="openshift-marketplace/redhat-marketplace-gzrbj" Mar 20 16:02:02 crc kubenswrapper[4779]: I0320 16:02:02.507788 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f1bf311-0657-40c0-89d0-4c1e4051a3fc-catalog-content\") pod \"redhat-marketplace-gzrbj\" (UID: \"2f1bf311-0657-40c0-89d0-4c1e4051a3fc\") " pod="openshift-marketplace/redhat-marketplace-gzrbj" Mar 20 16:02:02 crc kubenswrapper[4779]: I0320 16:02:02.507867 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f1bf311-0657-40c0-89d0-4c1e4051a3fc-utilities\") pod \"redhat-marketplace-gzrbj\" (UID: \"2f1bf311-0657-40c0-89d0-4c1e4051a3fc\") " pod="openshift-marketplace/redhat-marketplace-gzrbj" Mar 20 16:02:02 crc kubenswrapper[4779]: I0320 16:02:02.528819 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwmrh\" (UniqueName: \"kubernetes.io/projected/2f1bf311-0657-40c0-89d0-4c1e4051a3fc-kube-api-access-kwmrh\") pod \"redhat-marketplace-gzrbj\" (UID: \"2f1bf311-0657-40c0-89d0-4c1e4051a3fc\") " pod="openshift-marketplace/redhat-marketplace-gzrbj" Mar 20 16:02:02 crc kubenswrapper[4779]: I0320 16:02:02.600654 4779 generic.go:334] "Generic (PLEG): container finished" 
podID="36412fa2-e5a2-4a32-b747-696e33c68757" containerID="8cf774985f35be3b9eceeba2338fbc6f1689bf2677ff39b30da453ff9ccafdab" exitCode=0 Mar 20 16:02:02 crc kubenswrapper[4779]: I0320 16:02:02.600704 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567042-c2x9d" event={"ID":"36412fa2-e5a2-4a32-b747-696e33c68757","Type":"ContainerDied","Data":"8cf774985f35be3b9eceeba2338fbc6f1689bf2677ff39b30da453ff9ccafdab"} Mar 20 16:02:02 crc kubenswrapper[4779]: I0320 16:02:02.682966 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzrbj" Mar 20 16:02:03 crc kubenswrapper[4779]: I0320 16:02:03.801687 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzrbj"] Mar 20 16:02:04 crc kubenswrapper[4779]: I0320 16:02:04.011595 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567042-c2x9d" Mar 20 16:02:04 crc kubenswrapper[4779]: I0320 16:02:04.051001 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pwvl\" (UniqueName: \"kubernetes.io/projected/36412fa2-e5a2-4a32-b747-696e33c68757-kube-api-access-4pwvl\") pod \"36412fa2-e5a2-4a32-b747-696e33c68757\" (UID: \"36412fa2-e5a2-4a32-b747-696e33c68757\") " Mar 20 16:02:04 crc kubenswrapper[4779]: I0320 16:02:04.056872 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36412fa2-e5a2-4a32-b747-696e33c68757-kube-api-access-4pwvl" (OuterVolumeSpecName: "kube-api-access-4pwvl") pod "36412fa2-e5a2-4a32-b747-696e33c68757" (UID: "36412fa2-e5a2-4a32-b747-696e33c68757"). InnerVolumeSpecName "kube-api-access-4pwvl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:04 crc kubenswrapper[4779]: I0320 16:02:04.153481 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pwvl\" (UniqueName: \"kubernetes.io/projected/36412fa2-e5a2-4a32-b747-696e33c68757-kube-api-access-4pwvl\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:04 crc kubenswrapper[4779]: I0320 16:02:04.617312 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567042-c2x9d" event={"ID":"36412fa2-e5a2-4a32-b747-696e33c68757","Type":"ContainerDied","Data":"d8bf1c9eda3d5de381dd30da457d6c4b3f37ccab04b1f8a3cd46b93a09f14979"} Mar 20 16:02:04 crc kubenswrapper[4779]: I0320 16:02:04.617359 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8bf1c9eda3d5de381dd30da457d6c4b3f37ccab04b1f8a3cd46b93a09f14979" Mar 20 16:02:04 crc kubenswrapper[4779]: I0320 16:02:04.617366 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567042-c2x9d" Mar 20 16:02:04 crc kubenswrapper[4779]: I0320 16:02:04.618947 4779 generic.go:334] "Generic (PLEG): container finished" podID="2f1bf311-0657-40c0-89d0-4c1e4051a3fc" containerID="1ac164fa7bc83617e6864fc30d3a712d419bb8bbf08e66587c0e4b368500aeca" exitCode=0 Mar 20 16:02:04 crc kubenswrapper[4779]: I0320 16:02:04.618981 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzrbj" event={"ID":"2f1bf311-0657-40c0-89d0-4c1e4051a3fc","Type":"ContainerDied","Data":"1ac164fa7bc83617e6864fc30d3a712d419bb8bbf08e66587c0e4b368500aeca"} Mar 20 16:02:04 crc kubenswrapper[4779]: I0320 16:02:04.619002 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzrbj" event={"ID":"2f1bf311-0657-40c0-89d0-4c1e4051a3fc","Type":"ContainerStarted","Data":"9a70405de68e907eee61f5d6c5967f3e04fd75ad9b21d22fc35dff62f6f83596"} Mar 20 16:02:05 crc 
kubenswrapper[4779]: I0320 16:02:05.086228 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567036-m54tp"] Mar 20 16:02:05 crc kubenswrapper[4779]: I0320 16:02:05.098300 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567036-m54tp"] Mar 20 16:02:05 crc kubenswrapper[4779]: I0320 16:02:05.629144 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzrbj" event={"ID":"2f1bf311-0657-40c0-89d0-4c1e4051a3fc","Type":"ContainerStarted","Data":"627f225115e8376ce71117932dcb59bf22edc1509c7a27882d2faf6e24729365"} Mar 20 16:02:05 crc kubenswrapper[4779]: I0320 16:02:05.820450 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="855d7064-da55-4450-8e6d-aced2cc5f7b2" path="/var/lib/kubelet/pods/855d7064-da55-4450-8e6d-aced2cc5f7b2/volumes" Mar 20 16:02:06 crc kubenswrapper[4779]: I0320 16:02:06.696620 4779 generic.go:334] "Generic (PLEG): container finished" podID="2f1bf311-0657-40c0-89d0-4c1e4051a3fc" containerID="627f225115e8376ce71117932dcb59bf22edc1509c7a27882d2faf6e24729365" exitCode=0 Mar 20 16:02:06 crc kubenswrapper[4779]: I0320 16:02:06.700235 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzrbj" event={"ID":"2f1bf311-0657-40c0-89d0-4c1e4051a3fc","Type":"ContainerDied","Data":"627f225115e8376ce71117932dcb59bf22edc1509c7a27882d2faf6e24729365"} Mar 20 16:02:07 crc kubenswrapper[4779]: I0320 16:02:07.718176 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzrbj" event={"ID":"2f1bf311-0657-40c0-89d0-4c1e4051a3fc","Type":"ContainerStarted","Data":"81dbca41f00d19ead6ab45c248452d9d8c4f75ed63ef74f4752a32ed86dc3834"} Mar 20 16:02:07 crc kubenswrapper[4779]: I0320 16:02:07.736931 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gzrbj" 
podStartSLOduration=3.009768447 podStartE2EDuration="5.736913202s" podCreationTimestamp="2026-03-20 16:02:02 +0000 UTC" firstStartedPulling="2026-03-20 16:02:04.621840663 +0000 UTC m=+2341.584356463" lastFinishedPulling="2026-03-20 16:02:07.348985408 +0000 UTC m=+2344.311501218" observedRunningTime="2026-03-20 16:02:07.73356996 +0000 UTC m=+2344.696085770" watchObservedRunningTime="2026-03-20 16:02:07.736913202 +0000 UTC m=+2344.699429002" Mar 20 16:02:12 crc kubenswrapper[4779]: I0320 16:02:12.683439 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gzrbj" Mar 20 16:02:12 crc kubenswrapper[4779]: I0320 16:02:12.684100 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gzrbj" Mar 20 16:02:12 crc kubenswrapper[4779]: I0320 16:02:12.738416 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gzrbj" Mar 20 16:02:12 crc kubenswrapper[4779]: I0320 16:02:12.817225 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gzrbj" Mar 20 16:02:12 crc kubenswrapper[4779]: I0320 16:02:12.974488 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzrbj"] Mar 20 16:02:14 crc kubenswrapper[4779]: I0320 16:02:14.779012 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gzrbj" podUID="2f1bf311-0657-40c0-89d0-4c1e4051a3fc" containerName="registry-server" containerID="cri-o://81dbca41f00d19ead6ab45c248452d9d8c4f75ed63ef74f4752a32ed86dc3834" gracePeriod=2 Mar 20 16:02:15 crc kubenswrapper[4779]: I0320 16:02:15.270746 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzrbj" Mar 20 16:02:15 crc kubenswrapper[4779]: I0320 16:02:15.392763 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f1bf311-0657-40c0-89d0-4c1e4051a3fc-catalog-content\") pod \"2f1bf311-0657-40c0-89d0-4c1e4051a3fc\" (UID: \"2f1bf311-0657-40c0-89d0-4c1e4051a3fc\") " Mar 20 16:02:15 crc kubenswrapper[4779]: I0320 16:02:15.392828 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwmrh\" (UniqueName: \"kubernetes.io/projected/2f1bf311-0657-40c0-89d0-4c1e4051a3fc-kube-api-access-kwmrh\") pod \"2f1bf311-0657-40c0-89d0-4c1e4051a3fc\" (UID: \"2f1bf311-0657-40c0-89d0-4c1e4051a3fc\") " Mar 20 16:02:15 crc kubenswrapper[4779]: I0320 16:02:15.393055 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f1bf311-0657-40c0-89d0-4c1e4051a3fc-utilities\") pod \"2f1bf311-0657-40c0-89d0-4c1e4051a3fc\" (UID: \"2f1bf311-0657-40c0-89d0-4c1e4051a3fc\") " Mar 20 16:02:15 crc kubenswrapper[4779]: I0320 16:02:15.394300 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f1bf311-0657-40c0-89d0-4c1e4051a3fc-utilities" (OuterVolumeSpecName: "utilities") pod "2f1bf311-0657-40c0-89d0-4c1e4051a3fc" (UID: "2f1bf311-0657-40c0-89d0-4c1e4051a3fc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:02:15 crc kubenswrapper[4779]: I0320 16:02:15.401274 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f1bf311-0657-40c0-89d0-4c1e4051a3fc-kube-api-access-kwmrh" (OuterVolumeSpecName: "kube-api-access-kwmrh") pod "2f1bf311-0657-40c0-89d0-4c1e4051a3fc" (UID: "2f1bf311-0657-40c0-89d0-4c1e4051a3fc"). InnerVolumeSpecName "kube-api-access-kwmrh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:15 crc kubenswrapper[4779]: I0320 16:02:15.424596 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f1bf311-0657-40c0-89d0-4c1e4051a3fc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2f1bf311-0657-40c0-89d0-4c1e4051a3fc" (UID: "2f1bf311-0657-40c0-89d0-4c1e4051a3fc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:02:15 crc kubenswrapper[4779]: I0320 16:02:15.495472 4779 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f1bf311-0657-40c0-89d0-4c1e4051a3fc-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:15 crc kubenswrapper[4779]: I0320 16:02:15.495516 4779 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f1bf311-0657-40c0-89d0-4c1e4051a3fc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:15 crc kubenswrapper[4779]: I0320 16:02:15.495533 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwmrh\" (UniqueName: \"kubernetes.io/projected/2f1bf311-0657-40c0-89d0-4c1e4051a3fc-kube-api-access-kwmrh\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:15 crc kubenswrapper[4779]: I0320 16:02:15.791457 4779 generic.go:334] "Generic (PLEG): container finished" podID="2f1bf311-0657-40c0-89d0-4c1e4051a3fc" containerID="81dbca41f00d19ead6ab45c248452d9d8c4f75ed63ef74f4752a32ed86dc3834" exitCode=0 Mar 20 16:02:15 crc kubenswrapper[4779]: I0320 16:02:15.791502 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzrbj" event={"ID":"2f1bf311-0657-40c0-89d0-4c1e4051a3fc","Type":"ContainerDied","Data":"81dbca41f00d19ead6ab45c248452d9d8c4f75ed63ef74f4752a32ed86dc3834"} Mar 20 16:02:15 crc kubenswrapper[4779]: I0320 16:02:15.791529 4779 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-gzrbj" event={"ID":"2f1bf311-0657-40c0-89d0-4c1e4051a3fc","Type":"ContainerDied","Data":"9a70405de68e907eee61f5d6c5967f3e04fd75ad9b21d22fc35dff62f6f83596"} Mar 20 16:02:15 crc kubenswrapper[4779]: I0320 16:02:15.791548 4779 scope.go:117] "RemoveContainer" containerID="81dbca41f00d19ead6ab45c248452d9d8c4f75ed63ef74f4752a32ed86dc3834" Mar 20 16:02:15 crc kubenswrapper[4779]: I0320 16:02:15.791734 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzrbj" Mar 20 16:02:15 crc kubenswrapper[4779]: I0320 16:02:15.882335 4779 scope.go:117] "RemoveContainer" containerID="627f225115e8376ce71117932dcb59bf22edc1509c7a27882d2faf6e24729365" Mar 20 16:02:15 crc kubenswrapper[4779]: I0320 16:02:15.997286 4779 scope.go:117] "RemoveContainer" containerID="1ac164fa7bc83617e6864fc30d3a712d419bb8bbf08e66587c0e4b368500aeca" Mar 20 16:02:16 crc kubenswrapper[4779]: I0320 16:02:16.010296 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzrbj"] Mar 20 16:02:16 crc kubenswrapper[4779]: I0320 16:02:16.016329 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzrbj"] Mar 20 16:02:16 crc kubenswrapper[4779]: I0320 16:02:16.046760 4779 scope.go:117] "RemoveContainer" containerID="81dbca41f00d19ead6ab45c248452d9d8c4f75ed63ef74f4752a32ed86dc3834" Mar 20 16:02:16 crc kubenswrapper[4779]: E0320 16:02:16.052026 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81dbca41f00d19ead6ab45c248452d9d8c4f75ed63ef74f4752a32ed86dc3834\": container with ID starting with 81dbca41f00d19ead6ab45c248452d9d8c4f75ed63ef74f4752a32ed86dc3834 not found: ID does not exist" containerID="81dbca41f00d19ead6ab45c248452d9d8c4f75ed63ef74f4752a32ed86dc3834" Mar 20 16:02:16 crc kubenswrapper[4779]: I0320 16:02:16.052396 4779 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81dbca41f00d19ead6ab45c248452d9d8c4f75ed63ef74f4752a32ed86dc3834"} err="failed to get container status \"81dbca41f00d19ead6ab45c248452d9d8c4f75ed63ef74f4752a32ed86dc3834\": rpc error: code = NotFound desc = could not find container \"81dbca41f00d19ead6ab45c248452d9d8c4f75ed63ef74f4752a32ed86dc3834\": container with ID starting with 81dbca41f00d19ead6ab45c248452d9d8c4f75ed63ef74f4752a32ed86dc3834 not found: ID does not exist" Mar 20 16:02:16 crc kubenswrapper[4779]: I0320 16:02:16.052494 4779 scope.go:117] "RemoveContainer" containerID="627f225115e8376ce71117932dcb59bf22edc1509c7a27882d2faf6e24729365" Mar 20 16:02:16 crc kubenswrapper[4779]: E0320 16:02:16.053037 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"627f225115e8376ce71117932dcb59bf22edc1509c7a27882d2faf6e24729365\": container with ID starting with 627f225115e8376ce71117932dcb59bf22edc1509c7a27882d2faf6e24729365 not found: ID does not exist" containerID="627f225115e8376ce71117932dcb59bf22edc1509c7a27882d2faf6e24729365" Mar 20 16:02:16 crc kubenswrapper[4779]: I0320 16:02:16.053084 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"627f225115e8376ce71117932dcb59bf22edc1509c7a27882d2faf6e24729365"} err="failed to get container status \"627f225115e8376ce71117932dcb59bf22edc1509c7a27882d2faf6e24729365\": rpc error: code = NotFound desc = could not find container \"627f225115e8376ce71117932dcb59bf22edc1509c7a27882d2faf6e24729365\": container with ID starting with 627f225115e8376ce71117932dcb59bf22edc1509c7a27882d2faf6e24729365 not found: ID does not exist" Mar 20 16:02:16 crc kubenswrapper[4779]: I0320 16:02:16.053134 4779 scope.go:117] "RemoveContainer" containerID="1ac164fa7bc83617e6864fc30d3a712d419bb8bbf08e66587c0e4b368500aeca" Mar 20 16:02:16 crc kubenswrapper[4779]: E0320 
16:02:16.053568 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ac164fa7bc83617e6864fc30d3a712d419bb8bbf08e66587c0e4b368500aeca\": container with ID starting with 1ac164fa7bc83617e6864fc30d3a712d419bb8bbf08e66587c0e4b368500aeca not found: ID does not exist" containerID="1ac164fa7bc83617e6864fc30d3a712d419bb8bbf08e66587c0e4b368500aeca" Mar 20 16:02:16 crc kubenswrapper[4779]: I0320 16:02:16.053651 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ac164fa7bc83617e6864fc30d3a712d419bb8bbf08e66587c0e4b368500aeca"} err="failed to get container status \"1ac164fa7bc83617e6864fc30d3a712d419bb8bbf08e66587c0e4b368500aeca\": rpc error: code = NotFound desc = could not find container \"1ac164fa7bc83617e6864fc30d3a712d419bb8bbf08e66587c0e4b368500aeca\": container with ID starting with 1ac164fa7bc83617e6864fc30d3a712d419bb8bbf08e66587c0e4b368500aeca not found: ID does not exist" Mar 20 16:02:17 crc kubenswrapper[4779]: I0320 16:02:17.820298 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f1bf311-0657-40c0-89d0-4c1e4051a3fc" path="/var/lib/kubelet/pods/2f1bf311-0657-40c0-89d0-4c1e4051a3fc/volumes" Mar 20 16:02:25 crc kubenswrapper[4779]: I0320 16:02:25.149761 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:02:25 crc kubenswrapper[4779]: I0320 16:02:25.150781 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 20 16:02:25 crc kubenswrapper[4779]: I0320 16:02:25.150864 4779 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" Mar 20 16:02:25 crc kubenswrapper[4779]: I0320 16:02:25.152558 4779 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"df4a15a4ae213e72d8c7a655c4b8439341ebe84532eaccd1945e22abb1a25311"} pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 16:02:25 crc kubenswrapper[4779]: I0320 16:02:25.152666 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" containerID="cri-o://df4a15a4ae213e72d8c7a655c4b8439341ebe84532eaccd1945e22abb1a25311" gracePeriod=600 Mar 20 16:02:25 crc kubenswrapper[4779]: E0320 16:02:25.276201 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:02:25 crc kubenswrapper[4779]: I0320 16:02:25.905535 4779 generic.go:334] "Generic (PLEG): container finished" podID="451fc579-db57-4b36-a775-6d2986de3efc" containerID="df4a15a4ae213e72d8c7a655c4b8439341ebe84532eaccd1945e22abb1a25311" exitCode=0 Mar 20 16:02:25 crc kubenswrapper[4779]: I0320 16:02:25.905582 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" 
event={"ID":"451fc579-db57-4b36-a775-6d2986de3efc","Type":"ContainerDied","Data":"df4a15a4ae213e72d8c7a655c4b8439341ebe84532eaccd1945e22abb1a25311"} Mar 20 16:02:25 crc kubenswrapper[4779]: I0320 16:02:25.905619 4779 scope.go:117] "RemoveContainer" containerID="56c8e1388500b86731bed3431c69ab3383dde78a7cf6321fabe80d4cca0e9447" Mar 20 16:02:25 crc kubenswrapper[4779]: I0320 16:02:25.906257 4779 scope.go:117] "RemoveContainer" containerID="df4a15a4ae213e72d8c7a655c4b8439341ebe84532eaccd1945e22abb1a25311" Mar 20 16:02:25 crc kubenswrapper[4779]: E0320 16:02:25.906494 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:02:29 crc kubenswrapper[4779]: I0320 16:02:29.973192 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bgfhz"] Mar 20 16:02:29 crc kubenswrapper[4779]: E0320 16:02:29.974204 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f1bf311-0657-40c0-89d0-4c1e4051a3fc" containerName="extract-content" Mar 20 16:02:29 crc kubenswrapper[4779]: I0320 16:02:29.974216 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f1bf311-0657-40c0-89d0-4c1e4051a3fc" containerName="extract-content" Mar 20 16:02:29 crc kubenswrapper[4779]: E0320 16:02:29.974228 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36412fa2-e5a2-4a32-b747-696e33c68757" containerName="oc" Mar 20 16:02:29 crc kubenswrapper[4779]: I0320 16:02:29.974234 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="36412fa2-e5a2-4a32-b747-696e33c68757" containerName="oc" Mar 20 16:02:29 crc kubenswrapper[4779]: E0320 
16:02:29.974252 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f1bf311-0657-40c0-89d0-4c1e4051a3fc" containerName="extract-utilities" Mar 20 16:02:29 crc kubenswrapper[4779]: I0320 16:02:29.974257 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f1bf311-0657-40c0-89d0-4c1e4051a3fc" containerName="extract-utilities" Mar 20 16:02:29 crc kubenswrapper[4779]: E0320 16:02:29.974285 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f1bf311-0657-40c0-89d0-4c1e4051a3fc" containerName="registry-server" Mar 20 16:02:29 crc kubenswrapper[4779]: I0320 16:02:29.974290 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f1bf311-0657-40c0-89d0-4c1e4051a3fc" containerName="registry-server" Mar 20 16:02:29 crc kubenswrapper[4779]: I0320 16:02:29.974468 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="36412fa2-e5a2-4a32-b747-696e33c68757" containerName="oc" Mar 20 16:02:29 crc kubenswrapper[4779]: I0320 16:02:29.974480 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f1bf311-0657-40c0-89d0-4c1e4051a3fc" containerName="registry-server" Mar 20 16:02:29 crc kubenswrapper[4779]: I0320 16:02:29.975864 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bgfhz" Mar 20 16:02:30 crc kubenswrapper[4779]: I0320 16:02:30.001725 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bgfhz"] Mar 20 16:02:30 crc kubenswrapper[4779]: I0320 16:02:30.096241 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b29f2771-69bb-4738-9f44-c2f23e9271bf-catalog-content\") pod \"certified-operators-bgfhz\" (UID: \"b29f2771-69bb-4738-9f44-c2f23e9271bf\") " pod="openshift-marketplace/certified-operators-bgfhz" Mar 20 16:02:30 crc kubenswrapper[4779]: I0320 16:02:30.096300 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b29f2771-69bb-4738-9f44-c2f23e9271bf-utilities\") pod \"certified-operators-bgfhz\" (UID: \"b29f2771-69bb-4738-9f44-c2f23e9271bf\") " pod="openshift-marketplace/certified-operators-bgfhz" Mar 20 16:02:30 crc kubenswrapper[4779]: I0320 16:02:30.096409 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74bbj\" (UniqueName: \"kubernetes.io/projected/b29f2771-69bb-4738-9f44-c2f23e9271bf-kube-api-access-74bbj\") pod \"certified-operators-bgfhz\" (UID: \"b29f2771-69bb-4738-9f44-c2f23e9271bf\") " pod="openshift-marketplace/certified-operators-bgfhz" Mar 20 16:02:30 crc kubenswrapper[4779]: I0320 16:02:30.198558 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b29f2771-69bb-4738-9f44-c2f23e9271bf-catalog-content\") pod \"certified-operators-bgfhz\" (UID: \"b29f2771-69bb-4738-9f44-c2f23e9271bf\") " pod="openshift-marketplace/certified-operators-bgfhz" Mar 20 16:02:30 crc kubenswrapper[4779]: I0320 16:02:30.198620 4779 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b29f2771-69bb-4738-9f44-c2f23e9271bf-utilities\") pod \"certified-operators-bgfhz\" (UID: \"b29f2771-69bb-4738-9f44-c2f23e9271bf\") " pod="openshift-marketplace/certified-operators-bgfhz" Mar 20 16:02:30 crc kubenswrapper[4779]: I0320 16:02:30.198728 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74bbj\" (UniqueName: \"kubernetes.io/projected/b29f2771-69bb-4738-9f44-c2f23e9271bf-kube-api-access-74bbj\") pod \"certified-operators-bgfhz\" (UID: \"b29f2771-69bb-4738-9f44-c2f23e9271bf\") " pod="openshift-marketplace/certified-operators-bgfhz" Mar 20 16:02:30 crc kubenswrapper[4779]: I0320 16:02:30.199318 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b29f2771-69bb-4738-9f44-c2f23e9271bf-catalog-content\") pod \"certified-operators-bgfhz\" (UID: \"b29f2771-69bb-4738-9f44-c2f23e9271bf\") " pod="openshift-marketplace/certified-operators-bgfhz" Mar 20 16:02:30 crc kubenswrapper[4779]: I0320 16:02:30.199359 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b29f2771-69bb-4738-9f44-c2f23e9271bf-utilities\") pod \"certified-operators-bgfhz\" (UID: \"b29f2771-69bb-4738-9f44-c2f23e9271bf\") " pod="openshift-marketplace/certified-operators-bgfhz" Mar 20 16:02:30 crc kubenswrapper[4779]: I0320 16:02:30.218387 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74bbj\" (UniqueName: \"kubernetes.io/projected/b29f2771-69bb-4738-9f44-c2f23e9271bf-kube-api-access-74bbj\") pod \"certified-operators-bgfhz\" (UID: \"b29f2771-69bb-4738-9f44-c2f23e9271bf\") " pod="openshift-marketplace/certified-operators-bgfhz" Mar 20 16:02:30 crc kubenswrapper[4779]: I0320 16:02:30.342858 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bgfhz" Mar 20 16:02:30 crc kubenswrapper[4779]: I0320 16:02:30.868188 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bgfhz"] Mar 20 16:02:30 crc kubenswrapper[4779]: I0320 16:02:30.949152 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bgfhz" event={"ID":"b29f2771-69bb-4738-9f44-c2f23e9271bf","Type":"ContainerStarted","Data":"65a3627cff3ea6f92803c7282d2820318a7e5e50c6dca922f091d9edebd77067"} Mar 20 16:02:31 crc kubenswrapper[4779]: I0320 16:02:31.957418 4779 generic.go:334] "Generic (PLEG): container finished" podID="b29f2771-69bb-4738-9f44-c2f23e9271bf" containerID="542e7e20e4e719246f4b19d334c6625238e313cf097d5381695f2375f533f7a6" exitCode=0 Mar 20 16:02:31 crc kubenswrapper[4779]: I0320 16:02:31.957529 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bgfhz" event={"ID":"b29f2771-69bb-4738-9f44-c2f23e9271bf","Type":"ContainerDied","Data":"542e7e20e4e719246f4b19d334c6625238e313cf097d5381695f2375f533f7a6"} Mar 20 16:02:32 crc kubenswrapper[4779]: I0320 16:02:32.984763 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bgfhz" event={"ID":"b29f2771-69bb-4738-9f44-c2f23e9271bf","Type":"ContainerStarted","Data":"09b6cf7b0ac7fdf0c5e474ddf41edf01b73079c89134e27c69e44f79d76d6758"} Mar 20 16:02:34 crc kubenswrapper[4779]: I0320 16:02:34.000576 4779 generic.go:334] "Generic (PLEG): container finished" podID="b29f2771-69bb-4738-9f44-c2f23e9271bf" containerID="09b6cf7b0ac7fdf0c5e474ddf41edf01b73079c89134e27c69e44f79d76d6758" exitCode=0 Mar 20 16:02:34 crc kubenswrapper[4779]: I0320 16:02:34.000626 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bgfhz" 
event={"ID":"b29f2771-69bb-4738-9f44-c2f23e9271bf","Type":"ContainerDied","Data":"09b6cf7b0ac7fdf0c5e474ddf41edf01b73079c89134e27c69e44f79d76d6758"} Mar 20 16:02:36 crc kubenswrapper[4779]: I0320 16:02:36.018161 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bgfhz" event={"ID":"b29f2771-69bb-4738-9f44-c2f23e9271bf","Type":"ContainerStarted","Data":"148e89ee02317201bd3510e7ccfdaa659d586b8c692450c640071b4659efebac"} Mar 20 16:02:36 crc kubenswrapper[4779]: I0320 16:02:36.045746 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bgfhz" podStartSLOduration=4.577494382 podStartE2EDuration="7.04572825s" podCreationTimestamp="2026-03-20 16:02:29 +0000 UTC" firstStartedPulling="2026-03-20 16:02:31.960463287 +0000 UTC m=+2368.922979087" lastFinishedPulling="2026-03-20 16:02:34.428697155 +0000 UTC m=+2371.391212955" observedRunningTime="2026-03-20 16:02:36.038183223 +0000 UTC m=+2373.000699033" watchObservedRunningTime="2026-03-20 16:02:36.04572825 +0000 UTC m=+2373.008244050" Mar 20 16:02:37 crc kubenswrapper[4779]: I0320 16:02:37.809132 4779 scope.go:117] "RemoveContainer" containerID="df4a15a4ae213e72d8c7a655c4b8439341ebe84532eaccd1945e22abb1a25311" Mar 20 16:02:37 crc kubenswrapper[4779]: E0320 16:02:37.809725 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:02:40 crc kubenswrapper[4779]: I0320 16:02:40.343314 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bgfhz" Mar 20 16:02:40 crc 
kubenswrapper[4779]: I0320 16:02:40.343808 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bgfhz" Mar 20 16:02:40 crc kubenswrapper[4779]: I0320 16:02:40.388384 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bgfhz" Mar 20 16:02:41 crc kubenswrapper[4779]: I0320 16:02:41.095427 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bgfhz" Mar 20 16:02:41 crc kubenswrapper[4779]: I0320 16:02:41.140609 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bgfhz"] Mar 20 16:02:43 crc kubenswrapper[4779]: I0320 16:02:43.069520 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bgfhz" podUID="b29f2771-69bb-4738-9f44-c2f23e9271bf" containerName="registry-server" containerID="cri-o://148e89ee02317201bd3510e7ccfdaa659d586b8c692450c640071b4659efebac" gracePeriod=2 Mar 20 16:02:43 crc kubenswrapper[4779]: I0320 16:02:43.539706 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bgfhz" Mar 20 16:02:43 crc kubenswrapper[4779]: I0320 16:02:43.561658 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b29f2771-69bb-4738-9f44-c2f23e9271bf-utilities\") pod \"b29f2771-69bb-4738-9f44-c2f23e9271bf\" (UID: \"b29f2771-69bb-4738-9f44-c2f23e9271bf\") " Mar 20 16:02:43 crc kubenswrapper[4779]: I0320 16:02:43.561846 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74bbj\" (UniqueName: \"kubernetes.io/projected/b29f2771-69bb-4738-9f44-c2f23e9271bf-kube-api-access-74bbj\") pod \"b29f2771-69bb-4738-9f44-c2f23e9271bf\" (UID: \"b29f2771-69bb-4738-9f44-c2f23e9271bf\") " Mar 20 16:02:43 crc kubenswrapper[4779]: I0320 16:02:43.562000 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b29f2771-69bb-4738-9f44-c2f23e9271bf-catalog-content\") pod \"b29f2771-69bb-4738-9f44-c2f23e9271bf\" (UID: \"b29f2771-69bb-4738-9f44-c2f23e9271bf\") " Mar 20 16:02:43 crc kubenswrapper[4779]: I0320 16:02:43.564850 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b29f2771-69bb-4738-9f44-c2f23e9271bf-utilities" (OuterVolumeSpecName: "utilities") pod "b29f2771-69bb-4738-9f44-c2f23e9271bf" (UID: "b29f2771-69bb-4738-9f44-c2f23e9271bf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:02:43 crc kubenswrapper[4779]: I0320 16:02:43.569068 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b29f2771-69bb-4738-9f44-c2f23e9271bf-kube-api-access-74bbj" (OuterVolumeSpecName: "kube-api-access-74bbj") pod "b29f2771-69bb-4738-9f44-c2f23e9271bf" (UID: "b29f2771-69bb-4738-9f44-c2f23e9271bf"). InnerVolumeSpecName "kube-api-access-74bbj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:43 crc kubenswrapper[4779]: I0320 16:02:43.664462 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b29f2771-69bb-4738-9f44-c2f23e9271bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b29f2771-69bb-4738-9f44-c2f23e9271bf" (UID: "b29f2771-69bb-4738-9f44-c2f23e9271bf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:02:43 crc kubenswrapper[4779]: I0320 16:02:43.664555 4779 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b29f2771-69bb-4738-9f44-c2f23e9271bf-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:43 crc kubenswrapper[4779]: I0320 16:02:43.664584 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74bbj\" (UniqueName: \"kubernetes.io/projected/b29f2771-69bb-4738-9f44-c2f23e9271bf-kube-api-access-74bbj\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:43 crc kubenswrapper[4779]: I0320 16:02:43.681578 4779 scope.go:117] "RemoveContainer" containerID="0120c6bb676edc98ec75f49a23ea355ef6a971b7bb06d5c0e9e8a79cf92f232e" Mar 20 16:02:43 crc kubenswrapper[4779]: I0320 16:02:43.766179 4779 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b29f2771-69bb-4738-9f44-c2f23e9271bf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:44 crc kubenswrapper[4779]: I0320 16:02:44.090887 4779 generic.go:334] "Generic (PLEG): container finished" podID="b29f2771-69bb-4738-9f44-c2f23e9271bf" containerID="148e89ee02317201bd3510e7ccfdaa659d586b8c692450c640071b4659efebac" exitCode=0 Mar 20 16:02:44 crc kubenswrapper[4779]: I0320 16:02:44.090936 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bgfhz" 
event={"ID":"b29f2771-69bb-4738-9f44-c2f23e9271bf","Type":"ContainerDied","Data":"148e89ee02317201bd3510e7ccfdaa659d586b8c692450c640071b4659efebac"} Mar 20 16:02:44 crc kubenswrapper[4779]: I0320 16:02:44.090970 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bgfhz" event={"ID":"b29f2771-69bb-4738-9f44-c2f23e9271bf","Type":"ContainerDied","Data":"65a3627cff3ea6f92803c7282d2820318a7e5e50c6dca922f091d9edebd77067"} Mar 20 16:02:44 crc kubenswrapper[4779]: I0320 16:02:44.090990 4779 scope.go:117] "RemoveContainer" containerID="148e89ee02317201bd3510e7ccfdaa659d586b8c692450c640071b4659efebac" Mar 20 16:02:44 crc kubenswrapper[4779]: I0320 16:02:44.090983 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bgfhz" Mar 20 16:02:44 crc kubenswrapper[4779]: I0320 16:02:44.116317 4779 scope.go:117] "RemoveContainer" containerID="09b6cf7b0ac7fdf0c5e474ddf41edf01b73079c89134e27c69e44f79d76d6758" Mar 20 16:02:44 crc kubenswrapper[4779]: I0320 16:02:44.126175 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bgfhz"] Mar 20 16:02:44 crc kubenswrapper[4779]: I0320 16:02:44.136575 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bgfhz"] Mar 20 16:02:44 crc kubenswrapper[4779]: I0320 16:02:44.138532 4779 scope.go:117] "RemoveContainer" containerID="542e7e20e4e719246f4b19d334c6625238e313cf097d5381695f2375f533f7a6" Mar 20 16:02:44 crc kubenswrapper[4779]: I0320 16:02:44.159718 4779 scope.go:117] "RemoveContainer" containerID="148e89ee02317201bd3510e7ccfdaa659d586b8c692450c640071b4659efebac" Mar 20 16:02:44 crc kubenswrapper[4779]: E0320 16:02:44.160737 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"148e89ee02317201bd3510e7ccfdaa659d586b8c692450c640071b4659efebac\": container 
with ID starting with 148e89ee02317201bd3510e7ccfdaa659d586b8c692450c640071b4659efebac not found: ID does not exist" containerID="148e89ee02317201bd3510e7ccfdaa659d586b8c692450c640071b4659efebac" Mar 20 16:02:44 crc kubenswrapper[4779]: I0320 16:02:44.160768 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"148e89ee02317201bd3510e7ccfdaa659d586b8c692450c640071b4659efebac"} err="failed to get container status \"148e89ee02317201bd3510e7ccfdaa659d586b8c692450c640071b4659efebac\": rpc error: code = NotFound desc = could not find container \"148e89ee02317201bd3510e7ccfdaa659d586b8c692450c640071b4659efebac\": container with ID starting with 148e89ee02317201bd3510e7ccfdaa659d586b8c692450c640071b4659efebac not found: ID does not exist" Mar 20 16:02:44 crc kubenswrapper[4779]: I0320 16:02:44.160791 4779 scope.go:117] "RemoveContainer" containerID="09b6cf7b0ac7fdf0c5e474ddf41edf01b73079c89134e27c69e44f79d76d6758" Mar 20 16:02:44 crc kubenswrapper[4779]: E0320 16:02:44.160988 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09b6cf7b0ac7fdf0c5e474ddf41edf01b73079c89134e27c69e44f79d76d6758\": container with ID starting with 09b6cf7b0ac7fdf0c5e474ddf41edf01b73079c89134e27c69e44f79d76d6758 not found: ID does not exist" containerID="09b6cf7b0ac7fdf0c5e474ddf41edf01b73079c89134e27c69e44f79d76d6758" Mar 20 16:02:44 crc kubenswrapper[4779]: I0320 16:02:44.161007 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09b6cf7b0ac7fdf0c5e474ddf41edf01b73079c89134e27c69e44f79d76d6758"} err="failed to get container status \"09b6cf7b0ac7fdf0c5e474ddf41edf01b73079c89134e27c69e44f79d76d6758\": rpc error: code = NotFound desc = could not find container \"09b6cf7b0ac7fdf0c5e474ddf41edf01b73079c89134e27c69e44f79d76d6758\": container with ID starting with 09b6cf7b0ac7fdf0c5e474ddf41edf01b73079c89134e27c69e44f79d76d6758 not 
found: ID does not exist" Mar 20 16:02:44 crc kubenswrapper[4779]: I0320 16:02:44.161018 4779 scope.go:117] "RemoveContainer" containerID="542e7e20e4e719246f4b19d334c6625238e313cf097d5381695f2375f533f7a6" Mar 20 16:02:44 crc kubenswrapper[4779]: E0320 16:02:44.161228 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"542e7e20e4e719246f4b19d334c6625238e313cf097d5381695f2375f533f7a6\": container with ID starting with 542e7e20e4e719246f4b19d334c6625238e313cf097d5381695f2375f533f7a6 not found: ID does not exist" containerID="542e7e20e4e719246f4b19d334c6625238e313cf097d5381695f2375f533f7a6" Mar 20 16:02:44 crc kubenswrapper[4779]: I0320 16:02:44.161259 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"542e7e20e4e719246f4b19d334c6625238e313cf097d5381695f2375f533f7a6"} err="failed to get container status \"542e7e20e4e719246f4b19d334c6625238e313cf097d5381695f2375f533f7a6\": rpc error: code = NotFound desc = could not find container \"542e7e20e4e719246f4b19d334c6625238e313cf097d5381695f2375f533f7a6\": container with ID starting with 542e7e20e4e719246f4b19d334c6625238e313cf097d5381695f2375f533f7a6 not found: ID does not exist" Mar 20 16:02:45 crc kubenswrapper[4779]: I0320 16:02:45.820216 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b29f2771-69bb-4738-9f44-c2f23e9271bf" path="/var/lib/kubelet/pods/b29f2771-69bb-4738-9f44-c2f23e9271bf/volumes" Mar 20 16:02:48 crc kubenswrapper[4779]: I0320 16:02:48.809356 4779 scope.go:117] "RemoveContainer" containerID="df4a15a4ae213e72d8c7a655c4b8439341ebe84532eaccd1945e22abb1a25311" Mar 20 16:02:48 crc kubenswrapper[4779]: E0320 16:02:48.810277 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:03:03 crc kubenswrapper[4779]: I0320 16:03:03.814349 4779 scope.go:117] "RemoveContainer" containerID="df4a15a4ae213e72d8c7a655c4b8439341ebe84532eaccd1945e22abb1a25311" Mar 20 16:03:03 crc kubenswrapper[4779]: E0320 16:03:03.815317 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:03:17 crc kubenswrapper[4779]: I0320 16:03:17.809044 4779 scope.go:117] "RemoveContainer" containerID="df4a15a4ae213e72d8c7a655c4b8439341ebe84532eaccd1945e22abb1a25311" Mar 20 16:03:17 crc kubenswrapper[4779]: E0320 16:03:17.809893 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:03:29 crc kubenswrapper[4779]: I0320 16:03:29.809382 4779 scope.go:117] "RemoveContainer" containerID="df4a15a4ae213e72d8c7a655c4b8439341ebe84532eaccd1945e22abb1a25311" Mar 20 16:03:29 crc kubenswrapper[4779]: E0320 16:03:29.810192 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:03:44 crc kubenswrapper[4779]: I0320 16:03:44.809655 4779 scope.go:117] "RemoveContainer" containerID="df4a15a4ae213e72d8c7a655c4b8439341ebe84532eaccd1945e22abb1a25311" Mar 20 16:03:44 crc kubenswrapper[4779]: E0320 16:03:44.810575 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:03:56 crc kubenswrapper[4779]: I0320 16:03:56.809618 4779 scope.go:117] "RemoveContainer" containerID="df4a15a4ae213e72d8c7a655c4b8439341ebe84532eaccd1945e22abb1a25311" Mar 20 16:03:56 crc kubenswrapper[4779]: E0320 16:03:56.810451 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:04:00 crc kubenswrapper[4779]: I0320 16:04:00.145603 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567044-nvf8z"] Mar 20 16:04:00 crc kubenswrapper[4779]: E0320 16:04:00.148056 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b29f2771-69bb-4738-9f44-c2f23e9271bf" containerName="extract-content" Mar 20 16:04:00 crc 
kubenswrapper[4779]: I0320 16:04:00.148157 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="b29f2771-69bb-4738-9f44-c2f23e9271bf" containerName="extract-content" Mar 20 16:04:00 crc kubenswrapper[4779]: E0320 16:04:00.148235 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b29f2771-69bb-4738-9f44-c2f23e9271bf" containerName="extract-utilities" Mar 20 16:04:00 crc kubenswrapper[4779]: I0320 16:04:00.148285 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="b29f2771-69bb-4738-9f44-c2f23e9271bf" containerName="extract-utilities" Mar 20 16:04:00 crc kubenswrapper[4779]: E0320 16:04:00.148348 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b29f2771-69bb-4738-9f44-c2f23e9271bf" containerName="registry-server" Mar 20 16:04:00 crc kubenswrapper[4779]: I0320 16:04:00.148396 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="b29f2771-69bb-4738-9f44-c2f23e9271bf" containerName="registry-server" Mar 20 16:04:00 crc kubenswrapper[4779]: I0320 16:04:00.148646 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="b29f2771-69bb-4738-9f44-c2f23e9271bf" containerName="registry-server" Mar 20 16:04:00 crc kubenswrapper[4779]: I0320 16:04:00.149390 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567044-nvf8z" Mar 20 16:04:00 crc kubenswrapper[4779]: I0320 16:04:00.151852 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:04:00 crc kubenswrapper[4779]: I0320 16:04:00.152061 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:04:00 crc kubenswrapper[4779]: I0320 16:04:00.152349 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-92t9d" Mar 20 16:04:00 crc kubenswrapper[4779]: I0320 16:04:00.163607 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567044-nvf8z"] Mar 20 16:04:00 crc kubenswrapper[4779]: I0320 16:04:00.219608 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgvxn\" (UniqueName: \"kubernetes.io/projected/93564d6a-746c-44d4-b947-1294540a732c-kube-api-access-sgvxn\") pod \"auto-csr-approver-29567044-nvf8z\" (UID: \"93564d6a-746c-44d4-b947-1294540a732c\") " pod="openshift-infra/auto-csr-approver-29567044-nvf8z" Mar 20 16:04:00 crc kubenswrapper[4779]: I0320 16:04:00.322894 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgvxn\" (UniqueName: \"kubernetes.io/projected/93564d6a-746c-44d4-b947-1294540a732c-kube-api-access-sgvxn\") pod \"auto-csr-approver-29567044-nvf8z\" (UID: \"93564d6a-746c-44d4-b947-1294540a732c\") " pod="openshift-infra/auto-csr-approver-29567044-nvf8z" Mar 20 16:04:00 crc kubenswrapper[4779]: I0320 16:04:00.351440 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgvxn\" (UniqueName: \"kubernetes.io/projected/93564d6a-746c-44d4-b947-1294540a732c-kube-api-access-sgvxn\") pod \"auto-csr-approver-29567044-nvf8z\" (UID: \"93564d6a-746c-44d4-b947-1294540a732c\") " 
pod="openshift-infra/auto-csr-approver-29567044-nvf8z" Mar 20 16:04:00 crc kubenswrapper[4779]: I0320 16:04:00.473961 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567044-nvf8z" Mar 20 16:04:01 crc kubenswrapper[4779]: I0320 16:04:00.999739 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567044-nvf8z"] Mar 20 16:04:01 crc kubenswrapper[4779]: I0320 16:04:01.002082 4779 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 16:04:01 crc kubenswrapper[4779]: I0320 16:04:01.834912 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567044-nvf8z" event={"ID":"93564d6a-746c-44d4-b947-1294540a732c","Type":"ContainerStarted","Data":"5fcf32a8452e4912dff39aacea42ecbdc04a4926cd1ebdfbccaf2c3aa7eef31f"} Mar 20 16:04:02 crc kubenswrapper[4779]: I0320 16:04:02.846838 4779 generic.go:334] "Generic (PLEG): container finished" podID="93564d6a-746c-44d4-b947-1294540a732c" containerID="003f8a90c7e133e93d79e6ec523e2795616db6f4b954dc2dcc75388bbca63728" exitCode=0 Mar 20 16:04:02 crc kubenswrapper[4779]: I0320 16:04:02.846967 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567044-nvf8z" event={"ID":"93564d6a-746c-44d4-b947-1294540a732c","Type":"ContainerDied","Data":"003f8a90c7e133e93d79e6ec523e2795616db6f4b954dc2dcc75388bbca63728"} Mar 20 16:04:04 crc kubenswrapper[4779]: I0320 16:04:04.227188 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567044-nvf8z" Mar 20 16:04:04 crc kubenswrapper[4779]: I0320 16:04:04.299998 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgvxn\" (UniqueName: \"kubernetes.io/projected/93564d6a-746c-44d4-b947-1294540a732c-kube-api-access-sgvxn\") pod \"93564d6a-746c-44d4-b947-1294540a732c\" (UID: \"93564d6a-746c-44d4-b947-1294540a732c\") " Mar 20 16:04:04 crc kubenswrapper[4779]: I0320 16:04:04.308415 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93564d6a-746c-44d4-b947-1294540a732c-kube-api-access-sgvxn" (OuterVolumeSpecName: "kube-api-access-sgvxn") pod "93564d6a-746c-44d4-b947-1294540a732c" (UID: "93564d6a-746c-44d4-b947-1294540a732c"). InnerVolumeSpecName "kube-api-access-sgvxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:04:04 crc kubenswrapper[4779]: I0320 16:04:04.402338 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgvxn\" (UniqueName: \"kubernetes.io/projected/93564d6a-746c-44d4-b947-1294540a732c-kube-api-access-sgvxn\") on node \"crc\" DevicePath \"\"" Mar 20 16:04:04 crc kubenswrapper[4779]: I0320 16:04:04.863795 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567044-nvf8z" event={"ID":"93564d6a-746c-44d4-b947-1294540a732c","Type":"ContainerDied","Data":"5fcf32a8452e4912dff39aacea42ecbdc04a4926cd1ebdfbccaf2c3aa7eef31f"} Mar 20 16:04:04 crc kubenswrapper[4779]: I0320 16:04:04.864271 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fcf32a8452e4912dff39aacea42ecbdc04a4926cd1ebdfbccaf2c3aa7eef31f" Mar 20 16:04:04 crc kubenswrapper[4779]: I0320 16:04:04.863823 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567044-nvf8z" Mar 20 16:04:05 crc kubenswrapper[4779]: I0320 16:04:05.305527 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567038-hpjgt"] Mar 20 16:04:05 crc kubenswrapper[4779]: I0320 16:04:05.316042 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567038-hpjgt"] Mar 20 16:04:05 crc kubenswrapper[4779]: I0320 16:04:05.838002 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="467d5500-9158-43cd-a096-986a0eb3319a" path="/var/lib/kubelet/pods/467d5500-9158-43cd-a096-986a0eb3319a/volumes" Mar 20 16:04:11 crc kubenswrapper[4779]: I0320 16:04:11.808847 4779 scope.go:117] "RemoveContainer" containerID="df4a15a4ae213e72d8c7a655c4b8439341ebe84532eaccd1945e22abb1a25311" Mar 20 16:04:11 crc kubenswrapper[4779]: E0320 16:04:11.809795 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:04:25 crc kubenswrapper[4779]: I0320 16:04:25.808400 4779 scope.go:117] "RemoveContainer" containerID="df4a15a4ae213e72d8c7a655c4b8439341ebe84532eaccd1945e22abb1a25311" Mar 20 16:04:25 crc kubenswrapper[4779]: E0320 16:04:25.809258 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" 
podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:04:37 crc kubenswrapper[4779]: I0320 16:04:37.808934 4779 scope.go:117] "RemoveContainer" containerID="df4a15a4ae213e72d8c7a655c4b8439341ebe84532eaccd1945e22abb1a25311" Mar 20 16:04:37 crc kubenswrapper[4779]: E0320 16:04:37.809809 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:04:43 crc kubenswrapper[4779]: I0320 16:04:43.794313 4779 scope.go:117] "RemoveContainer" containerID="cd409a584cc4406249ba8623c15bbfee8fafd2141218e43845b7fa65ee7b789f" Mar 20 16:04:50 crc kubenswrapper[4779]: I0320 16:04:50.808601 4779 scope.go:117] "RemoveContainer" containerID="df4a15a4ae213e72d8c7a655c4b8439341ebe84532eaccd1945e22abb1a25311" Mar 20 16:04:50 crc kubenswrapper[4779]: E0320 16:04:50.809546 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:05:02 crc kubenswrapper[4779]: I0320 16:05:02.365246 4779 generic.go:334] "Generic (PLEG): container finished" podID="6e8b60eb-d9f8-4956-91eb-9d0b760f4df1" containerID="be0be56802e95fe1a932b4c66847308fdc05a24826fd1e3977af13f39a8742bc" exitCode=0 Mar 20 16:05:02 crc kubenswrapper[4779]: I0320 16:05:02.365349 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-44j5q" event={"ID":"6e8b60eb-d9f8-4956-91eb-9d0b760f4df1","Type":"ContainerDied","Data":"be0be56802e95fe1a932b4c66847308fdc05a24826fd1e3977af13f39a8742bc"} Mar 20 16:05:03 crc kubenswrapper[4779]: I0320 16:05:03.804371 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-44j5q" Mar 20 16:05:03 crc kubenswrapper[4779]: I0320 16:05:03.949772 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e8b60eb-d9f8-4956-91eb-9d0b760f4df1-libvirt-combined-ca-bundle\") pod \"6e8b60eb-d9f8-4956-91eb-9d0b760f4df1\" (UID: \"6e8b60eb-d9f8-4956-91eb-9d0b760f4df1\") " Mar 20 16:05:03 crc kubenswrapper[4779]: I0320 16:05:03.949848 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdrlp\" (UniqueName: \"kubernetes.io/projected/6e8b60eb-d9f8-4956-91eb-9d0b760f4df1-kube-api-access-mdrlp\") pod \"6e8b60eb-d9f8-4956-91eb-9d0b760f4df1\" (UID: \"6e8b60eb-d9f8-4956-91eb-9d0b760f4df1\") " Mar 20 16:05:03 crc kubenswrapper[4779]: I0320 16:05:03.949925 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e8b60eb-d9f8-4956-91eb-9d0b760f4df1-inventory\") pod \"6e8b60eb-d9f8-4956-91eb-9d0b760f4df1\" (UID: \"6e8b60eb-d9f8-4956-91eb-9d0b760f4df1\") " Mar 20 16:05:03 crc kubenswrapper[4779]: I0320 16:05:03.950166 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6e8b60eb-d9f8-4956-91eb-9d0b760f4df1-ssh-key-openstack-edpm-ipam\") pod \"6e8b60eb-d9f8-4956-91eb-9d0b760f4df1\" (UID: \"6e8b60eb-d9f8-4956-91eb-9d0b760f4df1\") " Mar 20 16:05:03 crc kubenswrapper[4779]: I0320 16:05:03.950223 4779 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6e8b60eb-d9f8-4956-91eb-9d0b760f4df1-libvirt-secret-0\") pod \"6e8b60eb-d9f8-4956-91eb-9d0b760f4df1\" (UID: \"6e8b60eb-d9f8-4956-91eb-9d0b760f4df1\") " Mar 20 16:05:03 crc kubenswrapper[4779]: I0320 16:05:03.963494 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e8b60eb-d9f8-4956-91eb-9d0b760f4df1-kube-api-access-mdrlp" (OuterVolumeSpecName: "kube-api-access-mdrlp") pod "6e8b60eb-d9f8-4956-91eb-9d0b760f4df1" (UID: "6e8b60eb-d9f8-4956-91eb-9d0b760f4df1"). InnerVolumeSpecName "kube-api-access-mdrlp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:05:03 crc kubenswrapper[4779]: I0320 16:05:03.964062 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e8b60eb-d9f8-4956-91eb-9d0b760f4df1-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "6e8b60eb-d9f8-4956-91eb-9d0b760f4df1" (UID: "6e8b60eb-d9f8-4956-91eb-9d0b760f4df1"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:05:03 crc kubenswrapper[4779]: I0320 16:05:03.982629 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e8b60eb-d9f8-4956-91eb-9d0b760f4df1-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "6e8b60eb-d9f8-4956-91eb-9d0b760f4df1" (UID: "6e8b60eb-d9f8-4956-91eb-9d0b760f4df1"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:05:03 crc kubenswrapper[4779]: I0320 16:05:03.982665 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e8b60eb-d9f8-4956-91eb-9d0b760f4df1-inventory" (OuterVolumeSpecName: "inventory") pod "6e8b60eb-d9f8-4956-91eb-9d0b760f4df1" (UID: "6e8b60eb-d9f8-4956-91eb-9d0b760f4df1"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:05:03 crc kubenswrapper[4779]: I0320 16:05:03.983648 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e8b60eb-d9f8-4956-91eb-9d0b760f4df1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6e8b60eb-d9f8-4956-91eb-9d0b760f4df1" (UID: "6e8b60eb-d9f8-4956-91eb-9d0b760f4df1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.052119 4779 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6e8b60eb-d9f8-4956-91eb-9d0b760f4df1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.052159 4779 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6e8b60eb-d9f8-4956-91eb-9d0b760f4df1-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.052169 4779 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e8b60eb-d9f8-4956-91eb-9d0b760f4df1-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.052177 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdrlp\" (UniqueName: \"kubernetes.io/projected/6e8b60eb-d9f8-4956-91eb-9d0b760f4df1-kube-api-access-mdrlp\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.052185 4779 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e8b60eb-d9f8-4956-91eb-9d0b760f4df1-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 
16:05:04.384727 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-44j5q" event={"ID":"6e8b60eb-d9f8-4956-91eb-9d0b760f4df1","Type":"ContainerDied","Data":"243c29a75c707c1f482894086eb3270ebfd33a75f254b0a30a4e966fc733c0fa"} Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.384766 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="243c29a75c707c1f482894086eb3270ebfd33a75f254b0a30a4e966fc733c0fa" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.384804 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-44j5q" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.489439 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd"] Mar 20 16:05:04 crc kubenswrapper[4779]: E0320 16:05:04.489896 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93564d6a-746c-44d4-b947-1294540a732c" containerName="oc" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.489916 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="93564d6a-746c-44d4-b947-1294540a732c" containerName="oc" Mar 20 16:05:04 crc kubenswrapper[4779]: E0320 16:05:04.489929 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e8b60eb-d9f8-4956-91eb-9d0b760f4df1" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.489939 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e8b60eb-d9f8-4956-91eb-9d0b760f4df1" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.490249 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="93564d6a-746c-44d4-b947-1294540a732c" containerName="oc" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.490292 4779 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="6e8b60eb-d9f8-4956-91eb-9d0b760f4df1" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.491129 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.493492 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.495933 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.496222 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.496354 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r7sbf" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.496485 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.496590 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.496746 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.506620 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd"] Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.562713 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tdlvd\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.562806 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tdlvd\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.562840 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tdlvd\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.562877 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tdlvd\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.562897 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-cell1-compute-config-2\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-tdlvd\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.562956 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tdlvd\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.563038 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tdlvd\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.563090 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tdlvd\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.563170 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tdlvd\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" Mar 20 16:05:04 crc 
kubenswrapper[4779]: I0320 16:05:04.563193 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tdlvd\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.563242 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l57mh\" (UniqueName: \"kubernetes.io/projected/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-kube-api-access-l57mh\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tdlvd\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.665342 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tdlvd\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.665388 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tdlvd\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.665420 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l57mh\" (UniqueName: 
\"kubernetes.io/projected/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-kube-api-access-l57mh\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tdlvd\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.665453 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tdlvd\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.665499 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tdlvd\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.665527 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tdlvd\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.665561 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tdlvd\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.665583 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tdlvd\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.665602 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tdlvd\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.665624 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tdlvd\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.665643 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tdlvd\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.667447 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tdlvd\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.669327 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tdlvd\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.669762 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tdlvd\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.669968 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tdlvd\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.670801 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tdlvd\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" Mar 20 16:05:04 crc 
kubenswrapper[4779]: I0320 16:05:04.672227 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tdlvd\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.673239 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tdlvd\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.675708 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tdlvd\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.677389 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tdlvd\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.682638 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tdlvd\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.691783 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l57mh\" (UniqueName: \"kubernetes.io/projected/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-kube-api-access-l57mh\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tdlvd\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" Mar 20 16:05:04 crc kubenswrapper[4779]: I0320 16:05:04.806248 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" Mar 20 16:05:05 crc kubenswrapper[4779]: I0320 16:05:05.353844 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd"] Mar 20 16:05:05 crc kubenswrapper[4779]: I0320 16:05:05.399628 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" event={"ID":"69a85922-6ad1-4e70-bbc9-18a0cdc178f1","Type":"ContainerStarted","Data":"38d902b19410ab52da4ee44c5e21071fad6e6ba3baa3f35a6fa74d4ea84f0e0a"} Mar 20 16:05:05 crc kubenswrapper[4779]: I0320 16:05:05.816778 4779 scope.go:117] "RemoveContainer" containerID="df4a15a4ae213e72d8c7a655c4b8439341ebe84532eaccd1945e22abb1a25311" Mar 20 16:05:05 crc kubenswrapper[4779]: E0320 16:05:05.817014 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:05:06 crc kubenswrapper[4779]: I0320 16:05:06.409368 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" event={"ID":"69a85922-6ad1-4e70-bbc9-18a0cdc178f1","Type":"ContainerStarted","Data":"a46fe7ed84e1b47d98ed4828d725ef47a472d8ae9dc9258c61f65a59868be016"} Mar 20 16:05:06 crc kubenswrapper[4779]: I0320 16:05:06.435024 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" podStartSLOduration=1.912414651 podStartE2EDuration="2.434999821s" podCreationTimestamp="2026-03-20 16:05:04 +0000 UTC" firstStartedPulling="2026-03-20 16:05:05.362550349 +0000 UTC m=+2522.325066149" lastFinishedPulling="2026-03-20 16:05:05.885135519 +0000 UTC m=+2522.847651319" observedRunningTime="2026-03-20 16:05:06.424857301 +0000 UTC m=+2523.387373101" watchObservedRunningTime="2026-03-20 16:05:06.434999821 +0000 UTC m=+2523.397515621" Mar 20 16:05:16 crc kubenswrapper[4779]: I0320 16:05:16.808782 4779 scope.go:117] "RemoveContainer" containerID="df4a15a4ae213e72d8c7a655c4b8439341ebe84532eaccd1945e22abb1a25311" Mar 20 16:05:16 crc kubenswrapper[4779]: E0320 16:05:16.809615 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:05:28 crc kubenswrapper[4779]: I0320 16:05:28.809228 4779 scope.go:117] "RemoveContainer" containerID="df4a15a4ae213e72d8c7a655c4b8439341ebe84532eaccd1945e22abb1a25311" Mar 20 16:05:28 crc kubenswrapper[4779]: E0320 
16:05:28.810577 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:05:41 crc kubenswrapper[4779]: I0320 16:05:41.810048 4779 scope.go:117] "RemoveContainer" containerID="df4a15a4ae213e72d8c7a655c4b8439341ebe84532eaccd1945e22abb1a25311" Mar 20 16:05:41 crc kubenswrapper[4779]: E0320 16:05:41.810950 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:05:54 crc kubenswrapper[4779]: I0320 16:05:54.809091 4779 scope.go:117] "RemoveContainer" containerID="df4a15a4ae213e72d8c7a655c4b8439341ebe84532eaccd1945e22abb1a25311" Mar 20 16:05:54 crc kubenswrapper[4779]: E0320 16:05:54.809907 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:06:00 crc kubenswrapper[4779]: I0320 16:06:00.146810 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567046-wc6s8"] Mar 20 16:06:00 crc 
kubenswrapper[4779]: I0320 16:06:00.148513 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567046-wc6s8" Mar 20 16:06:00 crc kubenswrapper[4779]: I0320 16:06:00.152805 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:06:00 crc kubenswrapper[4779]: I0320 16:06:00.152821 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:06:00 crc kubenswrapper[4779]: I0320 16:06:00.153087 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-92t9d" Mar 20 16:06:00 crc kubenswrapper[4779]: I0320 16:06:00.156621 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kbz6\" (UniqueName: \"kubernetes.io/projected/f4aa484e-e617-4cc7-a213-995589465356-kube-api-access-9kbz6\") pod \"auto-csr-approver-29567046-wc6s8\" (UID: \"f4aa484e-e617-4cc7-a213-995589465356\") " pod="openshift-infra/auto-csr-approver-29567046-wc6s8" Mar 20 16:06:00 crc kubenswrapper[4779]: I0320 16:06:00.166246 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567046-wc6s8"] Mar 20 16:06:00 crc kubenswrapper[4779]: I0320 16:06:00.257959 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kbz6\" (UniqueName: \"kubernetes.io/projected/f4aa484e-e617-4cc7-a213-995589465356-kube-api-access-9kbz6\") pod \"auto-csr-approver-29567046-wc6s8\" (UID: \"f4aa484e-e617-4cc7-a213-995589465356\") " pod="openshift-infra/auto-csr-approver-29567046-wc6s8" Mar 20 16:06:00 crc kubenswrapper[4779]: I0320 16:06:00.280861 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kbz6\" (UniqueName: \"kubernetes.io/projected/f4aa484e-e617-4cc7-a213-995589465356-kube-api-access-9kbz6\") pod 
\"auto-csr-approver-29567046-wc6s8\" (UID: \"f4aa484e-e617-4cc7-a213-995589465356\") " pod="openshift-infra/auto-csr-approver-29567046-wc6s8" Mar 20 16:06:00 crc kubenswrapper[4779]: I0320 16:06:00.466918 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567046-wc6s8" Mar 20 16:06:00 crc kubenswrapper[4779]: I0320 16:06:00.881919 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567046-wc6s8"] Mar 20 16:06:01 crc kubenswrapper[4779]: I0320 16:06:01.878989 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567046-wc6s8" event={"ID":"f4aa484e-e617-4cc7-a213-995589465356","Type":"ContainerStarted","Data":"6f80a3a292da498b82da54582d9681392e7c790164f12637398af31d0b08ec84"} Mar 20 16:06:02 crc kubenswrapper[4779]: I0320 16:06:02.890744 4779 generic.go:334] "Generic (PLEG): container finished" podID="f4aa484e-e617-4cc7-a213-995589465356" containerID="ffbda15062f12cbed332b3e43304b77f1e62fcf681c66aa037ba74e64f8242c0" exitCode=0 Mar 20 16:06:02 crc kubenswrapper[4779]: I0320 16:06:02.890813 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567046-wc6s8" event={"ID":"f4aa484e-e617-4cc7-a213-995589465356","Type":"ContainerDied","Data":"ffbda15062f12cbed332b3e43304b77f1e62fcf681c66aa037ba74e64f8242c0"} Mar 20 16:06:04 crc kubenswrapper[4779]: I0320 16:06:04.173767 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567046-wc6s8" Mar 20 16:06:04 crc kubenswrapper[4779]: I0320 16:06:04.333216 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kbz6\" (UniqueName: \"kubernetes.io/projected/f4aa484e-e617-4cc7-a213-995589465356-kube-api-access-9kbz6\") pod \"f4aa484e-e617-4cc7-a213-995589465356\" (UID: \"f4aa484e-e617-4cc7-a213-995589465356\") " Mar 20 16:06:04 crc kubenswrapper[4779]: I0320 16:06:04.338472 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4aa484e-e617-4cc7-a213-995589465356-kube-api-access-9kbz6" (OuterVolumeSpecName: "kube-api-access-9kbz6") pod "f4aa484e-e617-4cc7-a213-995589465356" (UID: "f4aa484e-e617-4cc7-a213-995589465356"). InnerVolumeSpecName "kube-api-access-9kbz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:06:04 crc kubenswrapper[4779]: I0320 16:06:04.435206 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kbz6\" (UniqueName: \"kubernetes.io/projected/f4aa484e-e617-4cc7-a213-995589465356-kube-api-access-9kbz6\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:04 crc kubenswrapper[4779]: I0320 16:06:04.909579 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567046-wc6s8" event={"ID":"f4aa484e-e617-4cc7-a213-995589465356","Type":"ContainerDied","Data":"6f80a3a292da498b82da54582d9681392e7c790164f12637398af31d0b08ec84"} Mar 20 16:06:04 crc kubenswrapper[4779]: I0320 16:06:04.909628 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567046-wc6s8" Mar 20 16:06:04 crc kubenswrapper[4779]: I0320 16:06:04.909981 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f80a3a292da498b82da54582d9681392e7c790164f12637398af31d0b08ec84" Mar 20 16:06:05 crc kubenswrapper[4779]: I0320 16:06:05.245893 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567040-99p54"] Mar 20 16:06:05 crc kubenswrapper[4779]: I0320 16:06:05.254049 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567040-99p54"] Mar 20 16:06:05 crc kubenswrapper[4779]: I0320 16:06:05.821409 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c70aa1a9-d9ff-4845-a8f9-26cbcadb893e" path="/var/lib/kubelet/pods/c70aa1a9-d9ff-4845-a8f9-26cbcadb893e/volumes" Mar 20 16:06:07 crc kubenswrapper[4779]: I0320 16:06:07.808477 4779 scope.go:117] "RemoveContainer" containerID="df4a15a4ae213e72d8c7a655c4b8439341ebe84532eaccd1945e22abb1a25311" Mar 20 16:06:07 crc kubenswrapper[4779]: E0320 16:06:07.808997 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:06:21 crc kubenswrapper[4779]: I0320 16:06:21.809736 4779 scope.go:117] "RemoveContainer" containerID="df4a15a4ae213e72d8c7a655c4b8439341ebe84532eaccd1945e22abb1a25311" Mar 20 16:06:21 crc kubenswrapper[4779]: E0320 16:06:21.810797 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:06:34 crc kubenswrapper[4779]: I0320 16:06:34.809091 4779 scope.go:117] "RemoveContainer" containerID="df4a15a4ae213e72d8c7a655c4b8439341ebe84532eaccd1945e22abb1a25311" Mar 20 16:06:34 crc kubenswrapper[4779]: E0320 16:06:34.809948 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:06:43 crc kubenswrapper[4779]: I0320 16:06:43.898075 4779 scope.go:117] "RemoveContainer" containerID="31cfa481a64db61aa9d8823dd7f5424c7de78ca5d19d859a3e97e6c7f7adc8b7" Mar 20 16:06:48 crc kubenswrapper[4779]: I0320 16:06:48.809276 4779 scope.go:117] "RemoveContainer" containerID="df4a15a4ae213e72d8c7a655c4b8439341ebe84532eaccd1945e22abb1a25311" Mar 20 16:06:48 crc kubenswrapper[4779]: E0320 16:06:48.809974 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:06:59 crc kubenswrapper[4779]: I0320 16:06:59.809264 4779 scope.go:117] "RemoveContainer" containerID="df4a15a4ae213e72d8c7a655c4b8439341ebe84532eaccd1945e22abb1a25311" Mar 20 16:06:59 crc kubenswrapper[4779]: 
E0320 16:06:59.810142 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:07:11 crc kubenswrapper[4779]: I0320 16:07:11.809154 4779 scope.go:117] "RemoveContainer" containerID="df4a15a4ae213e72d8c7a655c4b8439341ebe84532eaccd1945e22abb1a25311" Mar 20 16:07:11 crc kubenswrapper[4779]: E0320 16:07:11.809931 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:07:23 crc kubenswrapper[4779]: I0320 16:07:23.820371 4779 scope.go:117] "RemoveContainer" containerID="df4a15a4ae213e72d8c7a655c4b8439341ebe84532eaccd1945e22abb1a25311" Mar 20 16:07:23 crc kubenswrapper[4779]: E0320 16:07:23.821181 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:07:24 crc kubenswrapper[4779]: I0320 16:07:24.761199 4779 generic.go:334] "Generic (PLEG): container finished" podID="69a85922-6ad1-4e70-bbc9-18a0cdc178f1" 
containerID="a46fe7ed84e1b47d98ed4828d725ef47a472d8ae9dc9258c61f65a59868be016" exitCode=0 Mar 20 16:07:24 crc kubenswrapper[4779]: I0320 16:07:24.761329 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" event={"ID":"69a85922-6ad1-4e70-bbc9-18a0cdc178f1","Type":"ContainerDied","Data":"a46fe7ed84e1b47d98ed4828d725ef47a472d8ae9dc9258c61f65a59868be016"} Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.358661 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.448877 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l57mh\" (UniqueName: \"kubernetes.io/projected/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-kube-api-access-l57mh\") pod \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.448940 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-migration-ssh-key-0\") pod \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.449003 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-inventory\") pod \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.449037 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-combined-ca-bundle\") pod 
\"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.449074 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-extra-config-0\") pod \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.449093 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-ssh-key-openstack-edpm-ipam\") pod \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.449145 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-cell1-compute-config-0\") pod \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.449189 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-cell1-compute-config-1\") pod \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.449289 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-cell1-compute-config-3\") pod \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " Mar 20 16:07:26 crc 
kubenswrapper[4779]: I0320 16:07:26.449323 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-cell1-compute-config-2\") pod \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.449346 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-migration-ssh-key-1\") pod \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\" (UID: \"69a85922-6ad1-4e70-bbc9-18a0cdc178f1\") " Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.454258 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-kube-api-access-l57mh" (OuterVolumeSpecName: "kube-api-access-l57mh") pod "69a85922-6ad1-4e70-bbc9-18a0cdc178f1" (UID: "69a85922-6ad1-4e70-bbc9-18a0cdc178f1"). InnerVolumeSpecName "kube-api-access-l57mh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.454710 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "69a85922-6ad1-4e70-bbc9-18a0cdc178f1" (UID: "69a85922-6ad1-4e70-bbc9-18a0cdc178f1"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.490079 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "69a85922-6ad1-4e70-bbc9-18a0cdc178f1" (UID: "69a85922-6ad1-4e70-bbc9-18a0cdc178f1"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.491539 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "69a85922-6ad1-4e70-bbc9-18a0cdc178f1" (UID: "69a85922-6ad1-4e70-bbc9-18a0cdc178f1"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.491717 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "69a85922-6ad1-4e70-bbc9-18a0cdc178f1" (UID: "69a85922-6ad1-4e70-bbc9-18a0cdc178f1"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.497206 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "69a85922-6ad1-4e70-bbc9-18a0cdc178f1" (UID: "69a85922-6ad1-4e70-bbc9-18a0cdc178f1"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.498371 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "69a85922-6ad1-4e70-bbc9-18a0cdc178f1" (UID: "69a85922-6ad1-4e70-bbc9-18a0cdc178f1"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.500258 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "69a85922-6ad1-4e70-bbc9-18a0cdc178f1" (UID: "69a85922-6ad1-4e70-bbc9-18a0cdc178f1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.501458 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-inventory" (OuterVolumeSpecName: "inventory") pod "69a85922-6ad1-4e70-bbc9-18a0cdc178f1" (UID: "69a85922-6ad1-4e70-bbc9-18a0cdc178f1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.517511 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "69a85922-6ad1-4e70-bbc9-18a0cdc178f1" (UID: "69a85922-6ad1-4e70-bbc9-18a0cdc178f1"). InnerVolumeSpecName "nova-cell1-compute-config-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.520735 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "69a85922-6ad1-4e70-bbc9-18a0cdc178f1" (UID: "69a85922-6ad1-4e70-bbc9-18a0cdc178f1"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.551355 4779 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.551393 4779 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.551404 4779 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.551414 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l57mh\" (UniqueName: \"kubernetes.io/projected/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-kube-api-access-l57mh\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.551422 4779 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:26 crc 
kubenswrapper[4779]: I0320 16:07:26.551432 4779 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.551494 4779 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.551504 4779 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.551514 4779 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.551522 4779 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.551550 4779 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/69a85922-6ad1-4e70-bbc9-18a0cdc178f1-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.779898 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" event={"ID":"69a85922-6ad1-4e70-bbc9-18a0cdc178f1","Type":"ContainerDied","Data":"38d902b19410ab52da4ee44c5e21071fad6e6ba3baa3f35a6fa74d4ea84f0e0a"} 
Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.780257 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38d902b19410ab52da4ee44c5e21071fad6e6ba3baa3f35a6fa74d4ea84f0e0a" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.779970 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tdlvd" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.888663 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n27c"] Mar 20 16:07:26 crc kubenswrapper[4779]: E0320 16:07:26.889067 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69a85922-6ad1-4e70-bbc9-18a0cdc178f1" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.889084 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="69a85922-6ad1-4e70-bbc9-18a0cdc178f1" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 20 16:07:26 crc kubenswrapper[4779]: E0320 16:07:26.889138 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4aa484e-e617-4cc7-a213-995589465356" containerName="oc" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.889147 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4aa484e-e617-4cc7-a213-995589465356" containerName="oc" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.889308 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="69a85922-6ad1-4e70-bbc9-18a0cdc178f1" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.889324 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4aa484e-e617-4cc7-a213-995589465356" containerName="oc" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.889981 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n27c" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.891708 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.892140 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.893555 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.893594 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r7sbf" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.893674 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.905272 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n27c"] Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.958754 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e63bf33-6a78-4424-97d5-7eed781817d1-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n27c\" (UID: \"8e63bf33-6a78-4424-97d5-7eed781817d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n27c" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.958825 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8e63bf33-6a78-4424-97d5-7eed781817d1-ceilometer-compute-config-data-1\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-9n27c\" (UID: \"8e63bf33-6a78-4424-97d5-7eed781817d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n27c" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.958899 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2sbn\" (UniqueName: \"kubernetes.io/projected/8e63bf33-6a78-4424-97d5-7eed781817d1-kube-api-access-k2sbn\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n27c\" (UID: \"8e63bf33-6a78-4424-97d5-7eed781817d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n27c" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.958975 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e63bf33-6a78-4424-97d5-7eed781817d1-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n27c\" (UID: \"8e63bf33-6a78-4424-97d5-7eed781817d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n27c" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.959000 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e63bf33-6a78-4424-97d5-7eed781817d1-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n27c\" (UID: \"8e63bf33-6a78-4424-97d5-7eed781817d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n27c" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.959030 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8e63bf33-6a78-4424-97d5-7eed781817d1-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n27c\" (UID: \"8e63bf33-6a78-4424-97d5-7eed781817d1\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n27c" Mar 20 16:07:26 crc kubenswrapper[4779]: I0320 16:07:26.959075 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8e63bf33-6a78-4424-97d5-7eed781817d1-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n27c\" (UID: \"8e63bf33-6a78-4424-97d5-7eed781817d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n27c" Mar 20 16:07:27 crc kubenswrapper[4779]: I0320 16:07:27.060840 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e63bf33-6a78-4424-97d5-7eed781817d1-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n27c\" (UID: \"8e63bf33-6a78-4424-97d5-7eed781817d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n27c" Mar 20 16:07:27 crc kubenswrapper[4779]: I0320 16:07:27.060886 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8e63bf33-6a78-4424-97d5-7eed781817d1-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n27c\" (UID: \"8e63bf33-6a78-4424-97d5-7eed781817d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n27c" Mar 20 16:07:27 crc kubenswrapper[4779]: I0320 16:07:27.060921 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2sbn\" (UniqueName: \"kubernetes.io/projected/8e63bf33-6a78-4424-97d5-7eed781817d1-kube-api-access-k2sbn\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n27c\" (UID: \"8e63bf33-6a78-4424-97d5-7eed781817d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n27c" Mar 20 16:07:27 crc kubenswrapper[4779]: I0320 16:07:27.060967 4779 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e63bf33-6a78-4424-97d5-7eed781817d1-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n27c\" (UID: \"8e63bf33-6a78-4424-97d5-7eed781817d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n27c" Mar 20 16:07:27 crc kubenswrapper[4779]: I0320 16:07:27.060984 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e63bf33-6a78-4424-97d5-7eed781817d1-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n27c\" (UID: \"8e63bf33-6a78-4424-97d5-7eed781817d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n27c" Mar 20 16:07:27 crc kubenswrapper[4779]: I0320 16:07:27.061003 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8e63bf33-6a78-4424-97d5-7eed781817d1-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n27c\" (UID: \"8e63bf33-6a78-4424-97d5-7eed781817d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n27c" Mar 20 16:07:27 crc kubenswrapper[4779]: I0320 16:07:27.061631 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8e63bf33-6a78-4424-97d5-7eed781817d1-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n27c\" (UID: \"8e63bf33-6a78-4424-97d5-7eed781817d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n27c" Mar 20 16:07:27 crc kubenswrapper[4779]: I0320 16:07:27.065982 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8e63bf33-6a78-4424-97d5-7eed781817d1-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n27c\" (UID: \"8e63bf33-6a78-4424-97d5-7eed781817d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n27c" Mar 20 16:07:27 crc kubenswrapper[4779]: I0320 16:07:27.066508 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8e63bf33-6a78-4424-97d5-7eed781817d1-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n27c\" (UID: \"8e63bf33-6a78-4424-97d5-7eed781817d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n27c" Mar 20 16:07:27 crc kubenswrapper[4779]: I0320 16:07:27.066667 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e63bf33-6a78-4424-97d5-7eed781817d1-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n27c\" (UID: \"8e63bf33-6a78-4424-97d5-7eed781817d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n27c" Mar 20 16:07:27 crc kubenswrapper[4779]: I0320 16:07:27.067747 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8e63bf33-6a78-4424-97d5-7eed781817d1-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n27c\" (UID: \"8e63bf33-6a78-4424-97d5-7eed781817d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n27c" Mar 20 16:07:27 crc kubenswrapper[4779]: I0320 16:07:27.067818 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8e63bf33-6a78-4424-97d5-7eed781817d1-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n27c\" (UID: \"8e63bf33-6a78-4424-97d5-7eed781817d1\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n27c" Mar 20 16:07:27 crc kubenswrapper[4779]: I0320 16:07:27.067829 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e63bf33-6a78-4424-97d5-7eed781817d1-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n27c\" (UID: \"8e63bf33-6a78-4424-97d5-7eed781817d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n27c" Mar 20 16:07:27 crc kubenswrapper[4779]: I0320 16:07:27.080521 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2sbn\" (UniqueName: \"kubernetes.io/projected/8e63bf33-6a78-4424-97d5-7eed781817d1-kube-api-access-k2sbn\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n27c\" (UID: \"8e63bf33-6a78-4424-97d5-7eed781817d1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n27c" Mar 20 16:07:27 crc kubenswrapper[4779]: I0320 16:07:27.207434 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n27c" Mar 20 16:07:27 crc kubenswrapper[4779]: I0320 16:07:27.733561 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n27c"] Mar 20 16:07:27 crc kubenswrapper[4779]: I0320 16:07:27.789493 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n27c" event={"ID":"8e63bf33-6a78-4424-97d5-7eed781817d1","Type":"ContainerStarted","Data":"f063fa63b2c93108c01397d15fa4cbff0203bf2e31e4e09e2f029536eb8cf612"} Mar 20 16:07:28 crc kubenswrapper[4779]: I0320 16:07:28.812148 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n27c" event={"ID":"8e63bf33-6a78-4424-97d5-7eed781817d1","Type":"ContainerStarted","Data":"ddc55cec3cef1e29e93b3b6c33bd5eb69ef1f551e6b9669319e6acc00b422675"} Mar 20 16:07:29 crc kubenswrapper[4779]: I0320 16:07:29.848849 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n27c" podStartSLOduration=3.115736159 podStartE2EDuration="3.848833077s" podCreationTimestamp="2026-03-20 16:07:26 +0000 UTC" firstStartedPulling="2026-03-20 16:07:27.736953451 +0000 UTC m=+2664.699469251" lastFinishedPulling="2026-03-20 16:07:28.470050369 +0000 UTC m=+2665.432566169" observedRunningTime="2026-03-20 16:07:29.839147439 +0000 UTC m=+2666.801663239" watchObservedRunningTime="2026-03-20 16:07:29.848833077 +0000 UTC m=+2666.811348877" Mar 20 16:07:37 crc kubenswrapper[4779]: I0320 16:07:37.808810 4779 scope.go:117] "RemoveContainer" containerID="df4a15a4ae213e72d8c7a655c4b8439341ebe84532eaccd1945e22abb1a25311" Mar 20 16:07:38 crc kubenswrapper[4779]: I0320 16:07:38.905363 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" 
event={"ID":"451fc579-db57-4b36-a775-6d2986de3efc","Type":"ContainerStarted","Data":"22009c39699cf0c6f0f90ca317132c546c514865bbbd9b00aca011844a148a84"} Mar 20 16:08:00 crc kubenswrapper[4779]: I0320 16:08:00.152164 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567048-7ms57"] Mar 20 16:08:00 crc kubenswrapper[4779]: I0320 16:08:00.155468 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567048-7ms57" Mar 20 16:08:00 crc kubenswrapper[4779]: I0320 16:08:00.157753 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:08:00 crc kubenswrapper[4779]: I0320 16:08:00.157776 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-92t9d" Mar 20 16:08:00 crc kubenswrapper[4779]: I0320 16:08:00.159249 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:08:00 crc kubenswrapper[4779]: I0320 16:08:00.165528 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567048-7ms57"] Mar 20 16:08:00 crc kubenswrapper[4779]: I0320 16:08:00.278554 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6mh2\" (UniqueName: \"kubernetes.io/projected/50f0a1ab-acea-4e33-820d-ff0e501ce17a-kube-api-access-h6mh2\") pod \"auto-csr-approver-29567048-7ms57\" (UID: \"50f0a1ab-acea-4e33-820d-ff0e501ce17a\") " pod="openshift-infra/auto-csr-approver-29567048-7ms57" Mar 20 16:08:00 crc kubenswrapper[4779]: I0320 16:08:00.380378 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6mh2\" (UniqueName: \"kubernetes.io/projected/50f0a1ab-acea-4e33-820d-ff0e501ce17a-kube-api-access-h6mh2\") pod \"auto-csr-approver-29567048-7ms57\" (UID: 
\"50f0a1ab-acea-4e33-820d-ff0e501ce17a\") " pod="openshift-infra/auto-csr-approver-29567048-7ms57" Mar 20 16:08:00 crc kubenswrapper[4779]: I0320 16:08:00.399979 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6mh2\" (UniqueName: \"kubernetes.io/projected/50f0a1ab-acea-4e33-820d-ff0e501ce17a-kube-api-access-h6mh2\") pod \"auto-csr-approver-29567048-7ms57\" (UID: \"50f0a1ab-acea-4e33-820d-ff0e501ce17a\") " pod="openshift-infra/auto-csr-approver-29567048-7ms57" Mar 20 16:08:00 crc kubenswrapper[4779]: I0320 16:08:00.477480 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567048-7ms57" Mar 20 16:08:00 crc kubenswrapper[4779]: I0320 16:08:00.931966 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567048-7ms57"] Mar 20 16:08:01 crc kubenswrapper[4779]: I0320 16:08:01.094786 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567048-7ms57" event={"ID":"50f0a1ab-acea-4e33-820d-ff0e501ce17a","Type":"ContainerStarted","Data":"0e21ff3d085696e0d36f05ef3eebfda54287fe4ea2b4a3e9e37a820c51793ad6"} Mar 20 16:08:01 crc kubenswrapper[4779]: I0320 16:08:01.964859 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-g7s56"] Mar 20 16:08:01 crc kubenswrapper[4779]: I0320 16:08:01.967926 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g7s56" Mar 20 16:08:02 crc kubenswrapper[4779]: I0320 16:08:02.000746 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g7s56"] Mar 20 16:08:02 crc kubenswrapper[4779]: I0320 16:08:02.115934 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3533115d-34f9-4980-aab2-75964b9a8b8f-utilities\") pod \"redhat-operators-g7s56\" (UID: \"3533115d-34f9-4980-aab2-75964b9a8b8f\") " pod="openshift-marketplace/redhat-operators-g7s56" Mar 20 16:08:02 crc kubenswrapper[4779]: I0320 16:08:02.116013 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3533115d-34f9-4980-aab2-75964b9a8b8f-catalog-content\") pod \"redhat-operators-g7s56\" (UID: \"3533115d-34f9-4980-aab2-75964b9a8b8f\") " pod="openshift-marketplace/redhat-operators-g7s56" Mar 20 16:08:02 crc kubenswrapper[4779]: I0320 16:08:02.116092 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wnkt\" (UniqueName: \"kubernetes.io/projected/3533115d-34f9-4980-aab2-75964b9a8b8f-kube-api-access-2wnkt\") pod \"redhat-operators-g7s56\" (UID: \"3533115d-34f9-4980-aab2-75964b9a8b8f\") " pod="openshift-marketplace/redhat-operators-g7s56" Mar 20 16:08:02 crc kubenswrapper[4779]: I0320 16:08:02.217726 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3533115d-34f9-4980-aab2-75964b9a8b8f-utilities\") pod \"redhat-operators-g7s56\" (UID: \"3533115d-34f9-4980-aab2-75964b9a8b8f\") " pod="openshift-marketplace/redhat-operators-g7s56" Mar 20 16:08:02 crc kubenswrapper[4779]: I0320 16:08:02.217805 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3533115d-34f9-4980-aab2-75964b9a8b8f-catalog-content\") pod \"redhat-operators-g7s56\" (UID: \"3533115d-34f9-4980-aab2-75964b9a8b8f\") " pod="openshift-marketplace/redhat-operators-g7s56" Mar 20 16:08:02 crc kubenswrapper[4779]: I0320 16:08:02.217866 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wnkt\" (UniqueName: \"kubernetes.io/projected/3533115d-34f9-4980-aab2-75964b9a8b8f-kube-api-access-2wnkt\") pod \"redhat-operators-g7s56\" (UID: \"3533115d-34f9-4980-aab2-75964b9a8b8f\") " pod="openshift-marketplace/redhat-operators-g7s56" Mar 20 16:08:02 crc kubenswrapper[4779]: I0320 16:08:02.218484 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3533115d-34f9-4980-aab2-75964b9a8b8f-catalog-content\") pod \"redhat-operators-g7s56\" (UID: \"3533115d-34f9-4980-aab2-75964b9a8b8f\") " pod="openshift-marketplace/redhat-operators-g7s56" Mar 20 16:08:02 crc kubenswrapper[4779]: I0320 16:08:02.218486 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3533115d-34f9-4980-aab2-75964b9a8b8f-utilities\") pod \"redhat-operators-g7s56\" (UID: \"3533115d-34f9-4980-aab2-75964b9a8b8f\") " pod="openshift-marketplace/redhat-operators-g7s56" Mar 20 16:08:02 crc kubenswrapper[4779]: I0320 16:08:02.253258 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wnkt\" (UniqueName: \"kubernetes.io/projected/3533115d-34f9-4980-aab2-75964b9a8b8f-kube-api-access-2wnkt\") pod \"redhat-operators-g7s56\" (UID: \"3533115d-34f9-4980-aab2-75964b9a8b8f\") " pod="openshift-marketplace/redhat-operators-g7s56" Mar 20 16:08:02 crc kubenswrapper[4779]: I0320 16:08:02.330818 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g7s56" Mar 20 16:08:02 crc kubenswrapper[4779]: I0320 16:08:02.850887 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g7s56"] Mar 20 16:08:03 crc kubenswrapper[4779]: I0320 16:08:03.138838 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7s56" event={"ID":"3533115d-34f9-4980-aab2-75964b9a8b8f","Type":"ContainerStarted","Data":"25ded3e24104bf331653583e6096f27d137cbe0ebb78e75cfaddeafcf69b6253"} Mar 20 16:08:03 crc kubenswrapper[4779]: I0320 16:08:03.138896 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7s56" event={"ID":"3533115d-34f9-4980-aab2-75964b9a8b8f","Type":"ContainerStarted","Data":"cb071758250ea43cef3d0108bb47785b8c4c876fa2757ec1084bd21e3e77cb30"} Mar 20 16:08:03 crc kubenswrapper[4779]: I0320 16:08:03.148775 4779 generic.go:334] "Generic (PLEG): container finished" podID="50f0a1ab-acea-4e33-820d-ff0e501ce17a" containerID="b8804ce980f67eedd3191a3ff92926da69d76b9ae93204920f6ae0b8a78463a0" exitCode=0 Mar 20 16:08:03 crc kubenswrapper[4779]: I0320 16:08:03.148825 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567048-7ms57" event={"ID":"50f0a1ab-acea-4e33-820d-ff0e501ce17a","Type":"ContainerDied","Data":"b8804ce980f67eedd3191a3ff92926da69d76b9ae93204920f6ae0b8a78463a0"} Mar 20 16:08:04 crc kubenswrapper[4779]: I0320 16:08:04.160492 4779 generic.go:334] "Generic (PLEG): container finished" podID="3533115d-34f9-4980-aab2-75964b9a8b8f" containerID="25ded3e24104bf331653583e6096f27d137cbe0ebb78e75cfaddeafcf69b6253" exitCode=0 Mar 20 16:08:04 crc kubenswrapper[4779]: I0320 16:08:04.160663 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7s56" 
event={"ID":"3533115d-34f9-4980-aab2-75964b9a8b8f","Type":"ContainerDied","Data":"25ded3e24104bf331653583e6096f27d137cbe0ebb78e75cfaddeafcf69b6253"} Mar 20 16:08:04 crc kubenswrapper[4779]: I0320 16:08:04.161529 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7s56" event={"ID":"3533115d-34f9-4980-aab2-75964b9a8b8f","Type":"ContainerStarted","Data":"cd55cb4ee044d763aaec599f588c5dd9e4e33e7e66dbd013d243d6a9aeb82092"} Mar 20 16:08:04 crc kubenswrapper[4779]: I0320 16:08:04.518340 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567048-7ms57" Mar 20 16:08:04 crc kubenswrapper[4779]: I0320 16:08:04.666885 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6mh2\" (UniqueName: \"kubernetes.io/projected/50f0a1ab-acea-4e33-820d-ff0e501ce17a-kube-api-access-h6mh2\") pod \"50f0a1ab-acea-4e33-820d-ff0e501ce17a\" (UID: \"50f0a1ab-acea-4e33-820d-ff0e501ce17a\") " Mar 20 16:08:04 crc kubenswrapper[4779]: I0320 16:08:04.674655 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50f0a1ab-acea-4e33-820d-ff0e501ce17a-kube-api-access-h6mh2" (OuterVolumeSpecName: "kube-api-access-h6mh2") pod "50f0a1ab-acea-4e33-820d-ff0e501ce17a" (UID: "50f0a1ab-acea-4e33-820d-ff0e501ce17a"). InnerVolumeSpecName "kube-api-access-h6mh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:08:04 crc kubenswrapper[4779]: I0320 16:08:04.769268 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6mh2\" (UniqueName: \"kubernetes.io/projected/50f0a1ab-acea-4e33-820d-ff0e501ce17a-kube-api-access-h6mh2\") on node \"crc\" DevicePath \"\"" Mar 20 16:08:05 crc kubenswrapper[4779]: I0320 16:08:05.171736 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567048-7ms57" Mar 20 16:08:05 crc kubenswrapper[4779]: I0320 16:08:05.171739 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567048-7ms57" event={"ID":"50f0a1ab-acea-4e33-820d-ff0e501ce17a","Type":"ContainerDied","Data":"0e21ff3d085696e0d36f05ef3eebfda54287fe4ea2b4a3e9e37a820c51793ad6"} Mar 20 16:08:05 crc kubenswrapper[4779]: I0320 16:08:05.171807 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e21ff3d085696e0d36f05ef3eebfda54287fe4ea2b4a3e9e37a820c51793ad6" Mar 20 16:08:05 crc kubenswrapper[4779]: I0320 16:08:05.614502 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567042-c2x9d"] Mar 20 16:08:05 crc kubenswrapper[4779]: I0320 16:08:05.628597 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567042-c2x9d"] Mar 20 16:08:05 crc kubenswrapper[4779]: I0320 16:08:05.823100 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36412fa2-e5a2-4a32-b747-696e33c68757" path="/var/lib/kubelet/pods/36412fa2-e5a2-4a32-b747-696e33c68757/volumes" Mar 20 16:08:11 crc kubenswrapper[4779]: I0320 16:08:11.247157 4779 generic.go:334] "Generic (PLEG): container finished" podID="3533115d-34f9-4980-aab2-75964b9a8b8f" containerID="cd55cb4ee044d763aaec599f588c5dd9e4e33e7e66dbd013d243d6a9aeb82092" exitCode=0 Mar 20 16:08:11 crc kubenswrapper[4779]: I0320 16:08:11.247725 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7s56" event={"ID":"3533115d-34f9-4980-aab2-75964b9a8b8f","Type":"ContainerDied","Data":"cd55cb4ee044d763aaec599f588c5dd9e4e33e7e66dbd013d243d6a9aeb82092"} Mar 20 16:08:12 crc kubenswrapper[4779]: I0320 16:08:12.258006 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7s56" 
event={"ID":"3533115d-34f9-4980-aab2-75964b9a8b8f","Type":"ContainerStarted","Data":"8df449002c85a8d5962315c74ee5ce3efff93a406fa9dfb0d736ccdcdaadda77"} Mar 20 16:08:12 crc kubenswrapper[4779]: I0320 16:08:12.274230 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-g7s56" podStartSLOduration=2.759142271 podStartE2EDuration="11.274210641s" podCreationTimestamp="2026-03-20 16:08:01 +0000 UTC" firstStartedPulling="2026-03-20 16:08:03.141836027 +0000 UTC m=+2700.104351827" lastFinishedPulling="2026-03-20 16:08:11.656904397 +0000 UTC m=+2708.619420197" observedRunningTime="2026-03-20 16:08:12.272790647 +0000 UTC m=+2709.235306457" watchObservedRunningTime="2026-03-20 16:08:12.274210641 +0000 UTC m=+2709.236726451" Mar 20 16:08:12 crc kubenswrapper[4779]: I0320 16:08:12.331383 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-g7s56" Mar 20 16:08:12 crc kubenswrapper[4779]: I0320 16:08:12.331439 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-g7s56" Mar 20 16:08:13 crc kubenswrapper[4779]: I0320 16:08:13.377141 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-g7s56" podUID="3533115d-34f9-4980-aab2-75964b9a8b8f" containerName="registry-server" probeResult="failure" output=< Mar 20 16:08:13 crc kubenswrapper[4779]: timeout: failed to connect service ":50051" within 1s Mar 20 16:08:13 crc kubenswrapper[4779]: > Mar 20 16:08:23 crc kubenswrapper[4779]: I0320 16:08:23.628999 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-g7s56" podUID="3533115d-34f9-4980-aab2-75964b9a8b8f" containerName="registry-server" probeResult="failure" output=< Mar 20 16:08:23 crc kubenswrapper[4779]: timeout: failed to connect service ":50051" within 1s Mar 20 16:08:23 crc kubenswrapper[4779]: > Mar 20 
16:08:32 crc kubenswrapper[4779]: I0320 16:08:32.383329 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-g7s56" Mar 20 16:08:32 crc kubenswrapper[4779]: I0320 16:08:32.431455 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-g7s56" Mar 20 16:08:33 crc kubenswrapper[4779]: I0320 16:08:33.168549 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g7s56"] Mar 20 16:08:33 crc kubenswrapper[4779]: I0320 16:08:33.443023 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-g7s56" podUID="3533115d-34f9-4980-aab2-75964b9a8b8f" containerName="registry-server" containerID="cri-o://8df449002c85a8d5962315c74ee5ce3efff93a406fa9dfb0d736ccdcdaadda77" gracePeriod=2 Mar 20 16:08:33 crc kubenswrapper[4779]: I0320 16:08:33.887391 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g7s56" Mar 20 16:08:34 crc kubenswrapper[4779]: I0320 16:08:34.047460 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3533115d-34f9-4980-aab2-75964b9a8b8f-catalog-content\") pod \"3533115d-34f9-4980-aab2-75964b9a8b8f\" (UID: \"3533115d-34f9-4980-aab2-75964b9a8b8f\") " Mar 20 16:08:34 crc kubenswrapper[4779]: I0320 16:08:34.047606 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3533115d-34f9-4980-aab2-75964b9a8b8f-utilities\") pod \"3533115d-34f9-4980-aab2-75964b9a8b8f\" (UID: \"3533115d-34f9-4980-aab2-75964b9a8b8f\") " Mar 20 16:08:34 crc kubenswrapper[4779]: I0320 16:08:34.047826 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wnkt\" (UniqueName: \"kubernetes.io/projected/3533115d-34f9-4980-aab2-75964b9a8b8f-kube-api-access-2wnkt\") pod \"3533115d-34f9-4980-aab2-75964b9a8b8f\" (UID: \"3533115d-34f9-4980-aab2-75964b9a8b8f\") " Mar 20 16:08:34 crc kubenswrapper[4779]: I0320 16:08:34.048291 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3533115d-34f9-4980-aab2-75964b9a8b8f-utilities" (OuterVolumeSpecName: "utilities") pod "3533115d-34f9-4980-aab2-75964b9a8b8f" (UID: "3533115d-34f9-4980-aab2-75964b9a8b8f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:08:34 crc kubenswrapper[4779]: I0320 16:08:34.053335 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3533115d-34f9-4980-aab2-75964b9a8b8f-kube-api-access-2wnkt" (OuterVolumeSpecName: "kube-api-access-2wnkt") pod "3533115d-34f9-4980-aab2-75964b9a8b8f" (UID: "3533115d-34f9-4980-aab2-75964b9a8b8f"). InnerVolumeSpecName "kube-api-access-2wnkt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:08:34 crc kubenswrapper[4779]: I0320 16:08:34.149967 4779 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3533115d-34f9-4980-aab2-75964b9a8b8f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:08:34 crc kubenswrapper[4779]: I0320 16:08:34.150022 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wnkt\" (UniqueName: \"kubernetes.io/projected/3533115d-34f9-4980-aab2-75964b9a8b8f-kube-api-access-2wnkt\") on node \"crc\" DevicePath \"\"" Mar 20 16:08:34 crc kubenswrapper[4779]: I0320 16:08:34.186641 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3533115d-34f9-4980-aab2-75964b9a8b8f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3533115d-34f9-4980-aab2-75964b9a8b8f" (UID: "3533115d-34f9-4980-aab2-75964b9a8b8f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:08:34 crc kubenswrapper[4779]: I0320 16:08:34.252464 4779 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3533115d-34f9-4980-aab2-75964b9a8b8f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:08:34 crc kubenswrapper[4779]: I0320 16:08:34.454001 4779 generic.go:334] "Generic (PLEG): container finished" podID="3533115d-34f9-4980-aab2-75964b9a8b8f" containerID="8df449002c85a8d5962315c74ee5ce3efff93a406fa9dfb0d736ccdcdaadda77" exitCode=0 Mar 20 16:08:34 crc kubenswrapper[4779]: I0320 16:08:34.454044 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7s56" event={"ID":"3533115d-34f9-4980-aab2-75964b9a8b8f","Type":"ContainerDied","Data":"8df449002c85a8d5962315c74ee5ce3efff93a406fa9dfb0d736ccdcdaadda77"} Mar 20 16:08:34 crc kubenswrapper[4779]: I0320 16:08:34.454053 4779 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g7s56" Mar 20 16:08:34 crc kubenswrapper[4779]: I0320 16:08:34.454075 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7s56" event={"ID":"3533115d-34f9-4980-aab2-75964b9a8b8f","Type":"ContainerDied","Data":"cb071758250ea43cef3d0108bb47785b8c4c876fa2757ec1084bd21e3e77cb30"} Mar 20 16:08:34 crc kubenswrapper[4779]: I0320 16:08:34.454091 4779 scope.go:117] "RemoveContainer" containerID="8df449002c85a8d5962315c74ee5ce3efff93a406fa9dfb0d736ccdcdaadda77" Mar 20 16:08:34 crc kubenswrapper[4779]: I0320 16:08:34.492255 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g7s56"] Mar 20 16:08:34 crc kubenswrapper[4779]: I0320 16:08:34.498488 4779 scope.go:117] "RemoveContainer" containerID="cd55cb4ee044d763aaec599f588c5dd9e4e33e7e66dbd013d243d6a9aeb82092" Mar 20 16:08:34 crc kubenswrapper[4779]: I0320 16:08:34.501926 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-g7s56"] Mar 20 16:08:34 crc kubenswrapper[4779]: I0320 16:08:34.521082 4779 scope.go:117] "RemoveContainer" containerID="25ded3e24104bf331653583e6096f27d137cbe0ebb78e75cfaddeafcf69b6253" Mar 20 16:08:34 crc kubenswrapper[4779]: I0320 16:08:34.559701 4779 scope.go:117] "RemoveContainer" containerID="8df449002c85a8d5962315c74ee5ce3efff93a406fa9dfb0d736ccdcdaadda77" Mar 20 16:08:34 crc kubenswrapper[4779]: E0320 16:08:34.560210 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8df449002c85a8d5962315c74ee5ce3efff93a406fa9dfb0d736ccdcdaadda77\": container with ID starting with 8df449002c85a8d5962315c74ee5ce3efff93a406fa9dfb0d736ccdcdaadda77 not found: ID does not exist" containerID="8df449002c85a8d5962315c74ee5ce3efff93a406fa9dfb0d736ccdcdaadda77" Mar 20 16:08:34 crc kubenswrapper[4779]: I0320 16:08:34.560246 4779 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8df449002c85a8d5962315c74ee5ce3efff93a406fa9dfb0d736ccdcdaadda77"} err="failed to get container status \"8df449002c85a8d5962315c74ee5ce3efff93a406fa9dfb0d736ccdcdaadda77\": rpc error: code = NotFound desc = could not find container \"8df449002c85a8d5962315c74ee5ce3efff93a406fa9dfb0d736ccdcdaadda77\": container with ID starting with 8df449002c85a8d5962315c74ee5ce3efff93a406fa9dfb0d736ccdcdaadda77 not found: ID does not exist" Mar 20 16:08:34 crc kubenswrapper[4779]: I0320 16:08:34.560270 4779 scope.go:117] "RemoveContainer" containerID="cd55cb4ee044d763aaec599f588c5dd9e4e33e7e66dbd013d243d6a9aeb82092" Mar 20 16:08:34 crc kubenswrapper[4779]: E0320 16:08:34.560689 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd55cb4ee044d763aaec599f588c5dd9e4e33e7e66dbd013d243d6a9aeb82092\": container with ID starting with cd55cb4ee044d763aaec599f588c5dd9e4e33e7e66dbd013d243d6a9aeb82092 not found: ID does not exist" containerID="cd55cb4ee044d763aaec599f588c5dd9e4e33e7e66dbd013d243d6a9aeb82092" Mar 20 16:08:34 crc kubenswrapper[4779]: I0320 16:08:34.560719 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd55cb4ee044d763aaec599f588c5dd9e4e33e7e66dbd013d243d6a9aeb82092"} err="failed to get container status \"cd55cb4ee044d763aaec599f588c5dd9e4e33e7e66dbd013d243d6a9aeb82092\": rpc error: code = NotFound desc = could not find container \"cd55cb4ee044d763aaec599f588c5dd9e4e33e7e66dbd013d243d6a9aeb82092\": container with ID starting with cd55cb4ee044d763aaec599f588c5dd9e4e33e7e66dbd013d243d6a9aeb82092 not found: ID does not exist" Mar 20 16:08:34 crc kubenswrapper[4779]: I0320 16:08:34.560733 4779 scope.go:117] "RemoveContainer" containerID="25ded3e24104bf331653583e6096f27d137cbe0ebb78e75cfaddeafcf69b6253" Mar 20 16:08:34 crc kubenswrapper[4779]: E0320 
16:08:34.561058 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25ded3e24104bf331653583e6096f27d137cbe0ebb78e75cfaddeafcf69b6253\": container with ID starting with 25ded3e24104bf331653583e6096f27d137cbe0ebb78e75cfaddeafcf69b6253 not found: ID does not exist" containerID="25ded3e24104bf331653583e6096f27d137cbe0ebb78e75cfaddeafcf69b6253" Mar 20 16:08:34 crc kubenswrapper[4779]: I0320 16:08:34.561117 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25ded3e24104bf331653583e6096f27d137cbe0ebb78e75cfaddeafcf69b6253"} err="failed to get container status \"25ded3e24104bf331653583e6096f27d137cbe0ebb78e75cfaddeafcf69b6253\": rpc error: code = NotFound desc = could not find container \"25ded3e24104bf331653583e6096f27d137cbe0ebb78e75cfaddeafcf69b6253\": container with ID starting with 25ded3e24104bf331653583e6096f27d137cbe0ebb78e75cfaddeafcf69b6253 not found: ID does not exist" Mar 20 16:08:35 crc kubenswrapper[4779]: I0320 16:08:35.820313 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3533115d-34f9-4980-aab2-75964b9a8b8f" path="/var/lib/kubelet/pods/3533115d-34f9-4980-aab2-75964b9a8b8f/volumes" Mar 20 16:08:44 crc kubenswrapper[4779]: I0320 16:08:44.006254 4779 scope.go:117] "RemoveContainer" containerID="8cf774985f35be3b9eceeba2338fbc6f1689bf2677ff39b30da453ff9ccafdab" Mar 20 16:09:20 crc kubenswrapper[4779]: I0320 16:09:20.844304 4779 generic.go:334] "Generic (PLEG): container finished" podID="8e63bf33-6a78-4424-97d5-7eed781817d1" containerID="ddc55cec3cef1e29e93b3b6c33bd5eb69ef1f551e6b9669319e6acc00b422675" exitCode=0 Mar 20 16:09:20 crc kubenswrapper[4779]: I0320 16:09:20.844496 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n27c" 
event={"ID":"8e63bf33-6a78-4424-97d5-7eed781817d1","Type":"ContainerDied","Data":"ddc55cec3cef1e29e93b3b6c33bd5eb69ef1f551e6b9669319e6acc00b422675"} Mar 20 16:09:22 crc kubenswrapper[4779]: I0320 16:09:22.271855 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n27c" Mar 20 16:09:22 crc kubenswrapper[4779]: I0320 16:09:22.304129 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e63bf33-6a78-4424-97d5-7eed781817d1-inventory\") pod \"8e63bf33-6a78-4424-97d5-7eed781817d1\" (UID: \"8e63bf33-6a78-4424-97d5-7eed781817d1\") " Mar 20 16:09:22 crc kubenswrapper[4779]: I0320 16:09:22.304253 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8e63bf33-6a78-4424-97d5-7eed781817d1-ceilometer-compute-config-data-2\") pod \"8e63bf33-6a78-4424-97d5-7eed781817d1\" (UID: \"8e63bf33-6a78-4424-97d5-7eed781817d1\") " Mar 20 16:09:22 crc kubenswrapper[4779]: I0320 16:09:22.304273 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8e63bf33-6a78-4424-97d5-7eed781817d1-ceilometer-compute-config-data-0\") pod \"8e63bf33-6a78-4424-97d5-7eed781817d1\" (UID: \"8e63bf33-6a78-4424-97d5-7eed781817d1\") " Mar 20 16:09:22 crc kubenswrapper[4779]: I0320 16:09:22.304340 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e63bf33-6a78-4424-97d5-7eed781817d1-ssh-key-openstack-edpm-ipam\") pod \"8e63bf33-6a78-4424-97d5-7eed781817d1\" (UID: \"8e63bf33-6a78-4424-97d5-7eed781817d1\") " Mar 20 16:09:22 crc kubenswrapper[4779]: I0320 16:09:22.304366 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-k2sbn\" (UniqueName: \"kubernetes.io/projected/8e63bf33-6a78-4424-97d5-7eed781817d1-kube-api-access-k2sbn\") pod \"8e63bf33-6a78-4424-97d5-7eed781817d1\" (UID: \"8e63bf33-6a78-4424-97d5-7eed781817d1\") " Mar 20 16:09:22 crc kubenswrapper[4779]: I0320 16:09:22.304385 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8e63bf33-6a78-4424-97d5-7eed781817d1-ceilometer-compute-config-data-1\") pod \"8e63bf33-6a78-4424-97d5-7eed781817d1\" (UID: \"8e63bf33-6a78-4424-97d5-7eed781817d1\") " Mar 20 16:09:22 crc kubenswrapper[4779]: I0320 16:09:22.304417 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e63bf33-6a78-4424-97d5-7eed781817d1-telemetry-combined-ca-bundle\") pod \"8e63bf33-6a78-4424-97d5-7eed781817d1\" (UID: \"8e63bf33-6a78-4424-97d5-7eed781817d1\") " Mar 20 16:09:22 crc kubenswrapper[4779]: I0320 16:09:22.314084 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e63bf33-6a78-4424-97d5-7eed781817d1-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "8e63bf33-6a78-4424-97d5-7eed781817d1" (UID: "8e63bf33-6a78-4424-97d5-7eed781817d1"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:09:22 crc kubenswrapper[4779]: I0320 16:09:22.316747 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e63bf33-6a78-4424-97d5-7eed781817d1-kube-api-access-k2sbn" (OuterVolumeSpecName: "kube-api-access-k2sbn") pod "8e63bf33-6a78-4424-97d5-7eed781817d1" (UID: "8e63bf33-6a78-4424-97d5-7eed781817d1"). InnerVolumeSpecName "kube-api-access-k2sbn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:09:22 crc kubenswrapper[4779]: I0320 16:09:22.338169 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e63bf33-6a78-4424-97d5-7eed781817d1-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "8e63bf33-6a78-4424-97d5-7eed781817d1" (UID: "8e63bf33-6a78-4424-97d5-7eed781817d1"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:09:22 crc kubenswrapper[4779]: I0320 16:09:22.353241 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e63bf33-6a78-4424-97d5-7eed781817d1-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "8e63bf33-6a78-4424-97d5-7eed781817d1" (UID: "8e63bf33-6a78-4424-97d5-7eed781817d1"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:09:22 crc kubenswrapper[4779]: I0320 16:09:22.365363 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e63bf33-6a78-4424-97d5-7eed781817d1-inventory" (OuterVolumeSpecName: "inventory") pod "8e63bf33-6a78-4424-97d5-7eed781817d1" (UID: "8e63bf33-6a78-4424-97d5-7eed781817d1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:09:22 crc kubenswrapper[4779]: I0320 16:09:22.369127 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e63bf33-6a78-4424-97d5-7eed781817d1-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "8e63bf33-6a78-4424-97d5-7eed781817d1" (UID: "8e63bf33-6a78-4424-97d5-7eed781817d1"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:09:22 crc kubenswrapper[4779]: I0320 16:09:22.370362 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e63bf33-6a78-4424-97d5-7eed781817d1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8e63bf33-6a78-4424-97d5-7eed781817d1" (UID: "8e63bf33-6a78-4424-97d5-7eed781817d1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:09:22 crc kubenswrapper[4779]: I0320 16:09:22.407811 4779 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8e63bf33-6a78-4424-97d5-7eed781817d1-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 20 16:09:22 crc kubenswrapper[4779]: I0320 16:09:22.407864 4779 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8e63bf33-6a78-4424-97d5-7eed781817d1-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 20 16:09:22 crc kubenswrapper[4779]: I0320 16:09:22.407875 4779 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e63bf33-6a78-4424-97d5-7eed781817d1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 16:09:22 crc kubenswrapper[4779]: I0320 16:09:22.407886 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2sbn\" (UniqueName: \"kubernetes.io/projected/8e63bf33-6a78-4424-97d5-7eed781817d1-kube-api-access-k2sbn\") on node \"crc\" DevicePath \"\"" Mar 20 16:09:22 crc kubenswrapper[4779]: I0320 16:09:22.407896 4779 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8e63bf33-6a78-4424-97d5-7eed781817d1-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath 
\"\"" Mar 20 16:09:22 crc kubenswrapper[4779]: I0320 16:09:22.407907 4779 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e63bf33-6a78-4424-97d5-7eed781817d1-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:09:22 crc kubenswrapper[4779]: I0320 16:09:22.407918 4779 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e63bf33-6a78-4424-97d5-7eed781817d1-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 16:09:22 crc kubenswrapper[4779]: I0320 16:09:22.860848 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n27c" event={"ID":"8e63bf33-6a78-4424-97d5-7eed781817d1","Type":"ContainerDied","Data":"f063fa63b2c93108c01397d15fa4cbff0203bf2e31e4e09e2f029536eb8cf612"} Mar 20 16:09:22 crc kubenswrapper[4779]: I0320 16:09:22.861224 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f063fa63b2c93108c01397d15fa4cbff0203bf2e31e4e09e2f029536eb8cf612" Mar 20 16:09:22 crc kubenswrapper[4779]: I0320 16:09:22.860894 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n27c" Mar 20 16:09:55 crc kubenswrapper[4779]: I0320 16:09:55.149900 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:09:55 crc kubenswrapper[4779]: I0320 16:09:55.150498 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:10:00 crc kubenswrapper[4779]: I0320 16:10:00.148981 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567050-w4ffw"] Mar 20 16:10:00 crc kubenswrapper[4779]: E0320 16:10:00.150047 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3533115d-34f9-4980-aab2-75964b9a8b8f" containerName="extract-utilities" Mar 20 16:10:00 crc kubenswrapper[4779]: I0320 16:10:00.150064 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="3533115d-34f9-4980-aab2-75964b9a8b8f" containerName="extract-utilities" Mar 20 16:10:00 crc kubenswrapper[4779]: E0320 16:10:00.150087 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3533115d-34f9-4980-aab2-75964b9a8b8f" containerName="registry-server" Mar 20 16:10:00 crc kubenswrapper[4779]: I0320 16:10:00.150095 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="3533115d-34f9-4980-aab2-75964b9a8b8f" containerName="registry-server" Mar 20 16:10:00 crc kubenswrapper[4779]: E0320 16:10:00.150134 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50f0a1ab-acea-4e33-820d-ff0e501ce17a" 
containerName="oc" Mar 20 16:10:00 crc kubenswrapper[4779]: I0320 16:10:00.150142 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="50f0a1ab-acea-4e33-820d-ff0e501ce17a" containerName="oc" Mar 20 16:10:00 crc kubenswrapper[4779]: E0320 16:10:00.150155 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3533115d-34f9-4980-aab2-75964b9a8b8f" containerName="extract-content" Mar 20 16:10:00 crc kubenswrapper[4779]: I0320 16:10:00.150163 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="3533115d-34f9-4980-aab2-75964b9a8b8f" containerName="extract-content" Mar 20 16:10:00 crc kubenswrapper[4779]: E0320 16:10:00.150177 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e63bf33-6a78-4424-97d5-7eed781817d1" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 20 16:10:00 crc kubenswrapper[4779]: I0320 16:10:00.150187 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e63bf33-6a78-4424-97d5-7eed781817d1" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 20 16:10:00 crc kubenswrapper[4779]: I0320 16:10:00.150434 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e63bf33-6a78-4424-97d5-7eed781817d1" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 20 16:10:00 crc kubenswrapper[4779]: I0320 16:10:00.150559 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="50f0a1ab-acea-4e33-820d-ff0e501ce17a" containerName="oc" Mar 20 16:10:00 crc kubenswrapper[4779]: I0320 16:10:00.150576 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="3533115d-34f9-4980-aab2-75964b9a8b8f" containerName="registry-server" Mar 20 16:10:00 crc kubenswrapper[4779]: I0320 16:10:00.151558 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567050-w4ffw" Mar 20 16:10:00 crc kubenswrapper[4779]: I0320 16:10:00.154020 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-92t9d" Mar 20 16:10:00 crc kubenswrapper[4779]: I0320 16:10:00.154391 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:10:00 crc kubenswrapper[4779]: I0320 16:10:00.154692 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:10:00 crc kubenswrapper[4779]: I0320 16:10:00.157341 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567050-w4ffw"] Mar 20 16:10:00 crc kubenswrapper[4779]: I0320 16:10:00.279467 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7n9k\" (UniqueName: \"kubernetes.io/projected/b3ec02b7-0eff-4287-89d6-462fff4f37f5-kube-api-access-z7n9k\") pod \"auto-csr-approver-29567050-w4ffw\" (UID: \"b3ec02b7-0eff-4287-89d6-462fff4f37f5\") " pod="openshift-infra/auto-csr-approver-29567050-w4ffw" Mar 20 16:10:00 crc kubenswrapper[4779]: I0320 16:10:00.381676 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7n9k\" (UniqueName: \"kubernetes.io/projected/b3ec02b7-0eff-4287-89d6-462fff4f37f5-kube-api-access-z7n9k\") pod \"auto-csr-approver-29567050-w4ffw\" (UID: \"b3ec02b7-0eff-4287-89d6-462fff4f37f5\") " pod="openshift-infra/auto-csr-approver-29567050-w4ffw" Mar 20 16:10:00 crc kubenswrapper[4779]: I0320 16:10:00.408810 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7n9k\" (UniqueName: \"kubernetes.io/projected/b3ec02b7-0eff-4287-89d6-462fff4f37f5-kube-api-access-z7n9k\") pod \"auto-csr-approver-29567050-w4ffw\" (UID: \"b3ec02b7-0eff-4287-89d6-462fff4f37f5\") " 
pod="openshift-infra/auto-csr-approver-29567050-w4ffw" Mar 20 16:10:00 crc kubenswrapper[4779]: I0320 16:10:00.470045 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567050-w4ffw" Mar 20 16:10:00 crc kubenswrapper[4779]: I0320 16:10:00.903318 4779 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 16:10:00 crc kubenswrapper[4779]: I0320 16:10:00.904582 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567050-w4ffw"] Mar 20 16:10:01 crc kubenswrapper[4779]: I0320 16:10:01.212991 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567050-w4ffw" event={"ID":"b3ec02b7-0eff-4287-89d6-462fff4f37f5","Type":"ContainerStarted","Data":"8dd27406c7dbba5e43ef54783233bf2ea48aaafee0afadf857c53fe5ac21be11"} Mar 20 16:10:03 crc kubenswrapper[4779]: I0320 16:10:03.233935 4779 generic.go:334] "Generic (PLEG): container finished" podID="b3ec02b7-0eff-4287-89d6-462fff4f37f5" containerID="5af3d1f7aeffdd0681f58c8567ad72e3db631a389542c85b739b948d69c30c44" exitCode=0 Mar 20 16:10:03 crc kubenswrapper[4779]: I0320 16:10:03.234273 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567050-w4ffw" event={"ID":"b3ec02b7-0eff-4287-89d6-462fff4f37f5","Type":"ContainerDied","Data":"5af3d1f7aeffdd0681f58c8567ad72e3db631a389542c85b739b948d69c30c44"} Mar 20 16:10:04 crc kubenswrapper[4779]: I0320 16:10:04.583791 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567050-w4ffw" Mar 20 16:10:04 crc kubenswrapper[4779]: I0320 16:10:04.676960 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7n9k\" (UniqueName: \"kubernetes.io/projected/b3ec02b7-0eff-4287-89d6-462fff4f37f5-kube-api-access-z7n9k\") pod \"b3ec02b7-0eff-4287-89d6-462fff4f37f5\" (UID: \"b3ec02b7-0eff-4287-89d6-462fff4f37f5\") " Mar 20 16:10:04 crc kubenswrapper[4779]: I0320 16:10:04.682459 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3ec02b7-0eff-4287-89d6-462fff4f37f5-kube-api-access-z7n9k" (OuterVolumeSpecName: "kube-api-access-z7n9k") pod "b3ec02b7-0eff-4287-89d6-462fff4f37f5" (UID: "b3ec02b7-0eff-4287-89d6-462fff4f37f5"). InnerVolumeSpecName "kube-api-access-z7n9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:10:04 crc kubenswrapper[4779]: I0320 16:10:04.779001 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7n9k\" (UniqueName: \"kubernetes.io/projected/b3ec02b7-0eff-4287-89d6-462fff4f37f5-kube-api-access-z7n9k\") on node \"crc\" DevicePath \"\"" Mar 20 16:10:05 crc kubenswrapper[4779]: I0320 16:10:05.254406 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567050-w4ffw" event={"ID":"b3ec02b7-0eff-4287-89d6-462fff4f37f5","Type":"ContainerDied","Data":"8dd27406c7dbba5e43ef54783233bf2ea48aaafee0afadf857c53fe5ac21be11"} Mar 20 16:10:05 crc kubenswrapper[4779]: I0320 16:10:05.254733 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8dd27406c7dbba5e43ef54783233bf2ea48aaafee0afadf857c53fe5ac21be11" Mar 20 16:10:05 crc kubenswrapper[4779]: I0320 16:10:05.254513 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567050-w4ffw" Mar 20 16:10:05 crc kubenswrapper[4779]: I0320 16:10:05.663361 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567044-nvf8z"] Mar 20 16:10:05 crc kubenswrapper[4779]: I0320 16:10:05.672574 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567044-nvf8z"] Mar 20 16:10:05 crc kubenswrapper[4779]: I0320 16:10:05.821694 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93564d6a-746c-44d4-b947-1294540a732c" path="/var/lib/kubelet/pods/93564d6a-746c-44d4-b947-1294540a732c/volumes" Mar 20 16:10:06 crc kubenswrapper[4779]: I0320 16:10:06.982530 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 20 16:10:06 crc kubenswrapper[4779]: I0320 16:10:06.983178 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="301642e3-d84f-41af-8795-8042fdccdade" containerName="prometheus" containerID="cri-o://cb10afbfc396d8304bf390816f713e5cc6e42813fced6523d1a817a206dfa120" gracePeriod=600 Mar 20 16:10:06 crc kubenswrapper[4779]: I0320 16:10:06.983295 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="301642e3-d84f-41af-8795-8042fdccdade" containerName="thanos-sidecar" containerID="cri-o://cefc812fc8f689b671605fb91effc3fd40540a63141d17df6d8706f08b964470" gracePeriod=600 Mar 20 16:10:06 crc kubenswrapper[4779]: I0320 16:10:06.983452 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="301642e3-d84f-41af-8795-8042fdccdade" containerName="config-reloader" containerID="cri-o://9f62ba9deb35721191dd502b1a28133dd9c53cfe4776866d448c42761ba6f063" gracePeriod=600 Mar 20 16:10:07 crc kubenswrapper[4779]: I0320 16:10:07.274310 4779 
generic.go:334] "Generic (PLEG): container finished" podID="301642e3-d84f-41af-8795-8042fdccdade" containerID="cefc812fc8f689b671605fb91effc3fd40540a63141d17df6d8706f08b964470" exitCode=0 Mar 20 16:10:07 crc kubenswrapper[4779]: I0320 16:10:07.274343 4779 generic.go:334] "Generic (PLEG): container finished" podID="301642e3-d84f-41af-8795-8042fdccdade" containerID="cb10afbfc396d8304bf390816f713e5cc6e42813fced6523d1a817a206dfa120" exitCode=0 Mar 20 16:10:07 crc kubenswrapper[4779]: I0320 16:10:07.274362 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"301642e3-d84f-41af-8795-8042fdccdade","Type":"ContainerDied","Data":"cefc812fc8f689b671605fb91effc3fd40540a63141d17df6d8706f08b964470"} Mar 20 16:10:07 crc kubenswrapper[4779]: I0320 16:10:07.274387 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"301642e3-d84f-41af-8795-8042fdccdade","Type":"ContainerDied","Data":"cb10afbfc396d8304bf390816f713e5cc6e42813fced6523d1a817a206dfa120"} Mar 20 16:10:07 crc kubenswrapper[4779]: I0320 16:10:07.954425 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.042903 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/301642e3-d84f-41af-8795-8042fdccdade-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"301642e3-d84f-41af-8795-8042fdccdade\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.042940 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/301642e3-d84f-41af-8795-8042fdccdade-tls-assets\") pod \"301642e3-d84f-41af-8795-8042fdccdade\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.043068 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7954f494-7bea-46f8-a58f-4de62431c2b8\") pod \"301642e3-d84f-41af-8795-8042fdccdade\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.043189 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/301642e3-d84f-41af-8795-8042fdccdade-prometheus-metric-storage-rulefiles-1\") pod \"301642e3-d84f-41af-8795-8042fdccdade\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.043213 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/301642e3-d84f-41af-8795-8042fdccdade-config-out\") pod \"301642e3-d84f-41af-8795-8042fdccdade\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " Mar 
20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.043252 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/301642e3-d84f-41af-8795-8042fdccdade-web-config\") pod \"301642e3-d84f-41af-8795-8042fdccdade\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.043306 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/301642e3-d84f-41af-8795-8042fdccdade-prometheus-metric-storage-rulefiles-2\") pod \"301642e3-d84f-41af-8795-8042fdccdade\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.043340 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/301642e3-d84f-41af-8795-8042fdccdade-prometheus-metric-storage-rulefiles-0\") pod \"301642e3-d84f-41af-8795-8042fdccdade\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.043374 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/301642e3-d84f-41af-8795-8042fdccdade-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"301642e3-d84f-41af-8795-8042fdccdade\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.043444 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/301642e3-d84f-41af-8795-8042fdccdade-config\") pod \"301642e3-d84f-41af-8795-8042fdccdade\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.043475 4779 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmg7r\" (UniqueName: \"kubernetes.io/projected/301642e3-d84f-41af-8795-8042fdccdade-kube-api-access-pmg7r\") pod \"301642e3-d84f-41af-8795-8042fdccdade\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.043493 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/301642e3-d84f-41af-8795-8042fdccdade-secret-combined-ca-bundle\") pod \"301642e3-d84f-41af-8795-8042fdccdade\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.043541 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/301642e3-d84f-41af-8795-8042fdccdade-thanos-prometheus-http-client-file\") pod \"301642e3-d84f-41af-8795-8042fdccdade\" (UID: \"301642e3-d84f-41af-8795-8042fdccdade\") " Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.044996 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/301642e3-d84f-41af-8795-8042fdccdade-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "301642e3-d84f-41af-8795-8042fdccdade" (UID: "301642e3-d84f-41af-8795-8042fdccdade"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.045761 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/301642e3-d84f-41af-8795-8042fdccdade-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "301642e3-d84f-41af-8795-8042fdccdade" (UID: "301642e3-d84f-41af-8795-8042fdccdade"). 
InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.046623 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/301642e3-d84f-41af-8795-8042fdccdade-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "301642e3-d84f-41af-8795-8042fdccdade" (UID: "301642e3-d84f-41af-8795-8042fdccdade"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.048697 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/301642e3-d84f-41af-8795-8042fdccdade-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "301642e3-d84f-41af-8795-8042fdccdade" (UID: "301642e3-d84f-41af-8795-8042fdccdade"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.049715 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/301642e3-d84f-41af-8795-8042fdccdade-config" (OuterVolumeSpecName: "config") pod "301642e3-d84f-41af-8795-8042fdccdade" (UID: "301642e3-d84f-41af-8795-8042fdccdade"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.051087 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/301642e3-d84f-41af-8795-8042fdccdade-kube-api-access-pmg7r" (OuterVolumeSpecName: "kube-api-access-pmg7r") pod "301642e3-d84f-41af-8795-8042fdccdade" (UID: "301642e3-d84f-41af-8795-8042fdccdade"). InnerVolumeSpecName "kube-api-access-pmg7r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.051483 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/301642e3-d84f-41af-8795-8042fdccdade-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "301642e3-d84f-41af-8795-8042fdccdade" (UID: "301642e3-d84f-41af-8795-8042fdccdade"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.051691 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/301642e3-d84f-41af-8795-8042fdccdade-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "301642e3-d84f-41af-8795-8042fdccdade" (UID: "301642e3-d84f-41af-8795-8042fdccdade"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.052492 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/301642e3-d84f-41af-8795-8042fdccdade-config-out" (OuterVolumeSpecName: "config-out") pod "301642e3-d84f-41af-8795-8042fdccdade" (UID: "301642e3-d84f-41af-8795-8042fdccdade"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.056985 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/301642e3-d84f-41af-8795-8042fdccdade-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "301642e3-d84f-41af-8795-8042fdccdade" (UID: "301642e3-d84f-41af-8795-8042fdccdade"). InnerVolumeSpecName "secret-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.063739 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/301642e3-d84f-41af-8795-8042fdccdade-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "301642e3-d84f-41af-8795-8042fdccdade" (UID: "301642e3-d84f-41af-8795-8042fdccdade"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.086567 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7954f494-7bea-46f8-a58f-4de62431c2b8" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "301642e3-d84f-41af-8795-8042fdccdade" (UID: "301642e3-d84f-41af-8795-8042fdccdade"). InnerVolumeSpecName "pvc-7954f494-7bea-46f8-a58f-4de62431c2b8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.127078 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/301642e3-d84f-41af-8795-8042fdccdade-web-config" (OuterVolumeSpecName: "web-config") pod "301642e3-d84f-41af-8795-8042fdccdade" (UID: "301642e3-d84f-41af-8795-8042fdccdade"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.145718 4779 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/301642e3-d84f-41af-8795-8042fdccdade-web-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.145975 4779 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/301642e3-d84f-41af-8795-8042fdccdade-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.146045 4779 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/301642e3-d84f-41af-8795-8042fdccdade-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.146126 4779 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/301642e3-d84f-41af-8795-8042fdccdade-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.146186 4779 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/301642e3-d84f-41af-8795-8042fdccdade-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.146247 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmg7r\" (UniqueName: \"kubernetes.io/projected/301642e3-d84f-41af-8795-8042fdccdade-kube-api-access-pmg7r\") on node \"crc\" DevicePath \"\"" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.146309 4779 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/301642e3-d84f-41af-8795-8042fdccdade-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.146365 4779 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/301642e3-d84f-41af-8795-8042fdccdade-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.146457 4779 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/301642e3-d84f-41af-8795-8042fdccdade-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.146514 4779 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/301642e3-d84f-41af-8795-8042fdccdade-tls-assets\") on node \"crc\" DevicePath \"\"" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.146597 4779 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-7954f494-7bea-46f8-a58f-4de62431c2b8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7954f494-7bea-46f8-a58f-4de62431c2b8\") on node \"crc\" " Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.146655 4779 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/301642e3-d84f-41af-8795-8042fdccdade-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.146727 4779 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/301642e3-d84f-41af-8795-8042fdccdade-config-out\") on node \"crc\" DevicePath \"\"" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 
16:10:08.183908 4779 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.184079 4779 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-7954f494-7bea-46f8-a58f-4de62431c2b8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7954f494-7bea-46f8-a58f-4de62431c2b8") on node "crc" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.248477 4779 reconciler_common.go:293] "Volume detached for volume \"pvc-7954f494-7bea-46f8-a58f-4de62431c2b8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7954f494-7bea-46f8-a58f-4de62431c2b8\") on node \"crc\" DevicePath \"\"" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.286871 4779 generic.go:334] "Generic (PLEG): container finished" podID="301642e3-d84f-41af-8795-8042fdccdade" containerID="9f62ba9deb35721191dd502b1a28133dd9c53cfe4776866d448c42761ba6f063" exitCode=0 Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.286917 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"301642e3-d84f-41af-8795-8042fdccdade","Type":"ContainerDied","Data":"9f62ba9deb35721191dd502b1a28133dd9c53cfe4776866d448c42761ba6f063"} Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.286944 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"301642e3-d84f-41af-8795-8042fdccdade","Type":"ContainerDied","Data":"8d54f66b79af85a56279f9b1b164a89d76b6c16ae323a30871027a8629bdc3b3"} Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.286961 4779 scope.go:117] "RemoveContainer" containerID="cefc812fc8f689b671605fb91effc3fd40540a63141d17df6d8706f08b964470" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.287019 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.307780 4779 scope.go:117] "RemoveContainer" containerID="9f62ba9deb35721191dd502b1a28133dd9c53cfe4776866d448c42761ba6f063" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.343311 4779 scope.go:117] "RemoveContainer" containerID="cb10afbfc396d8304bf390816f713e5cc6e42813fced6523d1a817a206dfa120" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.358126 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.368067 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.368263 4779 scope.go:117] "RemoveContainer" containerID="9a06996bcb2c68cf44a7c9d46e3952e119f39c9c7a4b5fb6b9c26ac932c0ad5e" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.376691 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 20 16:10:08 crc kubenswrapper[4779]: E0320 16:10:08.377231 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="301642e3-d84f-41af-8795-8042fdccdade" containerName="thanos-sidecar" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.377254 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="301642e3-d84f-41af-8795-8042fdccdade" containerName="thanos-sidecar" Mar 20 16:10:08 crc kubenswrapper[4779]: E0320 16:10:08.377274 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="301642e3-d84f-41af-8795-8042fdccdade" containerName="init-config-reloader" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.377283 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="301642e3-d84f-41af-8795-8042fdccdade" containerName="init-config-reloader" Mar 20 16:10:08 crc kubenswrapper[4779]: E0320 16:10:08.377311 4779 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="301642e3-d84f-41af-8795-8042fdccdade" containerName="prometheus" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.377317 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="301642e3-d84f-41af-8795-8042fdccdade" containerName="prometheus" Mar 20 16:10:08 crc kubenswrapper[4779]: E0320 16:10:08.377333 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="301642e3-d84f-41af-8795-8042fdccdade" containerName="config-reloader" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.377339 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="301642e3-d84f-41af-8795-8042fdccdade" containerName="config-reloader" Mar 20 16:10:08 crc kubenswrapper[4779]: E0320 16:10:08.377363 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3ec02b7-0eff-4287-89d6-462fff4f37f5" containerName="oc" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.377371 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3ec02b7-0eff-4287-89d6-462fff4f37f5" containerName="oc" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.377579 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="301642e3-d84f-41af-8795-8042fdccdade" containerName="thanos-sidecar" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.377595 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="301642e3-d84f-41af-8795-8042fdccdade" containerName="prometheus" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.377603 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3ec02b7-0eff-4287-89d6-462fff4f37f5" containerName="oc" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.377621 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="301642e3-d84f-41af-8795-8042fdccdade" containerName="config-reloader" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.379930 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.382918 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.382945 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.383239 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.383745 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.383911 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-cfgwn" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.384180 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.386521 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.387565 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.391919 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.439827 4779 scope.go:117] "RemoveContainer" containerID="cefc812fc8f689b671605fb91effc3fd40540a63141d17df6d8706f08b964470" Mar 20 16:10:08 crc kubenswrapper[4779]: E0320 16:10:08.440441 4779 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"cefc812fc8f689b671605fb91effc3fd40540a63141d17df6d8706f08b964470\": container with ID starting with cefc812fc8f689b671605fb91effc3fd40540a63141d17df6d8706f08b964470 not found: ID does not exist" containerID="cefc812fc8f689b671605fb91effc3fd40540a63141d17df6d8706f08b964470" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.440532 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cefc812fc8f689b671605fb91effc3fd40540a63141d17df6d8706f08b964470"} err="failed to get container status \"cefc812fc8f689b671605fb91effc3fd40540a63141d17df6d8706f08b964470\": rpc error: code = NotFound desc = could not find container \"cefc812fc8f689b671605fb91effc3fd40540a63141d17df6d8706f08b964470\": container with ID starting with cefc812fc8f689b671605fb91effc3fd40540a63141d17df6d8706f08b964470 not found: ID does not exist" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.440633 4779 scope.go:117] "RemoveContainer" containerID="9f62ba9deb35721191dd502b1a28133dd9c53cfe4776866d448c42761ba6f063" Mar 20 16:10:08 crc kubenswrapper[4779]: E0320 16:10:08.441536 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f62ba9deb35721191dd502b1a28133dd9c53cfe4776866d448c42761ba6f063\": container with ID starting with 9f62ba9deb35721191dd502b1a28133dd9c53cfe4776866d448c42761ba6f063 not found: ID does not exist" containerID="9f62ba9deb35721191dd502b1a28133dd9c53cfe4776866d448c42761ba6f063" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.441596 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f62ba9deb35721191dd502b1a28133dd9c53cfe4776866d448c42761ba6f063"} err="failed to get container status \"9f62ba9deb35721191dd502b1a28133dd9c53cfe4776866d448c42761ba6f063\": rpc error: code = NotFound desc = could not find container 
\"9f62ba9deb35721191dd502b1a28133dd9c53cfe4776866d448c42761ba6f063\": container with ID starting with 9f62ba9deb35721191dd502b1a28133dd9c53cfe4776866d448c42761ba6f063 not found: ID does not exist" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.441630 4779 scope.go:117] "RemoveContainer" containerID="cb10afbfc396d8304bf390816f713e5cc6e42813fced6523d1a817a206dfa120" Mar 20 16:10:08 crc kubenswrapper[4779]: E0320 16:10:08.442533 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb10afbfc396d8304bf390816f713e5cc6e42813fced6523d1a817a206dfa120\": container with ID starting with cb10afbfc396d8304bf390816f713e5cc6e42813fced6523d1a817a206dfa120 not found: ID does not exist" containerID="cb10afbfc396d8304bf390816f713e5cc6e42813fced6523d1a817a206dfa120" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.442564 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb10afbfc396d8304bf390816f713e5cc6e42813fced6523d1a817a206dfa120"} err="failed to get container status \"cb10afbfc396d8304bf390816f713e5cc6e42813fced6523d1a817a206dfa120\": rpc error: code = NotFound desc = could not find container \"cb10afbfc396d8304bf390816f713e5cc6e42813fced6523d1a817a206dfa120\": container with ID starting with cb10afbfc396d8304bf390816f713e5cc6e42813fced6523d1a817a206dfa120 not found: ID does not exist" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.442583 4779 scope.go:117] "RemoveContainer" containerID="9a06996bcb2c68cf44a7c9d46e3952e119f39c9c7a4b5fb6b9c26ac932c0ad5e" Mar 20 16:10:08 crc kubenswrapper[4779]: E0320 16:10:08.443900 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a06996bcb2c68cf44a7c9d46e3952e119f39c9c7a4b5fb6b9c26ac932c0ad5e\": container with ID starting with 9a06996bcb2c68cf44a7c9d46e3952e119f39c9c7a4b5fb6b9c26ac932c0ad5e not found: ID does not exist" 
containerID="9a06996bcb2c68cf44a7c9d46e3952e119f39c9c7a4b5fb6b9c26ac932c0ad5e" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.443993 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a06996bcb2c68cf44a7c9d46e3952e119f39c9c7a4b5fb6b9c26ac932c0ad5e"} err="failed to get container status \"9a06996bcb2c68cf44a7c9d46e3952e119f39c9c7a4b5fb6b9c26ac932c0ad5e\": rpc error: code = NotFound desc = could not find container \"9a06996bcb2c68cf44a7c9d46e3952e119f39c9c7a4b5fb6b9c26ac932c0ad5e\": container with ID starting with 9a06996bcb2c68cf44a7c9d46e3952e119f39c9c7a4b5fb6b9c26ac932c0ad5e not found: ID does not exist" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.453339 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/81f80e68-614e-43b0-966a-e487faa0db31-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.453423 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/81f80e68-614e-43b0-966a-e487faa0db31-config\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.453464 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/81f80e68-614e-43b0-966a-e487faa0db31-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: 
I0320 16:10:08.453537 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/81f80e68-614e-43b0-966a-e487faa0db31-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.453585 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/81f80e68-614e-43b0-966a-e487faa0db31-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.453660 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/81f80e68-614e-43b0-966a-e487faa0db31-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.453706 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81f80e68-614e-43b0-966a-e487faa0db31-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.453730 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/81f80e68-614e-43b0-966a-e487faa0db31-thanos-prometheus-http-client-file\") 
pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.453781 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/81f80e68-614e-43b0-966a-e487faa0db31-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.453804 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/81f80e68-614e-43b0-966a-e487faa0db31-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.453864 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/81f80e68-614e-43b0-966a-e487faa0db31-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.453928 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7954f494-7bea-46f8-a58f-4de62431c2b8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7954f494-7bea-46f8-a58f-4de62431c2b8\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.453953 4779 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfgcx\" (UniqueName: \"kubernetes.io/projected/81f80e68-614e-43b0-966a-e487faa0db31-kube-api-access-bfgcx\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.555745 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/81f80e68-614e-43b0-966a-e487faa0db31-config\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.555828 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/81f80e68-614e-43b0-966a-e487faa0db31-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.555912 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/81f80e68-614e-43b0-966a-e487faa0db31-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.556891 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/81f80e68-614e-43b0-966a-e487faa0db31-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: 
\"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.555942 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/81f80e68-614e-43b0-966a-e487faa0db31-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.556995 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/81f80e68-614e-43b0-966a-e487faa0db31-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.557036 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81f80e68-614e-43b0-966a-e487faa0db31-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.557058 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/81f80e68-614e-43b0-966a-e487faa0db31-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.557092 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/81f80e68-614e-43b0-966a-e487faa0db31-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.557160 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/81f80e68-614e-43b0-966a-e487faa0db31-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.557194 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/81f80e68-614e-43b0-966a-e487faa0db31-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.557253 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7954f494-7bea-46f8-a58f-4de62431c2b8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7954f494-7bea-46f8-a58f-4de62431c2b8\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.557277 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfgcx\" (UniqueName: \"kubernetes.io/projected/81f80e68-614e-43b0-966a-e487faa0db31-kube-api-access-bfgcx\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.557385 4779 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/81f80e68-614e-43b0-966a-e487faa0db31-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.558631 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/81f80e68-614e-43b0-966a-e487faa0db31-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.559800 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/81f80e68-614e-43b0-966a-e487faa0db31-config\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.560008 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/81f80e68-614e-43b0-966a-e487faa0db31-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.560322 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/81f80e68-614e-43b0-966a-e487faa0db31-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc 
kubenswrapper[4779]: I0320 16:10:08.562146 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/81f80e68-614e-43b0-966a-e487faa0db31-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.562666 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/81f80e68-614e-43b0-966a-e487faa0db31-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.563486 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/81f80e68-614e-43b0-966a-e487faa0db31-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.564500 4779 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.564538 4779 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7954f494-7bea-46f8-a58f-4de62431c2b8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7954f494-7bea-46f8-a58f-4de62431c2b8\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/63343ced4decb75fd431545090989bf5e441d5fa06a6828c5f39e243f0a750bf/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.566275 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/81f80e68-614e-43b0-966a-e487faa0db31-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.572520 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/81f80e68-614e-43b0-966a-e487faa0db31-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.579487 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81f80e68-614e-43b0-966a-e487faa0db31-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.584772 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfgcx\" (UniqueName: 
\"kubernetes.io/projected/81f80e68-614e-43b0-966a-e487faa0db31-kube-api-access-bfgcx\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.619744 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7954f494-7bea-46f8-a58f-4de62431c2b8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7954f494-7bea-46f8-a58f-4de62431c2b8\") pod \"prometheus-metric-storage-0\" (UID: \"81f80e68-614e-43b0-966a-e487faa0db31\") " pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:08 crc kubenswrapper[4779]: I0320 16:10:08.769946 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:09 crc kubenswrapper[4779]: I0320 16:10:09.272559 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 20 16:10:09 crc kubenswrapper[4779]: I0320 16:10:09.302051 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"81f80e68-614e-43b0-966a-e487faa0db31","Type":"ContainerStarted","Data":"b9d380d5010134f8703e0e8ba1745d2d18ff2b39ad64cd320d82ffe39afe2e3b"} Mar 20 16:10:09 crc kubenswrapper[4779]: I0320 16:10:09.822280 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="301642e3-d84f-41af-8795-8042fdccdade" path="/var/lib/kubelet/pods/301642e3-d84f-41af-8795-8042fdccdade/volumes" Mar 20 16:10:12 crc kubenswrapper[4779]: I0320 16:10:12.326527 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"81f80e68-614e-43b0-966a-e487faa0db31","Type":"ContainerStarted","Data":"f686a2a85df10f192682096fbe0c291fa6367d47c71097bccc31f7274c7293b7"} Mar 20 16:10:18 crc kubenswrapper[4779]: I0320 16:10:18.394756 4779 generic.go:334] "Generic (PLEG): container finished" 
podID="81f80e68-614e-43b0-966a-e487faa0db31" containerID="f686a2a85df10f192682096fbe0c291fa6367d47c71097bccc31f7274c7293b7" exitCode=0 Mar 20 16:10:18 crc kubenswrapper[4779]: I0320 16:10:18.395286 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"81f80e68-614e-43b0-966a-e487faa0db31","Type":"ContainerDied","Data":"f686a2a85df10f192682096fbe0c291fa6367d47c71097bccc31f7274c7293b7"} Mar 20 16:10:19 crc kubenswrapper[4779]: I0320 16:10:19.406768 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"81f80e68-614e-43b0-966a-e487faa0db31","Type":"ContainerStarted","Data":"fcd9224c7fc35b12b77c402502664c8114cbc61f886b46d9c61dbe34e748c537"} Mar 20 16:10:22 crc kubenswrapper[4779]: I0320 16:10:22.438247 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"81f80e68-614e-43b0-966a-e487faa0db31","Type":"ContainerStarted","Data":"b804f27aeeb330d5047ecb578804c25d4fd84851d6f07d703ad6c34ceee758e5"} Mar 20 16:10:22 crc kubenswrapper[4779]: I0320 16:10:22.438818 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"81f80e68-614e-43b0-966a-e487faa0db31","Type":"ContainerStarted","Data":"1e1a097dc22746cb22236f8f16b1d9bd5280ce52f0a1ed0d2c2d675d1368d4dc"} Mar 20 16:10:22 crc kubenswrapper[4779]: I0320 16:10:22.483790 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=14.483767519 podStartE2EDuration="14.483767519s" podCreationTimestamp="2026-03-20 16:10:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:10:22.475283491 +0000 UTC m=+2839.437799311" watchObservedRunningTime="2026-03-20 16:10:22.483767519 +0000 UTC m=+2839.446283319" Mar 20 16:10:23 crc kubenswrapper[4779]: 
I0320 16:10:23.773250 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:23 crc kubenswrapper[4779]: I0320 16:10:23.774041 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:23 crc kubenswrapper[4779]: I0320 16:10:23.778098 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:24 crc kubenswrapper[4779]: I0320 16:10:24.460175 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 20 16:10:25 crc kubenswrapper[4779]: I0320 16:10:25.150136 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:10:25 crc kubenswrapper[4779]: I0320 16:10:25.150220 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:10:44 crc kubenswrapper[4779]: I0320 16:10:44.113427 4779 scope.go:117] "RemoveContainer" containerID="003f8a90c7e133e93d79e6ec523e2795616db6f4b954dc2dcc75388bbca63728" Mar 20 16:10:52 crc kubenswrapper[4779]: I0320 16:10:52.368527 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 20 16:10:52 crc kubenswrapper[4779]: I0320 16:10:52.370378 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 20 16:10:52 crc kubenswrapper[4779]: I0320 16:10:52.373280 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-tkd2f" Mar 20 16:10:52 crc kubenswrapper[4779]: I0320 16:10:52.373511 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 20 16:10:52 crc kubenswrapper[4779]: I0320 16:10:52.374386 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 20 16:10:52 crc kubenswrapper[4779]: I0320 16:10:52.375915 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 20 16:10:52 crc kubenswrapper[4779]: I0320 16:10:52.382309 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 20 16:10:52 crc kubenswrapper[4779]: I0320 16:10:52.424855 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/19f31371-34ad-444f-98e9-1cd99dbe6b24-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"19f31371-34ad-444f-98e9-1cd99dbe6b24\") " pod="openstack/tempest-tests-tempest" Mar 20 16:10:52 crc kubenswrapper[4779]: I0320 16:10:52.424999 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19f31371-34ad-444f-98e9-1cd99dbe6b24-config-data\") pod \"tempest-tests-tempest\" (UID: \"19f31371-34ad-444f-98e9-1cd99dbe6b24\") " pod="openstack/tempest-tests-tempest" Mar 20 16:10:52 crc kubenswrapper[4779]: I0320 16:10:52.425090 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/19f31371-34ad-444f-98e9-1cd99dbe6b24-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"19f31371-34ad-444f-98e9-1cd99dbe6b24\") " pod="openstack/tempest-tests-tempest" Mar 20 16:10:52 crc kubenswrapper[4779]: I0320 16:10:52.526789 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19f31371-34ad-444f-98e9-1cd99dbe6b24-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"19f31371-34ad-444f-98e9-1cd99dbe6b24\") " pod="openstack/tempest-tests-tempest" Mar 20 16:10:52 crc kubenswrapper[4779]: I0320 16:10:52.526832 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/19f31371-34ad-444f-98e9-1cd99dbe6b24-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"19f31371-34ad-444f-98e9-1cd99dbe6b24\") " pod="openstack/tempest-tests-tempest" Mar 20 16:10:52 crc kubenswrapper[4779]: I0320 16:10:52.526888 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/19f31371-34ad-444f-98e9-1cd99dbe6b24-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"19f31371-34ad-444f-98e9-1cd99dbe6b24\") " pod="openstack/tempest-tests-tempest" Mar 20 16:10:52 crc kubenswrapper[4779]: I0320 16:10:52.527165 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19f31371-34ad-444f-98e9-1cd99dbe6b24-config-data\") pod \"tempest-tests-tempest\" (UID: \"19f31371-34ad-444f-98e9-1cd99dbe6b24\") " pod="openstack/tempest-tests-tempest" Mar 20 16:10:52 crc kubenswrapper[4779]: I0320 16:10:52.527219 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"19f31371-34ad-444f-98e9-1cd99dbe6b24\") " pod="openstack/tempest-tests-tempest" Mar 20 16:10:52 crc kubenswrapper[4779]: I0320 16:10:52.527321 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/19f31371-34ad-444f-98e9-1cd99dbe6b24-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"19f31371-34ad-444f-98e9-1cd99dbe6b24\") " pod="openstack/tempest-tests-tempest" Mar 20 16:10:52 crc kubenswrapper[4779]: I0320 16:10:52.527428 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6p8q\" (UniqueName: \"kubernetes.io/projected/19f31371-34ad-444f-98e9-1cd99dbe6b24-kube-api-access-n6p8q\") pod \"tempest-tests-tempest\" (UID: \"19f31371-34ad-444f-98e9-1cd99dbe6b24\") " pod="openstack/tempest-tests-tempest" Mar 20 16:10:52 crc kubenswrapper[4779]: I0320 16:10:52.527457 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/19f31371-34ad-444f-98e9-1cd99dbe6b24-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"19f31371-34ad-444f-98e9-1cd99dbe6b24\") " pod="openstack/tempest-tests-tempest" Mar 20 16:10:52 crc kubenswrapper[4779]: I0320 16:10:52.527547 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/19f31371-34ad-444f-98e9-1cd99dbe6b24-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"19f31371-34ad-444f-98e9-1cd99dbe6b24\") " pod="openstack/tempest-tests-tempest" Mar 20 16:10:52 crc kubenswrapper[4779]: I0320 16:10:52.527955 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/19f31371-34ad-444f-98e9-1cd99dbe6b24-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"19f31371-34ad-444f-98e9-1cd99dbe6b24\") " pod="openstack/tempest-tests-tempest" Mar 20 16:10:52 crc kubenswrapper[4779]: I0320 16:10:52.528154 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19f31371-34ad-444f-98e9-1cd99dbe6b24-config-data\") pod \"tempest-tests-tempest\" (UID: \"19f31371-34ad-444f-98e9-1cd99dbe6b24\") " pod="openstack/tempest-tests-tempest" Mar 20 16:10:52 crc kubenswrapper[4779]: I0320 16:10:52.534673 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/19f31371-34ad-444f-98e9-1cd99dbe6b24-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"19f31371-34ad-444f-98e9-1cd99dbe6b24\") " pod="openstack/tempest-tests-tempest" Mar 20 16:10:52 crc kubenswrapper[4779]: I0320 16:10:52.629558 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"19f31371-34ad-444f-98e9-1cd99dbe6b24\") " pod="openstack/tempest-tests-tempest" Mar 20 16:10:52 crc kubenswrapper[4779]: I0320 16:10:52.629643 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/19f31371-34ad-444f-98e9-1cd99dbe6b24-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"19f31371-34ad-444f-98e9-1cd99dbe6b24\") " pod="openstack/tempest-tests-tempest" Mar 20 16:10:52 crc kubenswrapper[4779]: I0320 16:10:52.629697 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6p8q\" (UniqueName: \"kubernetes.io/projected/19f31371-34ad-444f-98e9-1cd99dbe6b24-kube-api-access-n6p8q\") pod 
\"tempest-tests-tempest\" (UID: \"19f31371-34ad-444f-98e9-1cd99dbe6b24\") " pod="openstack/tempest-tests-tempest" Mar 20 16:10:52 crc kubenswrapper[4779]: I0320 16:10:52.629720 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/19f31371-34ad-444f-98e9-1cd99dbe6b24-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"19f31371-34ad-444f-98e9-1cd99dbe6b24\") " pod="openstack/tempest-tests-tempest" Mar 20 16:10:52 crc kubenswrapper[4779]: I0320 16:10:52.629819 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19f31371-34ad-444f-98e9-1cd99dbe6b24-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"19f31371-34ad-444f-98e9-1cd99dbe6b24\") " pod="openstack/tempest-tests-tempest" Mar 20 16:10:52 crc kubenswrapper[4779]: I0320 16:10:52.629845 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/19f31371-34ad-444f-98e9-1cd99dbe6b24-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"19f31371-34ad-444f-98e9-1cd99dbe6b24\") " pod="openstack/tempest-tests-tempest" Mar 20 16:10:52 crc kubenswrapper[4779]: I0320 16:10:52.629981 4779 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"19f31371-34ad-444f-98e9-1cd99dbe6b24\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/tempest-tests-tempest" Mar 20 16:10:52 crc kubenswrapper[4779]: I0320 16:10:52.630841 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/19f31371-34ad-444f-98e9-1cd99dbe6b24-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"19f31371-34ad-444f-98e9-1cd99dbe6b24\") 
" pod="openstack/tempest-tests-tempest" Mar 20 16:10:52 crc kubenswrapper[4779]: I0320 16:10:52.631462 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/19f31371-34ad-444f-98e9-1cd99dbe6b24-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"19f31371-34ad-444f-98e9-1cd99dbe6b24\") " pod="openstack/tempest-tests-tempest" Mar 20 16:10:52 crc kubenswrapper[4779]: I0320 16:10:52.636778 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19f31371-34ad-444f-98e9-1cd99dbe6b24-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"19f31371-34ad-444f-98e9-1cd99dbe6b24\") " pod="openstack/tempest-tests-tempest" Mar 20 16:10:52 crc kubenswrapper[4779]: I0320 16:10:52.637057 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/19f31371-34ad-444f-98e9-1cd99dbe6b24-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"19f31371-34ad-444f-98e9-1cd99dbe6b24\") " pod="openstack/tempest-tests-tempest" Mar 20 16:10:52 crc kubenswrapper[4779]: I0320 16:10:52.647650 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6p8q\" (UniqueName: \"kubernetes.io/projected/19f31371-34ad-444f-98e9-1cd99dbe6b24-kube-api-access-n6p8q\") pod \"tempest-tests-tempest\" (UID: \"19f31371-34ad-444f-98e9-1cd99dbe6b24\") " pod="openstack/tempest-tests-tempest" Mar 20 16:10:52 crc kubenswrapper[4779]: I0320 16:10:52.666194 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"19f31371-34ad-444f-98e9-1cd99dbe6b24\") " pod="openstack/tempest-tests-tempest" Mar 20 16:10:52 crc kubenswrapper[4779]: I0320 16:10:52.692577 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 20 16:10:53 crc kubenswrapper[4779]: I0320 16:10:53.299991 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 20 16:10:53 crc kubenswrapper[4779]: I0320 16:10:53.731420 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"19f31371-34ad-444f-98e9-1cd99dbe6b24","Type":"ContainerStarted","Data":"09ba77b763ee18b7dfaeaa5bb088ddd34639878195494a2002ab0d4119961ee7"} Mar 20 16:10:55 crc kubenswrapper[4779]: I0320 16:10:55.149514 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:10:55 crc kubenswrapper[4779]: I0320 16:10:55.149798 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:10:55 crc kubenswrapper[4779]: I0320 16:10:55.149845 4779 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" Mar 20 16:10:55 crc kubenswrapper[4779]: I0320 16:10:55.150441 4779 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"22009c39699cf0c6f0f90ca317132c546c514865bbbd9b00aca011844a148a84"} pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 16:10:55 crc kubenswrapper[4779]: I0320 16:10:55.150527 4779 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" containerID="cri-o://22009c39699cf0c6f0f90ca317132c546c514865bbbd9b00aca011844a148a84" gracePeriod=600 Mar 20 16:10:55 crc kubenswrapper[4779]: I0320 16:10:55.756103 4779 generic.go:334] "Generic (PLEG): container finished" podID="451fc579-db57-4b36-a775-6d2986de3efc" containerID="22009c39699cf0c6f0f90ca317132c546c514865bbbd9b00aca011844a148a84" exitCode=0 Mar 20 16:10:55 crc kubenswrapper[4779]: I0320 16:10:55.756168 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" event={"ID":"451fc579-db57-4b36-a775-6d2986de3efc","Type":"ContainerDied","Data":"22009c39699cf0c6f0f90ca317132c546c514865bbbd9b00aca011844a148a84"} Mar 20 16:10:55 crc kubenswrapper[4779]: I0320 16:10:55.756436 4779 scope.go:117] "RemoveContainer" containerID="df4a15a4ae213e72d8c7a655c4b8439341ebe84532eaccd1945e22abb1a25311" Mar 20 16:11:05 crc kubenswrapper[4779]: I0320 16:11:05.399385 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 20 16:11:05 crc kubenswrapper[4779]: I0320 16:11:05.855624 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" event={"ID":"451fc579-db57-4b36-a775-6d2986de3efc","Type":"ContainerStarted","Data":"4a7db66155ab4c3a0ae7729cfdb28e2b784d98e18fea6cc1043b584edc741c1f"} Mar 20 16:11:06 crc kubenswrapper[4779]: I0320 16:11:06.867656 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"19f31371-34ad-444f-98e9-1cd99dbe6b24","Type":"ContainerStarted","Data":"5d4ba101a6482e04872e63355fa816358d04d94b9b3241bfd1a4371f899c4dad"} Mar 20 16:11:06 crc kubenswrapper[4779]: I0320 16:11:06.895517 4779 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.80388959 podStartE2EDuration="15.895493824s" podCreationTimestamp="2026-03-20 16:10:51 +0000 UTC" firstStartedPulling="2026-03-20 16:10:53.304543962 +0000 UTC m=+2870.267059762" lastFinishedPulling="2026-03-20 16:11:05.396148196 +0000 UTC m=+2882.358663996" observedRunningTime="2026-03-20 16:11:06.888342518 +0000 UTC m=+2883.850858318" watchObservedRunningTime="2026-03-20 16:11:06.895493824 +0000 UTC m=+2883.858009624" Mar 20 16:11:50 crc kubenswrapper[4779]: I0320 16:11:50.624991 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6zm9m"] Mar 20 16:11:50 crc kubenswrapper[4779]: I0320 16:11:50.628085 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6zm9m" Mar 20 16:11:50 crc kubenswrapper[4779]: I0320 16:11:50.645002 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6zm9m"] Mar 20 16:11:50 crc kubenswrapper[4779]: I0320 16:11:50.768942 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a126824-bdb7-4f00-9da8-064c6d6e17ec-catalog-content\") pod \"community-operators-6zm9m\" (UID: \"0a126824-bdb7-4f00-9da8-064c6d6e17ec\") " pod="openshift-marketplace/community-operators-6zm9m" Mar 20 16:11:50 crc kubenswrapper[4779]: I0320 16:11:50.769056 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9v2q\" (UniqueName: \"kubernetes.io/projected/0a126824-bdb7-4f00-9da8-064c6d6e17ec-kube-api-access-p9v2q\") pod \"community-operators-6zm9m\" (UID: \"0a126824-bdb7-4f00-9da8-064c6d6e17ec\") " pod="openshift-marketplace/community-operators-6zm9m" Mar 20 16:11:50 crc kubenswrapper[4779]: I0320 16:11:50.769085 4779 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a126824-bdb7-4f00-9da8-064c6d6e17ec-utilities\") pod \"community-operators-6zm9m\" (UID: \"0a126824-bdb7-4f00-9da8-064c6d6e17ec\") " pod="openshift-marketplace/community-operators-6zm9m" Mar 20 16:11:50 crc kubenswrapper[4779]: I0320 16:11:50.870757 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9v2q\" (UniqueName: \"kubernetes.io/projected/0a126824-bdb7-4f00-9da8-064c6d6e17ec-kube-api-access-p9v2q\") pod \"community-operators-6zm9m\" (UID: \"0a126824-bdb7-4f00-9da8-064c6d6e17ec\") " pod="openshift-marketplace/community-operators-6zm9m" Mar 20 16:11:50 crc kubenswrapper[4779]: I0320 16:11:50.870806 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a126824-bdb7-4f00-9da8-064c6d6e17ec-utilities\") pod \"community-operators-6zm9m\" (UID: \"0a126824-bdb7-4f00-9da8-064c6d6e17ec\") " pod="openshift-marketplace/community-operators-6zm9m" Mar 20 16:11:50 crc kubenswrapper[4779]: I0320 16:11:50.870908 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a126824-bdb7-4f00-9da8-064c6d6e17ec-catalog-content\") pod \"community-operators-6zm9m\" (UID: \"0a126824-bdb7-4f00-9da8-064c6d6e17ec\") " pod="openshift-marketplace/community-operators-6zm9m" Mar 20 16:11:50 crc kubenswrapper[4779]: I0320 16:11:50.871389 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a126824-bdb7-4f00-9da8-064c6d6e17ec-catalog-content\") pod \"community-operators-6zm9m\" (UID: \"0a126824-bdb7-4f00-9da8-064c6d6e17ec\") " pod="openshift-marketplace/community-operators-6zm9m" Mar 20 16:11:50 crc kubenswrapper[4779]: I0320 16:11:50.871852 4779 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a126824-bdb7-4f00-9da8-064c6d6e17ec-utilities\") pod \"community-operators-6zm9m\" (UID: \"0a126824-bdb7-4f00-9da8-064c6d6e17ec\") " pod="openshift-marketplace/community-operators-6zm9m" Mar 20 16:11:50 crc kubenswrapper[4779]: I0320 16:11:50.892462 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9v2q\" (UniqueName: \"kubernetes.io/projected/0a126824-bdb7-4f00-9da8-064c6d6e17ec-kube-api-access-p9v2q\") pod \"community-operators-6zm9m\" (UID: \"0a126824-bdb7-4f00-9da8-064c6d6e17ec\") " pod="openshift-marketplace/community-operators-6zm9m" Mar 20 16:11:51 crc kubenswrapper[4779]: I0320 16:11:51.002680 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6zm9m" Mar 20 16:11:51 crc kubenswrapper[4779]: I0320 16:11:51.623760 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6zm9m"] Mar 20 16:11:52 crc kubenswrapper[4779]: I0320 16:11:52.314724 4779 generic.go:334] "Generic (PLEG): container finished" podID="0a126824-bdb7-4f00-9da8-064c6d6e17ec" containerID="ec6722779b118cf8da4c8103dc21155750d325c8d4078ef6bec104a7871f8be3" exitCode=0 Mar 20 16:11:52 crc kubenswrapper[4779]: I0320 16:11:52.314767 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6zm9m" event={"ID":"0a126824-bdb7-4f00-9da8-064c6d6e17ec","Type":"ContainerDied","Data":"ec6722779b118cf8da4c8103dc21155750d325c8d4078ef6bec104a7871f8be3"} Mar 20 16:11:52 crc kubenswrapper[4779]: I0320 16:11:52.314791 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6zm9m" event={"ID":"0a126824-bdb7-4f00-9da8-064c6d6e17ec","Type":"ContainerStarted","Data":"6c5d497498508b09fea3907b6012377c0c180680e9aaf9c5804c4882b578f8a9"} Mar 20 16:11:53 crc 
kubenswrapper[4779]: I0320 16:11:53.324165 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6zm9m" event={"ID":"0a126824-bdb7-4f00-9da8-064c6d6e17ec","Type":"ContainerStarted","Data":"e2b10a57f9127e73927955ec8a67214228ef4abeea7f15b53a6544651327daee"} Mar 20 16:11:55 crc kubenswrapper[4779]: I0320 16:11:55.345991 4779 generic.go:334] "Generic (PLEG): container finished" podID="0a126824-bdb7-4f00-9da8-064c6d6e17ec" containerID="e2b10a57f9127e73927955ec8a67214228ef4abeea7f15b53a6544651327daee" exitCode=0 Mar 20 16:11:55 crc kubenswrapper[4779]: I0320 16:11:55.346076 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6zm9m" event={"ID":"0a126824-bdb7-4f00-9da8-064c6d6e17ec","Type":"ContainerDied","Data":"e2b10a57f9127e73927955ec8a67214228ef4abeea7f15b53a6544651327daee"} Mar 20 16:11:56 crc kubenswrapper[4779]: I0320 16:11:56.356912 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6zm9m" event={"ID":"0a126824-bdb7-4f00-9da8-064c6d6e17ec","Type":"ContainerStarted","Data":"7c0d832ea4cc24df3d28583c3fcbc6499c071044f334be5ba89b7aa8f0829504"} Mar 20 16:11:56 crc kubenswrapper[4779]: I0320 16:11:56.373818 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6zm9m" podStartSLOduration=2.83715776 podStartE2EDuration="6.373802767s" podCreationTimestamp="2026-03-20 16:11:50 +0000 UTC" firstStartedPulling="2026-03-20 16:11:52.316748874 +0000 UTC m=+2929.279264674" lastFinishedPulling="2026-03-20 16:11:55.85339388 +0000 UTC m=+2932.815909681" observedRunningTime="2026-03-20 16:11:56.372909365 +0000 UTC m=+2933.335425165" watchObservedRunningTime="2026-03-20 16:11:56.373802767 +0000 UTC m=+2933.336318567" Mar 20 16:12:00 crc kubenswrapper[4779]: I0320 16:12:00.142446 4779 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29567052-fnhnz"] Mar 20 16:12:00 crc kubenswrapper[4779]: I0320 16:12:00.145175 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567052-fnhnz" Mar 20 16:12:00 crc kubenswrapper[4779]: I0320 16:12:00.149269 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:12:00 crc kubenswrapper[4779]: I0320 16:12:00.149652 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:12:00 crc kubenswrapper[4779]: I0320 16:12:00.149876 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-92t9d" Mar 20 16:12:00 crc kubenswrapper[4779]: I0320 16:12:00.150060 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567052-fnhnz"] Mar 20 16:12:00 crc kubenswrapper[4779]: I0320 16:12:00.259274 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bxvf\" (UniqueName: \"kubernetes.io/projected/3f4e99dc-e4a7-4bbe-ae42-7b46c0875bc9-kube-api-access-4bxvf\") pod \"auto-csr-approver-29567052-fnhnz\" (UID: \"3f4e99dc-e4a7-4bbe-ae42-7b46c0875bc9\") " pod="openshift-infra/auto-csr-approver-29567052-fnhnz" Mar 20 16:12:00 crc kubenswrapper[4779]: I0320 16:12:00.361340 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bxvf\" (UniqueName: \"kubernetes.io/projected/3f4e99dc-e4a7-4bbe-ae42-7b46c0875bc9-kube-api-access-4bxvf\") pod \"auto-csr-approver-29567052-fnhnz\" (UID: \"3f4e99dc-e4a7-4bbe-ae42-7b46c0875bc9\") " pod="openshift-infra/auto-csr-approver-29567052-fnhnz" Mar 20 16:12:00 crc kubenswrapper[4779]: I0320 16:12:00.382207 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bxvf\" (UniqueName: 
\"kubernetes.io/projected/3f4e99dc-e4a7-4bbe-ae42-7b46c0875bc9-kube-api-access-4bxvf\") pod \"auto-csr-approver-29567052-fnhnz\" (UID: \"3f4e99dc-e4a7-4bbe-ae42-7b46c0875bc9\") " pod="openshift-infra/auto-csr-approver-29567052-fnhnz" Mar 20 16:12:00 crc kubenswrapper[4779]: I0320 16:12:00.466502 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567052-fnhnz" Mar 20 16:12:00 crc kubenswrapper[4779]: W0320 16:12:00.900643 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f4e99dc_e4a7_4bbe_ae42_7b46c0875bc9.slice/crio-8b1d77e81b0f55b803ebaa5210fbdcc140b5bb8dc754df214adb41d48b1f8615 WatchSource:0}: Error finding container 8b1d77e81b0f55b803ebaa5210fbdcc140b5bb8dc754df214adb41d48b1f8615: Status 404 returned error can't find the container with id 8b1d77e81b0f55b803ebaa5210fbdcc140b5bb8dc754df214adb41d48b1f8615 Mar 20 16:12:00 crc kubenswrapper[4779]: I0320 16:12:00.901050 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567052-fnhnz"] Mar 20 16:12:01 crc kubenswrapper[4779]: I0320 16:12:01.004136 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6zm9m" Mar 20 16:12:01 crc kubenswrapper[4779]: I0320 16:12:01.004735 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6zm9m" Mar 20 16:12:01 crc kubenswrapper[4779]: I0320 16:12:01.054323 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6zm9m" Mar 20 16:12:01 crc kubenswrapper[4779]: I0320 16:12:01.399604 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567052-fnhnz" 
event={"ID":"3f4e99dc-e4a7-4bbe-ae42-7b46c0875bc9","Type":"ContainerStarted","Data":"8b1d77e81b0f55b803ebaa5210fbdcc140b5bb8dc754df214adb41d48b1f8615"} Mar 20 16:12:01 crc kubenswrapper[4779]: I0320 16:12:01.451818 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6zm9m" Mar 20 16:12:01 crc kubenswrapper[4779]: I0320 16:12:01.497160 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6zm9m"] Mar 20 16:12:03 crc kubenswrapper[4779]: I0320 16:12:03.433618 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6zm9m" podUID="0a126824-bdb7-4f00-9da8-064c6d6e17ec" containerName="registry-server" containerID="cri-o://7c0d832ea4cc24df3d28583c3fcbc6499c071044f334be5ba89b7aa8f0829504" gracePeriod=2 Mar 20 16:12:03 crc kubenswrapper[4779]: I0320 16:12:03.435382 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567052-fnhnz" event={"ID":"3f4e99dc-e4a7-4bbe-ae42-7b46c0875bc9","Type":"ContainerStarted","Data":"c591d48bb59c22bdf6540d24d359476316163842c02a7e45a6925091c7c19144"} Mar 20 16:12:03 crc kubenswrapper[4779]: I0320 16:12:03.478191 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567052-fnhnz" podStartSLOduration=1.296301044 podStartE2EDuration="3.478170443s" podCreationTimestamp="2026-03-20 16:12:00 +0000 UTC" firstStartedPulling="2026-03-20 16:12:00.903586065 +0000 UTC m=+2937.866101855" lastFinishedPulling="2026-03-20 16:12:03.085455454 +0000 UTC m=+2940.047971254" observedRunningTime="2026-03-20 16:12:03.458972701 +0000 UTC m=+2940.421488511" watchObservedRunningTime="2026-03-20 16:12:03.478170443 +0000 UTC m=+2940.440686243" Mar 20 16:12:03 crc kubenswrapper[4779]: I0320 16:12:03.932049 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6zm9m" Mar 20 16:12:04 crc kubenswrapper[4779]: I0320 16:12:04.027618 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a126824-bdb7-4f00-9da8-064c6d6e17ec-utilities\") pod \"0a126824-bdb7-4f00-9da8-064c6d6e17ec\" (UID: \"0a126824-bdb7-4f00-9da8-064c6d6e17ec\") " Mar 20 16:12:04 crc kubenswrapper[4779]: I0320 16:12:04.027841 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a126824-bdb7-4f00-9da8-064c6d6e17ec-catalog-content\") pod \"0a126824-bdb7-4f00-9da8-064c6d6e17ec\" (UID: \"0a126824-bdb7-4f00-9da8-064c6d6e17ec\") " Mar 20 16:12:04 crc kubenswrapper[4779]: I0320 16:12:04.028617 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a126824-bdb7-4f00-9da8-064c6d6e17ec-utilities" (OuterVolumeSpecName: "utilities") pod "0a126824-bdb7-4f00-9da8-064c6d6e17ec" (UID: "0a126824-bdb7-4f00-9da8-064c6d6e17ec"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:12:04 crc kubenswrapper[4779]: I0320 16:12:04.033357 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9v2q\" (UniqueName: \"kubernetes.io/projected/0a126824-bdb7-4f00-9da8-064c6d6e17ec-kube-api-access-p9v2q\") pod \"0a126824-bdb7-4f00-9da8-064c6d6e17ec\" (UID: \"0a126824-bdb7-4f00-9da8-064c6d6e17ec\") " Mar 20 16:12:04 crc kubenswrapper[4779]: I0320 16:12:04.034334 4779 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a126824-bdb7-4f00-9da8-064c6d6e17ec-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:12:04 crc kubenswrapper[4779]: I0320 16:12:04.039058 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a126824-bdb7-4f00-9da8-064c6d6e17ec-kube-api-access-p9v2q" (OuterVolumeSpecName: "kube-api-access-p9v2q") pod "0a126824-bdb7-4f00-9da8-064c6d6e17ec" (UID: "0a126824-bdb7-4f00-9da8-064c6d6e17ec"). InnerVolumeSpecName "kube-api-access-p9v2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:12:04 crc kubenswrapper[4779]: I0320 16:12:04.084330 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a126824-bdb7-4f00-9da8-064c6d6e17ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0a126824-bdb7-4f00-9da8-064c6d6e17ec" (UID: "0a126824-bdb7-4f00-9da8-064c6d6e17ec"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:12:04 crc kubenswrapper[4779]: I0320 16:12:04.136063 4779 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a126824-bdb7-4f00-9da8-064c6d6e17ec-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:12:04 crc kubenswrapper[4779]: I0320 16:12:04.136096 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9v2q\" (UniqueName: \"kubernetes.io/projected/0a126824-bdb7-4f00-9da8-064c6d6e17ec-kube-api-access-p9v2q\") on node \"crc\" DevicePath \"\"" Mar 20 16:12:04 crc kubenswrapper[4779]: I0320 16:12:04.461351 4779 generic.go:334] "Generic (PLEG): container finished" podID="3f4e99dc-e4a7-4bbe-ae42-7b46c0875bc9" containerID="c591d48bb59c22bdf6540d24d359476316163842c02a7e45a6925091c7c19144" exitCode=0 Mar 20 16:12:04 crc kubenswrapper[4779]: I0320 16:12:04.461469 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567052-fnhnz" event={"ID":"3f4e99dc-e4a7-4bbe-ae42-7b46c0875bc9","Type":"ContainerDied","Data":"c591d48bb59c22bdf6540d24d359476316163842c02a7e45a6925091c7c19144"} Mar 20 16:12:04 crc kubenswrapper[4779]: I0320 16:12:04.463653 4779 generic.go:334] "Generic (PLEG): container finished" podID="0a126824-bdb7-4f00-9da8-064c6d6e17ec" containerID="7c0d832ea4cc24df3d28583c3fcbc6499c071044f334be5ba89b7aa8f0829504" exitCode=0 Mar 20 16:12:04 crc kubenswrapper[4779]: I0320 16:12:04.463701 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6zm9m" Mar 20 16:12:04 crc kubenswrapper[4779]: I0320 16:12:04.463711 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6zm9m" event={"ID":"0a126824-bdb7-4f00-9da8-064c6d6e17ec","Type":"ContainerDied","Data":"7c0d832ea4cc24df3d28583c3fcbc6499c071044f334be5ba89b7aa8f0829504"} Mar 20 16:12:04 crc kubenswrapper[4779]: I0320 16:12:04.463775 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6zm9m" event={"ID":"0a126824-bdb7-4f00-9da8-064c6d6e17ec","Type":"ContainerDied","Data":"6c5d497498508b09fea3907b6012377c0c180680e9aaf9c5804c4882b578f8a9"} Mar 20 16:12:04 crc kubenswrapper[4779]: I0320 16:12:04.463815 4779 scope.go:117] "RemoveContainer" containerID="7c0d832ea4cc24df3d28583c3fcbc6499c071044f334be5ba89b7aa8f0829504" Mar 20 16:12:04 crc kubenswrapper[4779]: I0320 16:12:04.511191 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6zm9m"] Mar 20 16:12:04 crc kubenswrapper[4779]: I0320 16:12:04.519254 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6zm9m"] Mar 20 16:12:04 crc kubenswrapper[4779]: I0320 16:12:04.520412 4779 scope.go:117] "RemoveContainer" containerID="e2b10a57f9127e73927955ec8a67214228ef4abeea7f15b53a6544651327daee" Mar 20 16:12:04 crc kubenswrapper[4779]: I0320 16:12:04.543489 4779 scope.go:117] "RemoveContainer" containerID="ec6722779b118cf8da4c8103dc21155750d325c8d4078ef6bec104a7871f8be3" Mar 20 16:12:04 crc kubenswrapper[4779]: I0320 16:12:04.579125 4779 scope.go:117] "RemoveContainer" containerID="7c0d832ea4cc24df3d28583c3fcbc6499c071044f334be5ba89b7aa8f0829504" Mar 20 16:12:04 crc kubenswrapper[4779]: E0320 16:12:04.580035 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7c0d832ea4cc24df3d28583c3fcbc6499c071044f334be5ba89b7aa8f0829504\": container with ID starting with 7c0d832ea4cc24df3d28583c3fcbc6499c071044f334be5ba89b7aa8f0829504 not found: ID does not exist" containerID="7c0d832ea4cc24df3d28583c3fcbc6499c071044f334be5ba89b7aa8f0829504" Mar 20 16:12:04 crc kubenswrapper[4779]: I0320 16:12:04.580071 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c0d832ea4cc24df3d28583c3fcbc6499c071044f334be5ba89b7aa8f0829504"} err="failed to get container status \"7c0d832ea4cc24df3d28583c3fcbc6499c071044f334be5ba89b7aa8f0829504\": rpc error: code = NotFound desc = could not find container \"7c0d832ea4cc24df3d28583c3fcbc6499c071044f334be5ba89b7aa8f0829504\": container with ID starting with 7c0d832ea4cc24df3d28583c3fcbc6499c071044f334be5ba89b7aa8f0829504 not found: ID does not exist" Mar 20 16:12:04 crc kubenswrapper[4779]: I0320 16:12:04.580092 4779 scope.go:117] "RemoveContainer" containerID="e2b10a57f9127e73927955ec8a67214228ef4abeea7f15b53a6544651327daee" Mar 20 16:12:04 crc kubenswrapper[4779]: E0320 16:12:04.580755 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2b10a57f9127e73927955ec8a67214228ef4abeea7f15b53a6544651327daee\": container with ID starting with e2b10a57f9127e73927955ec8a67214228ef4abeea7f15b53a6544651327daee not found: ID does not exist" containerID="e2b10a57f9127e73927955ec8a67214228ef4abeea7f15b53a6544651327daee" Mar 20 16:12:04 crc kubenswrapper[4779]: I0320 16:12:04.580825 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2b10a57f9127e73927955ec8a67214228ef4abeea7f15b53a6544651327daee"} err="failed to get container status \"e2b10a57f9127e73927955ec8a67214228ef4abeea7f15b53a6544651327daee\": rpc error: code = NotFound desc = could not find container \"e2b10a57f9127e73927955ec8a67214228ef4abeea7f15b53a6544651327daee\": container with ID 
starting with e2b10a57f9127e73927955ec8a67214228ef4abeea7f15b53a6544651327daee not found: ID does not exist" Mar 20 16:12:04 crc kubenswrapper[4779]: I0320 16:12:04.580870 4779 scope.go:117] "RemoveContainer" containerID="ec6722779b118cf8da4c8103dc21155750d325c8d4078ef6bec104a7871f8be3" Mar 20 16:12:04 crc kubenswrapper[4779]: E0320 16:12:04.581698 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec6722779b118cf8da4c8103dc21155750d325c8d4078ef6bec104a7871f8be3\": container with ID starting with ec6722779b118cf8da4c8103dc21155750d325c8d4078ef6bec104a7871f8be3 not found: ID does not exist" containerID="ec6722779b118cf8da4c8103dc21155750d325c8d4078ef6bec104a7871f8be3" Mar 20 16:12:04 crc kubenswrapper[4779]: I0320 16:12:04.581726 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec6722779b118cf8da4c8103dc21155750d325c8d4078ef6bec104a7871f8be3"} err="failed to get container status \"ec6722779b118cf8da4c8103dc21155750d325c8d4078ef6bec104a7871f8be3\": rpc error: code = NotFound desc = could not find container \"ec6722779b118cf8da4c8103dc21155750d325c8d4078ef6bec104a7871f8be3\": container with ID starting with ec6722779b118cf8da4c8103dc21155750d325c8d4078ef6bec104a7871f8be3 not found: ID does not exist" Mar 20 16:12:05 crc kubenswrapper[4779]: I0320 16:12:05.814717 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567052-fnhnz" Mar 20 16:12:05 crc kubenswrapper[4779]: I0320 16:12:05.819925 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a126824-bdb7-4f00-9da8-064c6d6e17ec" path="/var/lib/kubelet/pods/0a126824-bdb7-4f00-9da8-064c6d6e17ec/volumes" Mar 20 16:12:05 crc kubenswrapper[4779]: I0320 16:12:05.980703 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bxvf\" (UniqueName: \"kubernetes.io/projected/3f4e99dc-e4a7-4bbe-ae42-7b46c0875bc9-kube-api-access-4bxvf\") pod \"3f4e99dc-e4a7-4bbe-ae42-7b46c0875bc9\" (UID: \"3f4e99dc-e4a7-4bbe-ae42-7b46c0875bc9\") " Mar 20 16:12:05 crc kubenswrapper[4779]: I0320 16:12:05.986189 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f4e99dc-e4a7-4bbe-ae42-7b46c0875bc9-kube-api-access-4bxvf" (OuterVolumeSpecName: "kube-api-access-4bxvf") pod "3f4e99dc-e4a7-4bbe-ae42-7b46c0875bc9" (UID: "3f4e99dc-e4a7-4bbe-ae42-7b46c0875bc9"). InnerVolumeSpecName "kube-api-access-4bxvf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:12:06 crc kubenswrapper[4779]: I0320 16:12:06.082776 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bxvf\" (UniqueName: \"kubernetes.io/projected/3f4e99dc-e4a7-4bbe-ae42-7b46c0875bc9-kube-api-access-4bxvf\") on node \"crc\" DevicePath \"\"" Mar 20 16:12:06 crc kubenswrapper[4779]: I0320 16:12:06.482691 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567052-fnhnz" event={"ID":"3f4e99dc-e4a7-4bbe-ae42-7b46c0875bc9","Type":"ContainerDied","Data":"8b1d77e81b0f55b803ebaa5210fbdcc140b5bb8dc754df214adb41d48b1f8615"} Mar 20 16:12:06 crc kubenswrapper[4779]: I0320 16:12:06.483057 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b1d77e81b0f55b803ebaa5210fbdcc140b5bb8dc754df214adb41d48b1f8615" Mar 20 16:12:06 crc kubenswrapper[4779]: I0320 16:12:06.482746 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567052-fnhnz" Mar 20 16:12:06 crc kubenswrapper[4779]: I0320 16:12:06.530450 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567046-wc6s8"] Mar 20 16:12:06 crc kubenswrapper[4779]: I0320 16:12:06.539558 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567046-wc6s8"] Mar 20 16:12:07 crc kubenswrapper[4779]: I0320 16:12:07.818736 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4aa484e-e617-4cc7-a213-995589465356" path="/var/lib/kubelet/pods/f4aa484e-e617-4cc7-a213-995589465356/volumes" Mar 20 16:12:44 crc kubenswrapper[4779]: I0320 16:12:44.313998 4779 scope.go:117] "RemoveContainer" containerID="ffbda15062f12cbed332b3e43304b77f1e62fcf681c66aa037ba74e64f8242c0" Mar 20 16:13:25 crc kubenswrapper[4779]: I0320 16:13:25.149698 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:13:25 crc kubenswrapper[4779]: I0320 16:13:25.150265 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:13:55 crc kubenswrapper[4779]: I0320 16:13:55.150433 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:13:55 crc kubenswrapper[4779]: I0320 16:13:55.151101 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:13:58 crc kubenswrapper[4779]: I0320 16:13:58.740581 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cl7t2"] Mar 20 16:13:58 crc kubenswrapper[4779]: E0320 16:13:58.741938 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a126824-bdb7-4f00-9da8-064c6d6e17ec" containerName="extract-content" Mar 20 16:13:58 crc kubenswrapper[4779]: I0320 16:13:58.741961 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a126824-bdb7-4f00-9da8-064c6d6e17ec" containerName="extract-content" Mar 20 16:13:58 crc kubenswrapper[4779]: E0320 16:13:58.741978 4779 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a126824-bdb7-4f00-9da8-064c6d6e17ec" containerName="extract-utilities"
Mar 20 16:13:58 crc kubenswrapper[4779]: I0320 16:13:58.741992 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a126824-bdb7-4f00-9da8-064c6d6e17ec" containerName="extract-utilities"
Mar 20 16:13:58 crc kubenswrapper[4779]: E0320 16:13:58.742020 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f4e99dc-e4a7-4bbe-ae42-7b46c0875bc9" containerName="oc"
Mar 20 16:13:58 crc kubenswrapper[4779]: I0320 16:13:58.742032 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f4e99dc-e4a7-4bbe-ae42-7b46c0875bc9" containerName="oc"
Mar 20 16:13:58 crc kubenswrapper[4779]: E0320 16:13:58.742070 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a126824-bdb7-4f00-9da8-064c6d6e17ec" containerName="registry-server"
Mar 20 16:13:58 crc kubenswrapper[4779]: I0320 16:13:58.742083 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a126824-bdb7-4f00-9da8-064c6d6e17ec" containerName="registry-server"
Mar 20 16:13:58 crc kubenswrapper[4779]: I0320 16:13:58.742463 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a126824-bdb7-4f00-9da8-064c6d6e17ec" containerName="registry-server"
Mar 20 16:13:58 crc kubenswrapper[4779]: I0320 16:13:58.742495 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f4e99dc-e4a7-4bbe-ae42-7b46c0875bc9" containerName="oc"
Mar 20 16:13:58 crc kubenswrapper[4779]: I0320 16:13:58.744960 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cl7t2"
Mar 20 16:13:58 crc kubenswrapper[4779]: I0320 16:13:58.756081 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cl7t2"]
Mar 20 16:13:58 crc kubenswrapper[4779]: I0320 16:13:58.925230 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71280ecb-6b79-48e8-8ca0-7a588c853ec7-utilities\") pod \"certified-operators-cl7t2\" (UID: \"71280ecb-6b79-48e8-8ca0-7a588c853ec7\") " pod="openshift-marketplace/certified-operators-cl7t2"
Mar 20 16:13:58 crc kubenswrapper[4779]: I0320 16:13:58.925299 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71280ecb-6b79-48e8-8ca0-7a588c853ec7-catalog-content\") pod \"certified-operators-cl7t2\" (UID: \"71280ecb-6b79-48e8-8ca0-7a588c853ec7\") " pod="openshift-marketplace/certified-operators-cl7t2"
Mar 20 16:13:58 crc kubenswrapper[4779]: I0320 16:13:58.925687 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl554\" (UniqueName: \"kubernetes.io/projected/71280ecb-6b79-48e8-8ca0-7a588c853ec7-kube-api-access-bl554\") pod \"certified-operators-cl7t2\" (UID: \"71280ecb-6b79-48e8-8ca0-7a588c853ec7\") " pod="openshift-marketplace/certified-operators-cl7t2"
Mar 20 16:13:59 crc kubenswrapper[4779]: I0320 16:13:59.028408 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl554\" (UniqueName: \"kubernetes.io/projected/71280ecb-6b79-48e8-8ca0-7a588c853ec7-kube-api-access-bl554\") pod \"certified-operators-cl7t2\" (UID: \"71280ecb-6b79-48e8-8ca0-7a588c853ec7\") " pod="openshift-marketplace/certified-operators-cl7t2"
Mar 20 16:13:59 crc kubenswrapper[4779]: I0320 16:13:59.029063 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71280ecb-6b79-48e8-8ca0-7a588c853ec7-utilities\") pod \"certified-operators-cl7t2\" (UID: \"71280ecb-6b79-48e8-8ca0-7a588c853ec7\") " pod="openshift-marketplace/certified-operators-cl7t2"
Mar 20 16:13:59 crc kubenswrapper[4779]: I0320 16:13:59.029629 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71280ecb-6b79-48e8-8ca0-7a588c853ec7-utilities\") pod \"certified-operators-cl7t2\" (UID: \"71280ecb-6b79-48e8-8ca0-7a588c853ec7\") " pod="openshift-marketplace/certified-operators-cl7t2"
Mar 20 16:13:59 crc kubenswrapper[4779]: I0320 16:13:59.029790 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71280ecb-6b79-48e8-8ca0-7a588c853ec7-catalog-content\") pod \"certified-operators-cl7t2\" (UID: \"71280ecb-6b79-48e8-8ca0-7a588c853ec7\") " pod="openshift-marketplace/certified-operators-cl7t2"
Mar 20 16:13:59 crc kubenswrapper[4779]: I0320 16:13:59.030124 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71280ecb-6b79-48e8-8ca0-7a588c853ec7-catalog-content\") pod \"certified-operators-cl7t2\" (UID: \"71280ecb-6b79-48e8-8ca0-7a588c853ec7\") " pod="openshift-marketplace/certified-operators-cl7t2"
Mar 20 16:13:59 crc kubenswrapper[4779]: I0320 16:13:59.053197 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl554\" (UniqueName: \"kubernetes.io/projected/71280ecb-6b79-48e8-8ca0-7a588c853ec7-kube-api-access-bl554\") pod \"certified-operators-cl7t2\" (UID: \"71280ecb-6b79-48e8-8ca0-7a588c853ec7\") " pod="openshift-marketplace/certified-operators-cl7t2"
Mar 20 16:13:59 crc kubenswrapper[4779]: I0320 16:13:59.070887 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cl7t2"
Mar 20 16:13:59 crc kubenswrapper[4779]: I0320 16:13:59.559903 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cl7t2"]
Mar 20 16:14:00 crc kubenswrapper[4779]: I0320 16:14:00.029276 4779 generic.go:334] "Generic (PLEG): container finished" podID="71280ecb-6b79-48e8-8ca0-7a588c853ec7" containerID="dbcc3284a34ab2170a417e8a7e0dfa94de7c68da04a3d627a0819a4ff1d686d5" exitCode=0
Mar 20 16:14:00 crc kubenswrapper[4779]: I0320 16:14:00.029323 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cl7t2" event={"ID":"71280ecb-6b79-48e8-8ca0-7a588c853ec7","Type":"ContainerDied","Data":"dbcc3284a34ab2170a417e8a7e0dfa94de7c68da04a3d627a0819a4ff1d686d5"}
Mar 20 16:14:00 crc kubenswrapper[4779]: I0320 16:14:00.029352 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cl7t2" event={"ID":"71280ecb-6b79-48e8-8ca0-7a588c853ec7","Type":"ContainerStarted","Data":"8d6a9611203929043963b360c0021057e58c6fa3489756e1c032f3e94189473e"}
Mar 20 16:14:00 crc kubenswrapper[4779]: I0320 16:14:00.143198 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567054-2s45s"]
Mar 20 16:14:00 crc kubenswrapper[4779]: I0320 16:14:00.144816 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567054-2s45s"
Mar 20 16:14:00 crc kubenswrapper[4779]: I0320 16:14:00.148382 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-92t9d"
Mar 20 16:14:00 crc kubenswrapper[4779]: I0320 16:14:00.148870 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:14:00 crc kubenswrapper[4779]: I0320 16:14:00.149342 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:14:00 crc kubenswrapper[4779]: I0320 16:14:00.151724 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567054-2s45s"]
Mar 20 16:14:00 crc kubenswrapper[4779]: I0320 16:14:00.266120 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrb29\" (UniqueName: \"kubernetes.io/projected/a7bfb129-2346-4e46-afb8-81c888be46dc-kube-api-access-nrb29\") pod \"auto-csr-approver-29567054-2s45s\" (UID: \"a7bfb129-2346-4e46-afb8-81c888be46dc\") " pod="openshift-infra/auto-csr-approver-29567054-2s45s"
Mar 20 16:14:00 crc kubenswrapper[4779]: I0320 16:14:00.368207 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrb29\" (UniqueName: \"kubernetes.io/projected/a7bfb129-2346-4e46-afb8-81c888be46dc-kube-api-access-nrb29\") pod \"auto-csr-approver-29567054-2s45s\" (UID: \"a7bfb129-2346-4e46-afb8-81c888be46dc\") " pod="openshift-infra/auto-csr-approver-29567054-2s45s"
Mar 20 16:14:00 crc kubenswrapper[4779]: I0320 16:14:00.390814 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrb29\" (UniqueName: \"kubernetes.io/projected/a7bfb129-2346-4e46-afb8-81c888be46dc-kube-api-access-nrb29\") pod \"auto-csr-approver-29567054-2s45s\" (UID: \"a7bfb129-2346-4e46-afb8-81c888be46dc\") " pod="openshift-infra/auto-csr-approver-29567054-2s45s"
Mar 20 16:14:00 crc kubenswrapper[4779]: I0320 16:14:00.460937 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567054-2s45s"
Mar 20 16:14:00 crc kubenswrapper[4779]: I0320 16:14:00.982760 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567054-2s45s"]
Mar 20 16:14:00 crc kubenswrapper[4779]: W0320 16:14:00.985562 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7bfb129_2346_4e46_afb8_81c888be46dc.slice/crio-6e052864384ad4f255e3eb6a214ee1dfefe871742dbbcdee37f41f9d959b78ab WatchSource:0}: Error finding container 6e052864384ad4f255e3eb6a214ee1dfefe871742dbbcdee37f41f9d959b78ab: Status 404 returned error can't find the container with id 6e052864384ad4f255e3eb6a214ee1dfefe871742dbbcdee37f41f9d959b78ab
Mar 20 16:14:01 crc kubenswrapper[4779]: I0320 16:14:01.039655 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567054-2s45s" event={"ID":"a7bfb129-2346-4e46-afb8-81c888be46dc","Type":"ContainerStarted","Data":"6e052864384ad4f255e3eb6a214ee1dfefe871742dbbcdee37f41f9d959b78ab"}
Mar 20 16:14:03 crc kubenswrapper[4779]: I0320 16:14:03.056595 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cl7t2" event={"ID":"71280ecb-6b79-48e8-8ca0-7a588c853ec7","Type":"ContainerStarted","Data":"4c9874fc285f7ad87c9b21708d4c9c460d78caf679643ca4482ba30e93f9800b"}
Mar 20 16:14:03 crc kubenswrapper[4779]: I0320 16:14:03.059014 4779 generic.go:334] "Generic (PLEG): container finished" podID="a7bfb129-2346-4e46-afb8-81c888be46dc" containerID="b329ddf0121ebabeb736a53e44487aaf5f8f3b3e28fb7b6ef0e7b4ff756d891e" exitCode=0
Mar 20 16:14:03 crc kubenswrapper[4779]: I0320 16:14:03.059047 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567054-2s45s" event={"ID":"a7bfb129-2346-4e46-afb8-81c888be46dc","Type":"ContainerDied","Data":"b329ddf0121ebabeb736a53e44487aaf5f8f3b3e28fb7b6ef0e7b4ff756d891e"}
Mar 20 16:14:04 crc kubenswrapper[4779]: I0320 16:14:04.410420 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567054-2s45s"
Mar 20 16:14:04 crc kubenswrapper[4779]: I0320 16:14:04.554235 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrb29\" (UniqueName: \"kubernetes.io/projected/a7bfb129-2346-4e46-afb8-81c888be46dc-kube-api-access-nrb29\") pod \"a7bfb129-2346-4e46-afb8-81c888be46dc\" (UID: \"a7bfb129-2346-4e46-afb8-81c888be46dc\") "
Mar 20 16:14:04 crc kubenswrapper[4779]: I0320 16:14:04.561593 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7bfb129-2346-4e46-afb8-81c888be46dc-kube-api-access-nrb29" (OuterVolumeSpecName: "kube-api-access-nrb29") pod "a7bfb129-2346-4e46-afb8-81c888be46dc" (UID: "a7bfb129-2346-4e46-afb8-81c888be46dc"). InnerVolumeSpecName "kube-api-access-nrb29". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:14:04 crc kubenswrapper[4779]: I0320 16:14:04.657391 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrb29\" (UniqueName: \"kubernetes.io/projected/a7bfb129-2346-4e46-afb8-81c888be46dc-kube-api-access-nrb29\") on node \"crc\" DevicePath \"\""
Mar 20 16:14:05 crc kubenswrapper[4779]: I0320 16:14:05.077626 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567054-2s45s" event={"ID":"a7bfb129-2346-4e46-afb8-81c888be46dc","Type":"ContainerDied","Data":"6e052864384ad4f255e3eb6a214ee1dfefe871742dbbcdee37f41f9d959b78ab"}
Mar 20 16:14:05 crc kubenswrapper[4779]: I0320 16:14:05.077687 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e052864384ad4f255e3eb6a214ee1dfefe871742dbbcdee37f41f9d959b78ab"
Mar 20 16:14:05 crc kubenswrapper[4779]: I0320 16:14:05.077705 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567054-2s45s"
Mar 20 16:14:05 crc kubenswrapper[4779]: I0320 16:14:05.080283 4779 generic.go:334] "Generic (PLEG): container finished" podID="71280ecb-6b79-48e8-8ca0-7a588c853ec7" containerID="4c9874fc285f7ad87c9b21708d4c9c460d78caf679643ca4482ba30e93f9800b" exitCode=0
Mar 20 16:14:05 crc kubenswrapper[4779]: I0320 16:14:05.080330 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cl7t2" event={"ID":"71280ecb-6b79-48e8-8ca0-7a588c853ec7","Type":"ContainerDied","Data":"4c9874fc285f7ad87c9b21708d4c9c460d78caf679643ca4482ba30e93f9800b"}
Mar 20 16:14:05 crc kubenswrapper[4779]: I0320 16:14:05.474419 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567048-7ms57"]
Mar 20 16:14:05 crc kubenswrapper[4779]: I0320 16:14:05.486827 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567048-7ms57"]
Mar 20 16:14:05 crc kubenswrapper[4779]: I0320 16:14:05.821130 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50f0a1ab-acea-4e33-820d-ff0e501ce17a" path="/var/lib/kubelet/pods/50f0a1ab-acea-4e33-820d-ff0e501ce17a/volumes"
Mar 20 16:14:06 crc kubenswrapper[4779]: I0320 16:14:06.090990 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cl7t2" event={"ID":"71280ecb-6b79-48e8-8ca0-7a588c853ec7","Type":"ContainerStarted","Data":"2941db6b20dd82ea0fc8f835b3e727430a085b95f9450a8664ced3ffd0572928"}
Mar 20 16:14:06 crc kubenswrapper[4779]: I0320 16:14:06.108591 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cl7t2" podStartSLOduration=2.644765429 podStartE2EDuration="8.108574541s" podCreationTimestamp="2026-03-20 16:13:58 +0000 UTC" firstStartedPulling="2026-03-20 16:14:00.031285936 +0000 UTC m=+3056.993801736" lastFinishedPulling="2026-03-20 16:14:05.495095048 +0000 UTC m=+3062.457610848" observedRunningTime="2026-03-20 16:14:06.106506149 +0000 UTC m=+3063.069021959" watchObservedRunningTime="2026-03-20 16:14:06.108574541 +0000 UTC m=+3063.071090341"
Mar 20 16:14:09 crc kubenswrapper[4779]: I0320 16:14:09.071424 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cl7t2"
Mar 20 16:14:09 crc kubenswrapper[4779]: I0320 16:14:09.072073 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cl7t2"
Mar 20 16:14:09 crc kubenswrapper[4779]: I0320 16:14:09.116330 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cl7t2"
Mar 20 16:14:19 crc kubenswrapper[4779]: I0320 16:14:19.140067 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cl7t2"
Mar 20 16:14:19 crc kubenswrapper[4779]: I0320 16:14:19.196526 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cl7t2"]
Mar 20 16:14:19 crc kubenswrapper[4779]: I0320 16:14:19.199263 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cl7t2" podUID="71280ecb-6b79-48e8-8ca0-7a588c853ec7" containerName="registry-server" containerID="cri-o://2941db6b20dd82ea0fc8f835b3e727430a085b95f9450a8664ced3ffd0572928" gracePeriod=2
Mar 20 16:14:19 crc kubenswrapper[4779]: I0320 16:14:19.670967 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cl7t2"
Mar 20 16:14:19 crc kubenswrapper[4779]: I0320 16:14:19.842183 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71280ecb-6b79-48e8-8ca0-7a588c853ec7-utilities\") pod \"71280ecb-6b79-48e8-8ca0-7a588c853ec7\" (UID: \"71280ecb-6b79-48e8-8ca0-7a588c853ec7\") "
Mar 20 16:14:19 crc kubenswrapper[4779]: I0320 16:14:19.842281 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bl554\" (UniqueName: \"kubernetes.io/projected/71280ecb-6b79-48e8-8ca0-7a588c853ec7-kube-api-access-bl554\") pod \"71280ecb-6b79-48e8-8ca0-7a588c853ec7\" (UID: \"71280ecb-6b79-48e8-8ca0-7a588c853ec7\") "
Mar 20 16:14:19 crc kubenswrapper[4779]: I0320 16:14:19.842355 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71280ecb-6b79-48e8-8ca0-7a588c853ec7-catalog-content\") pod \"71280ecb-6b79-48e8-8ca0-7a588c853ec7\" (UID: \"71280ecb-6b79-48e8-8ca0-7a588c853ec7\") "
Mar 20 16:14:19 crc kubenswrapper[4779]: I0320 16:14:19.844907 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71280ecb-6b79-48e8-8ca0-7a588c853ec7-utilities" (OuterVolumeSpecName: "utilities") pod "71280ecb-6b79-48e8-8ca0-7a588c853ec7" (UID: "71280ecb-6b79-48e8-8ca0-7a588c853ec7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:14:19 crc kubenswrapper[4779]: I0320 16:14:19.851289 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71280ecb-6b79-48e8-8ca0-7a588c853ec7-kube-api-access-bl554" (OuterVolumeSpecName: "kube-api-access-bl554") pod "71280ecb-6b79-48e8-8ca0-7a588c853ec7" (UID: "71280ecb-6b79-48e8-8ca0-7a588c853ec7"). InnerVolumeSpecName "kube-api-access-bl554". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:14:19 crc kubenswrapper[4779]: I0320 16:14:19.891707 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71280ecb-6b79-48e8-8ca0-7a588c853ec7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71280ecb-6b79-48e8-8ca0-7a588c853ec7" (UID: "71280ecb-6b79-48e8-8ca0-7a588c853ec7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:14:19 crc kubenswrapper[4779]: I0320 16:14:19.944487 4779 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71280ecb-6b79-48e8-8ca0-7a588c853ec7-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 16:14:19 crc kubenswrapper[4779]: I0320 16:14:19.944530 4779 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71280ecb-6b79-48e8-8ca0-7a588c853ec7-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 16:14:19 crc kubenswrapper[4779]: I0320 16:14:19.944542 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bl554\" (UniqueName: \"kubernetes.io/projected/71280ecb-6b79-48e8-8ca0-7a588c853ec7-kube-api-access-bl554\") on node \"crc\" DevicePath \"\""
Mar 20 16:14:20 crc kubenswrapper[4779]: I0320 16:14:20.208348 4779 generic.go:334] "Generic (PLEG): container finished" podID="71280ecb-6b79-48e8-8ca0-7a588c853ec7" containerID="2941db6b20dd82ea0fc8f835b3e727430a085b95f9450a8664ced3ffd0572928" exitCode=0
Mar 20 16:14:20 crc kubenswrapper[4779]: I0320 16:14:20.208388 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cl7t2" event={"ID":"71280ecb-6b79-48e8-8ca0-7a588c853ec7","Type":"ContainerDied","Data":"2941db6b20dd82ea0fc8f835b3e727430a085b95f9450a8664ced3ffd0572928"}
Mar 20 16:14:20 crc kubenswrapper[4779]: I0320 16:14:20.208414 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cl7t2" event={"ID":"71280ecb-6b79-48e8-8ca0-7a588c853ec7","Type":"ContainerDied","Data":"8d6a9611203929043963b360c0021057e58c6fa3489756e1c032f3e94189473e"}
Mar 20 16:14:20 crc kubenswrapper[4779]: I0320 16:14:20.208410 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cl7t2"
Mar 20 16:14:20 crc kubenswrapper[4779]: I0320 16:14:20.208496 4779 scope.go:117] "RemoveContainer" containerID="2941db6b20dd82ea0fc8f835b3e727430a085b95f9450a8664ced3ffd0572928"
Mar 20 16:14:20 crc kubenswrapper[4779]: I0320 16:14:20.231721 4779 scope.go:117] "RemoveContainer" containerID="4c9874fc285f7ad87c9b21708d4c9c460d78caf679643ca4482ba30e93f9800b"
Mar 20 16:14:20 crc kubenswrapper[4779]: I0320 16:14:20.248366 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cl7t2"]
Mar 20 16:14:20 crc kubenswrapper[4779]: I0320 16:14:20.262227 4779 scope.go:117] "RemoveContainer" containerID="dbcc3284a34ab2170a417e8a7e0dfa94de7c68da04a3d627a0819a4ff1d686d5"
Mar 20 16:14:20 crc kubenswrapper[4779]: I0320 16:14:20.264236 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cl7t2"]
Mar 20 16:14:20 crc kubenswrapper[4779]: I0320 16:14:20.304600 4779 scope.go:117] "RemoveContainer" containerID="2941db6b20dd82ea0fc8f835b3e727430a085b95f9450a8664ced3ffd0572928"
Mar 20 16:14:20 crc kubenswrapper[4779]: E0320 16:14:20.304920 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2941db6b20dd82ea0fc8f835b3e727430a085b95f9450a8664ced3ffd0572928\": container with ID starting with 2941db6b20dd82ea0fc8f835b3e727430a085b95f9450a8664ced3ffd0572928 not found: ID does not exist" containerID="2941db6b20dd82ea0fc8f835b3e727430a085b95f9450a8664ced3ffd0572928"
Mar 20 16:14:20 crc kubenswrapper[4779]: I0320 16:14:20.304957 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2941db6b20dd82ea0fc8f835b3e727430a085b95f9450a8664ced3ffd0572928"} err="failed to get container status \"2941db6b20dd82ea0fc8f835b3e727430a085b95f9450a8664ced3ffd0572928\": rpc error: code = NotFound desc = could not find container \"2941db6b20dd82ea0fc8f835b3e727430a085b95f9450a8664ced3ffd0572928\": container with ID starting with 2941db6b20dd82ea0fc8f835b3e727430a085b95f9450a8664ced3ffd0572928 not found: ID does not exist"
Mar 20 16:14:20 crc kubenswrapper[4779]: I0320 16:14:20.304976 4779 scope.go:117] "RemoveContainer" containerID="4c9874fc285f7ad87c9b21708d4c9c460d78caf679643ca4482ba30e93f9800b"
Mar 20 16:14:20 crc kubenswrapper[4779]: E0320 16:14:20.305421 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c9874fc285f7ad87c9b21708d4c9c460d78caf679643ca4482ba30e93f9800b\": container with ID starting with 4c9874fc285f7ad87c9b21708d4c9c460d78caf679643ca4482ba30e93f9800b not found: ID does not exist" containerID="4c9874fc285f7ad87c9b21708d4c9c460d78caf679643ca4482ba30e93f9800b"
Mar 20 16:14:20 crc kubenswrapper[4779]: I0320 16:14:20.305445 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c9874fc285f7ad87c9b21708d4c9c460d78caf679643ca4482ba30e93f9800b"} err="failed to get container status \"4c9874fc285f7ad87c9b21708d4c9c460d78caf679643ca4482ba30e93f9800b\": rpc error: code = NotFound desc = could not find container \"4c9874fc285f7ad87c9b21708d4c9c460d78caf679643ca4482ba30e93f9800b\": container with ID starting with 4c9874fc285f7ad87c9b21708d4c9c460d78caf679643ca4482ba30e93f9800b not found: ID does not exist"
Mar 20 16:14:20 crc kubenswrapper[4779]: I0320 16:14:20.305459 4779 scope.go:117] "RemoveContainer" containerID="dbcc3284a34ab2170a417e8a7e0dfa94de7c68da04a3d627a0819a4ff1d686d5"
Mar 20 16:14:20 crc kubenswrapper[4779]: E0320 16:14:20.305707 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbcc3284a34ab2170a417e8a7e0dfa94de7c68da04a3d627a0819a4ff1d686d5\": container with ID starting with dbcc3284a34ab2170a417e8a7e0dfa94de7c68da04a3d627a0819a4ff1d686d5 not found: ID does not exist" containerID="dbcc3284a34ab2170a417e8a7e0dfa94de7c68da04a3d627a0819a4ff1d686d5"
Mar 20 16:14:20 crc kubenswrapper[4779]: I0320 16:14:20.305746 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbcc3284a34ab2170a417e8a7e0dfa94de7c68da04a3d627a0819a4ff1d686d5"} err="failed to get container status \"dbcc3284a34ab2170a417e8a7e0dfa94de7c68da04a3d627a0819a4ff1d686d5\": rpc error: code = NotFound desc = could not find container \"dbcc3284a34ab2170a417e8a7e0dfa94de7c68da04a3d627a0819a4ff1d686d5\": container with ID starting with dbcc3284a34ab2170a417e8a7e0dfa94de7c68da04a3d627a0819a4ff1d686d5 not found: ID does not exist"
Mar 20 16:14:21 crc kubenswrapper[4779]: I0320 16:14:21.829251 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71280ecb-6b79-48e8-8ca0-7a588c853ec7" path="/var/lib/kubelet/pods/71280ecb-6b79-48e8-8ca0-7a588c853ec7/volumes"
Mar 20 16:14:25 crc kubenswrapper[4779]: I0320 16:14:25.149845 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:14:25 crc kubenswrapper[4779]: I0320 16:14:25.150237 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:14:25 crc kubenswrapper[4779]: I0320 16:14:25.150275 4779 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg"
Mar 20 16:14:25 crc kubenswrapper[4779]: I0320 16:14:25.150944 4779 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4a7db66155ab4c3a0ae7729cfdb28e2b784d98e18fea6cc1043b584edc741c1f"} pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 16:14:25 crc kubenswrapper[4779]: I0320 16:14:25.151260 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" containerID="cri-o://4a7db66155ab4c3a0ae7729cfdb28e2b784d98e18fea6cc1043b584edc741c1f" gracePeriod=600
Mar 20 16:14:25 crc kubenswrapper[4779]: E0320 16:14:25.273288 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc"
Mar 20 16:14:25 crc kubenswrapper[4779]: I0320 16:14:25.517471 4779 generic.go:334] "Generic (PLEG): container finished" podID="451fc579-db57-4b36-a775-6d2986de3efc" containerID="4a7db66155ab4c3a0ae7729cfdb28e2b784d98e18fea6cc1043b584edc741c1f" exitCode=0
Mar 20 16:14:25 crc kubenswrapper[4779]: I0320 16:14:25.517517 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" event={"ID":"451fc579-db57-4b36-a775-6d2986de3efc","Type":"ContainerDied","Data":"4a7db66155ab4c3a0ae7729cfdb28e2b784d98e18fea6cc1043b584edc741c1f"}
Mar 20 16:14:25 crc kubenswrapper[4779]: I0320 16:14:25.517553 4779 scope.go:117] "RemoveContainer" containerID="22009c39699cf0c6f0f90ca317132c546c514865bbbd9b00aca011844a148a84"
Mar 20 16:14:25 crc kubenswrapper[4779]: I0320 16:14:25.518237 4779 scope.go:117] "RemoveContainer" containerID="4a7db66155ab4c3a0ae7729cfdb28e2b784d98e18fea6cc1043b584edc741c1f"
Mar 20 16:14:25 crc kubenswrapper[4779]: E0320 16:14:25.518547 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc"
Mar 20 16:14:37 crc kubenswrapper[4779]: I0320 16:14:37.808632 4779 scope.go:117] "RemoveContainer" containerID="4a7db66155ab4c3a0ae7729cfdb28e2b784d98e18fea6cc1043b584edc741c1f"
Mar 20 16:14:37 crc kubenswrapper[4779]: E0320 16:14:37.809351 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc"
Mar 20 16:14:44 crc kubenswrapper[4779]: I0320 16:14:44.426480 4779 scope.go:117] "RemoveContainer" containerID="b8804ce980f67eedd3191a3ff92926da69d76b9ae93204920f6ae0b8a78463a0"
Mar 20 16:14:48 crc kubenswrapper[4779]: I0320 16:14:48.812319 4779 scope.go:117] "RemoveContainer" containerID="4a7db66155ab4c3a0ae7729cfdb28e2b784d98e18fea6cc1043b584edc741c1f"
Mar 20 16:14:48 crc kubenswrapper[4779]: E0320 16:14:48.813211 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc"
Mar 20 16:15:00 crc kubenswrapper[4779]: I0320 16:15:00.147840 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567055-v4qzd"]
Mar 20 16:15:00 crc kubenswrapper[4779]: E0320 16:15:00.148690 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71280ecb-6b79-48e8-8ca0-7a588c853ec7" containerName="extract-utilities"
Mar 20 16:15:00 crc kubenswrapper[4779]: I0320 16:15:00.148702 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="71280ecb-6b79-48e8-8ca0-7a588c853ec7" containerName="extract-utilities"
Mar 20 16:15:00 crc kubenswrapper[4779]: E0320 16:15:00.148712 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71280ecb-6b79-48e8-8ca0-7a588c853ec7" containerName="extract-content"
Mar 20 16:15:00 crc kubenswrapper[4779]: I0320 16:15:00.148718 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="71280ecb-6b79-48e8-8ca0-7a588c853ec7" containerName="extract-content"
Mar 20 16:15:00 crc kubenswrapper[4779]: E0320 16:15:00.148727 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71280ecb-6b79-48e8-8ca0-7a588c853ec7" containerName="registry-server"
Mar 20 16:15:00 crc kubenswrapper[4779]: I0320 16:15:00.148733 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="71280ecb-6b79-48e8-8ca0-7a588c853ec7" containerName="registry-server"
Mar 20 16:15:00 crc kubenswrapper[4779]: E0320 16:15:00.148756 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7bfb129-2346-4e46-afb8-81c888be46dc" containerName="oc"
Mar 20 16:15:00 crc kubenswrapper[4779]: I0320 16:15:00.148764 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7bfb129-2346-4e46-afb8-81c888be46dc" containerName="oc"
Mar 20 16:15:00 crc kubenswrapper[4779]: I0320 16:15:00.148938 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7bfb129-2346-4e46-afb8-81c888be46dc" containerName="oc"
Mar 20 16:15:00 crc kubenswrapper[4779]: I0320 16:15:00.148957 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="71280ecb-6b79-48e8-8ca0-7a588c853ec7" containerName="registry-server"
Mar 20 16:15:00 crc kubenswrapper[4779]: I0320 16:15:00.149554 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-v4qzd"
Mar 20 16:15:00 crc kubenswrapper[4779]: I0320 16:15:00.152814 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 20 16:15:00 crc kubenswrapper[4779]: I0320 16:15:00.152864 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 20 16:15:00 crc kubenswrapper[4779]: I0320 16:15:00.161381 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567055-v4qzd"]
Mar 20 16:15:00 crc kubenswrapper[4779]: I0320 16:15:00.309936 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2ef565d-821f-4b5a-83ff-45bd3d562ebd-secret-volume\") pod \"collect-profiles-29567055-v4qzd\" (UID: \"c2ef565d-821f-4b5a-83ff-45bd3d562ebd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-v4qzd"
Mar 20 16:15:00 crc kubenswrapper[4779]: I0320 16:15:00.310040 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2ttb\" (UniqueName: \"kubernetes.io/projected/c2ef565d-821f-4b5a-83ff-45bd3d562ebd-kube-api-access-w2ttb\") pod \"collect-profiles-29567055-v4qzd\" (UID: \"c2ef565d-821f-4b5a-83ff-45bd3d562ebd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-v4qzd"
Mar 20 16:15:00 crc kubenswrapper[4779]: I0320 16:15:00.310280 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2ef565d-821f-4b5a-83ff-45bd3d562ebd-config-volume\") pod \"collect-profiles-29567055-v4qzd\" (UID: \"c2ef565d-821f-4b5a-83ff-45bd3d562ebd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-v4qzd"
Mar 20 16:15:00 crc kubenswrapper[4779]: I0320 16:15:00.411721 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2ef565d-821f-4b5a-83ff-45bd3d562ebd-config-volume\") pod \"collect-profiles-29567055-v4qzd\" (UID: \"c2ef565d-821f-4b5a-83ff-45bd3d562ebd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-v4qzd"
Mar 20 16:15:00 crc kubenswrapper[4779]: I0320 16:15:00.411822 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2ef565d-821f-4b5a-83ff-45bd3d562ebd-secret-volume\") pod \"collect-profiles-29567055-v4qzd\" (UID: \"c2ef565d-821f-4b5a-83ff-45bd3d562ebd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-v4qzd"
Mar 20 16:15:00 crc kubenswrapper[4779]: I0320 16:15:00.411896 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2ttb\" (UniqueName: \"kubernetes.io/projected/c2ef565d-821f-4b5a-83ff-45bd3d562ebd-kube-api-access-w2ttb\") pod \"collect-profiles-29567055-v4qzd\" (UID: \"c2ef565d-821f-4b5a-83ff-45bd3d562ebd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-v4qzd"
Mar 20 16:15:00 crc kubenswrapper[4779]: I0320 16:15:00.413215 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2ef565d-821f-4b5a-83ff-45bd3d562ebd-config-volume\") pod \"collect-profiles-29567055-v4qzd\" (UID: \"c2ef565d-821f-4b5a-83ff-45bd3d562ebd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-v4qzd"
Mar 20 16:15:00 crc kubenswrapper[4779]: I0320 16:15:00.423302 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2ef565d-821f-4b5a-83ff-45bd3d562ebd-secret-volume\") pod \"collect-profiles-29567055-v4qzd\" (UID: \"c2ef565d-821f-4b5a-83ff-45bd3d562ebd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-v4qzd"
Mar 20 16:15:00 crc kubenswrapper[4779]: I0320 16:15:00.427831 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2ttb\" (UniqueName: \"kubernetes.io/projected/c2ef565d-821f-4b5a-83ff-45bd3d562ebd-kube-api-access-w2ttb\") pod \"collect-profiles-29567055-v4qzd\" (UID: \"c2ef565d-821f-4b5a-83ff-45bd3d562ebd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-v4qzd"
Mar 20 16:15:00 crc kubenswrapper[4779]: I0320 16:15:00.476943 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-v4qzd"
Mar 20 16:15:00 crc kubenswrapper[4779]: I0320 16:15:00.957906 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567055-v4qzd"]
Mar 20 16:15:01 crc kubenswrapper[4779]: I0320 16:15:01.809276 4779 scope.go:117] "RemoveContainer" containerID="4a7db66155ab4c3a0ae7729cfdb28e2b784d98e18fea6cc1043b584edc741c1f"
Mar 20 16:15:01 crc kubenswrapper[4779]: E0320 16:15:01.809797 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc"
Mar 20 16:15:01 crc kubenswrapper[4779]: I0320 16:15:01.824405 4779 generic.go:334] "Generic (PLEG): container finished" podID="c2ef565d-821f-4b5a-83ff-45bd3d562ebd" containerID="797650f3322a64f35b70d41704b028eaf33ff25dfd53cfcc48d3dc8552093518" exitCode=0
Mar 20 16:15:01 crc kubenswrapper[4779]: I0320 16:15:01.824458 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-v4qzd" event={"ID":"c2ef565d-821f-4b5a-83ff-45bd3d562ebd","Type":"ContainerDied","Data":"797650f3322a64f35b70d41704b028eaf33ff25dfd53cfcc48d3dc8552093518"}
Mar 20 16:15:01 crc kubenswrapper[4779]: I0320 16:15:01.824501 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-v4qzd" event={"ID":"c2ef565d-821f-4b5a-83ff-45bd3d562ebd","Type":"ContainerStarted","Data":"22300eb55e061a9e91f9ae3944e1311e887ff628c80aac1ba3051a02757e2f20"}
Mar 20 16:15:03 crc kubenswrapper[4779]: I0320 16:15:03.179236 4779 util.go:48] "No ready
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-v4qzd" Mar 20 16:15:03 crc kubenswrapper[4779]: I0320 16:15:03.272035 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2ef565d-821f-4b5a-83ff-45bd3d562ebd-secret-volume\") pod \"c2ef565d-821f-4b5a-83ff-45bd3d562ebd\" (UID: \"c2ef565d-821f-4b5a-83ff-45bd3d562ebd\") " Mar 20 16:15:03 crc kubenswrapper[4779]: I0320 16:15:03.272219 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2ttb\" (UniqueName: \"kubernetes.io/projected/c2ef565d-821f-4b5a-83ff-45bd3d562ebd-kube-api-access-w2ttb\") pod \"c2ef565d-821f-4b5a-83ff-45bd3d562ebd\" (UID: \"c2ef565d-821f-4b5a-83ff-45bd3d562ebd\") " Mar 20 16:15:03 crc kubenswrapper[4779]: I0320 16:15:03.272266 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2ef565d-821f-4b5a-83ff-45bd3d562ebd-config-volume\") pod \"c2ef565d-821f-4b5a-83ff-45bd3d562ebd\" (UID: \"c2ef565d-821f-4b5a-83ff-45bd3d562ebd\") " Mar 20 16:15:03 crc kubenswrapper[4779]: I0320 16:15:03.273419 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2ef565d-821f-4b5a-83ff-45bd3d562ebd-config-volume" (OuterVolumeSpecName: "config-volume") pod "c2ef565d-821f-4b5a-83ff-45bd3d562ebd" (UID: "c2ef565d-821f-4b5a-83ff-45bd3d562ebd"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:15:03 crc kubenswrapper[4779]: I0320 16:15:03.279296 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2ef565d-821f-4b5a-83ff-45bd3d562ebd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c2ef565d-821f-4b5a-83ff-45bd3d562ebd" (UID: "c2ef565d-821f-4b5a-83ff-45bd3d562ebd"). 
InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:15:03 crc kubenswrapper[4779]: I0320 16:15:03.279401 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2ef565d-821f-4b5a-83ff-45bd3d562ebd-kube-api-access-w2ttb" (OuterVolumeSpecName: "kube-api-access-w2ttb") pod "c2ef565d-821f-4b5a-83ff-45bd3d562ebd" (UID: "c2ef565d-821f-4b5a-83ff-45bd3d562ebd"). InnerVolumeSpecName "kube-api-access-w2ttb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:15:03 crc kubenswrapper[4779]: I0320 16:15:03.374692 4779 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2ef565d-821f-4b5a-83ff-45bd3d562ebd-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:03 crc kubenswrapper[4779]: I0320 16:15:03.374727 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2ttb\" (UniqueName: \"kubernetes.io/projected/c2ef565d-821f-4b5a-83ff-45bd3d562ebd-kube-api-access-w2ttb\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:03 crc kubenswrapper[4779]: I0320 16:15:03.374737 4779 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2ef565d-821f-4b5a-83ff-45bd3d562ebd-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:03 crc kubenswrapper[4779]: I0320 16:15:03.844595 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-v4qzd" event={"ID":"c2ef565d-821f-4b5a-83ff-45bd3d562ebd","Type":"ContainerDied","Data":"22300eb55e061a9e91f9ae3944e1311e887ff628c80aac1ba3051a02757e2f20"} Mar 20 16:15:03 crc kubenswrapper[4779]: I0320 16:15:03.844643 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22300eb55e061a9e91f9ae3944e1311e887ff628c80aac1ba3051a02757e2f20" Mar 20 16:15:03 crc kubenswrapper[4779]: I0320 16:15:03.844656 4779 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-v4qzd" Mar 20 16:15:04 crc kubenswrapper[4779]: I0320 16:15:04.270835 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567010-9g8lw"] Mar 20 16:15:04 crc kubenswrapper[4779]: I0320 16:15:04.283482 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567010-9g8lw"] Mar 20 16:15:05 crc kubenswrapper[4779]: E0320 16:15:05.706704 4779 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2ef565d_821f_4b5a_83ff_45bd3d562ebd.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2ef565d_821f_4b5a_83ff_45bd3d562ebd.slice/crio-22300eb55e061a9e91f9ae3944e1311e887ff628c80aac1ba3051a02757e2f20\": RecentStats: unable to find data in memory cache]" Mar 20 16:15:05 crc kubenswrapper[4779]: I0320 16:15:05.827527 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4981969-176f-47c6-b265-df9f5668838b" path="/var/lib/kubelet/pods/d4981969-176f-47c6-b265-df9f5668838b/volumes" Mar 20 16:15:13 crc kubenswrapper[4779]: I0320 16:15:13.816716 4779 scope.go:117] "RemoveContainer" containerID="4a7db66155ab4c3a0ae7729cfdb28e2b784d98e18fea6cc1043b584edc741c1f" Mar 20 16:15:13 crc kubenswrapper[4779]: E0320 16:15:13.817514 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" 
podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:15:15 crc kubenswrapper[4779]: E0320 16:15:15.987249 4779 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2ef565d_821f_4b5a_83ff_45bd3d562ebd.slice/crio-22300eb55e061a9e91f9ae3944e1311e887ff628c80aac1ba3051a02757e2f20\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2ef565d_821f_4b5a_83ff_45bd3d562ebd.slice\": RecentStats: unable to find data in memory cache]" Mar 20 16:15:26 crc kubenswrapper[4779]: E0320 16:15:26.260026 4779 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2ef565d_821f_4b5a_83ff_45bd3d562ebd.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2ef565d_821f_4b5a_83ff_45bd3d562ebd.slice/crio-22300eb55e061a9e91f9ae3944e1311e887ff628c80aac1ba3051a02757e2f20\": RecentStats: unable to find data in memory cache]" Mar 20 16:15:26 crc kubenswrapper[4779]: I0320 16:15:26.808984 4779 scope.go:117] "RemoveContainer" containerID="4a7db66155ab4c3a0ae7729cfdb28e2b784d98e18fea6cc1043b584edc741c1f" Mar 20 16:15:26 crc kubenswrapper[4779]: E0320 16:15:26.809623 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:15:36 crc kubenswrapper[4779]: E0320 16:15:36.493516 4779 cadvisor_stats_provider.go:516] "Partial failure issuing 
cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2ef565d_821f_4b5a_83ff_45bd3d562ebd.slice/crio-22300eb55e061a9e91f9ae3944e1311e887ff628c80aac1ba3051a02757e2f20\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2ef565d_821f_4b5a_83ff_45bd3d562ebd.slice\": RecentStats: unable to find data in memory cache]" Mar 20 16:15:37 crc kubenswrapper[4779]: I0320 16:15:37.808902 4779 scope.go:117] "RemoveContainer" containerID="4a7db66155ab4c3a0ae7729cfdb28e2b784d98e18fea6cc1043b584edc741c1f" Mar 20 16:15:37 crc kubenswrapper[4779]: E0320 16:15:37.809550 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:15:44 crc kubenswrapper[4779]: I0320 16:15:44.528416 4779 scope.go:117] "RemoveContainer" containerID="7e78b296ba2332d79bf0e80128d52c070bac2f3aaf8e414b43050ee046d02218" Mar 20 16:15:46 crc kubenswrapper[4779]: E0320 16:15:46.760308 4779 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2ef565d_821f_4b5a_83ff_45bd3d562ebd.slice/crio-22300eb55e061a9e91f9ae3944e1311e887ff628c80aac1ba3051a02757e2f20\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2ef565d_821f_4b5a_83ff_45bd3d562ebd.slice\": RecentStats: unable to find data in memory cache]" Mar 20 16:15:52 crc kubenswrapper[4779]: I0320 16:15:52.809229 4779 scope.go:117] "RemoveContainer" 
containerID="4a7db66155ab4c3a0ae7729cfdb28e2b784d98e18fea6cc1043b584edc741c1f" Mar 20 16:15:52 crc kubenswrapper[4779]: E0320 16:15:52.809731 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:15:56 crc kubenswrapper[4779]: E0320 16:15:56.989174 4779 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2ef565d_821f_4b5a_83ff_45bd3d562ebd.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2ef565d_821f_4b5a_83ff_45bd3d562ebd.slice/crio-22300eb55e061a9e91f9ae3944e1311e887ff628c80aac1ba3051a02757e2f20\": RecentStats: unable to find data in memory cache]" Mar 20 16:16:00 crc kubenswrapper[4779]: I0320 16:16:00.141288 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567056-tbwhl"] Mar 20 16:16:00 crc kubenswrapper[4779]: E0320 16:16:00.142229 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2ef565d-821f-4b5a-83ff-45bd3d562ebd" containerName="collect-profiles" Mar 20 16:16:00 crc kubenswrapper[4779]: I0320 16:16:00.142244 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2ef565d-821f-4b5a-83ff-45bd3d562ebd" containerName="collect-profiles" Mar 20 16:16:00 crc kubenswrapper[4779]: I0320 16:16:00.142445 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2ef565d-821f-4b5a-83ff-45bd3d562ebd" containerName="collect-profiles" Mar 20 16:16:00 crc kubenswrapper[4779]: I0320 16:16:00.143199 4779 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567056-tbwhl" Mar 20 16:16:00 crc kubenswrapper[4779]: I0320 16:16:00.145593 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:16:00 crc kubenswrapper[4779]: I0320 16:16:00.145639 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-92t9d" Mar 20 16:16:00 crc kubenswrapper[4779]: I0320 16:16:00.145693 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:16:00 crc kubenswrapper[4779]: I0320 16:16:00.152933 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567056-tbwhl"] Mar 20 16:16:00 crc kubenswrapper[4779]: I0320 16:16:00.210131 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbb9p\" (UniqueName: \"kubernetes.io/projected/ad13a037-8a3f-41b1-9762-7a3f6defbf78-kube-api-access-rbb9p\") pod \"auto-csr-approver-29567056-tbwhl\" (UID: \"ad13a037-8a3f-41b1-9762-7a3f6defbf78\") " pod="openshift-infra/auto-csr-approver-29567056-tbwhl" Mar 20 16:16:00 crc kubenswrapper[4779]: I0320 16:16:00.312224 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbb9p\" (UniqueName: \"kubernetes.io/projected/ad13a037-8a3f-41b1-9762-7a3f6defbf78-kube-api-access-rbb9p\") pod \"auto-csr-approver-29567056-tbwhl\" (UID: \"ad13a037-8a3f-41b1-9762-7a3f6defbf78\") " pod="openshift-infra/auto-csr-approver-29567056-tbwhl" Mar 20 16:16:00 crc kubenswrapper[4779]: I0320 16:16:00.330659 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbb9p\" (UniqueName: \"kubernetes.io/projected/ad13a037-8a3f-41b1-9762-7a3f6defbf78-kube-api-access-rbb9p\") pod \"auto-csr-approver-29567056-tbwhl\" (UID: 
\"ad13a037-8a3f-41b1-9762-7a3f6defbf78\") " pod="openshift-infra/auto-csr-approver-29567056-tbwhl" Mar 20 16:16:00 crc kubenswrapper[4779]: I0320 16:16:00.496034 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567056-tbwhl" Mar 20 16:16:00 crc kubenswrapper[4779]: I0320 16:16:00.924039 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567056-tbwhl"] Mar 20 16:16:00 crc kubenswrapper[4779]: I0320 16:16:00.936996 4779 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 16:16:01 crc kubenswrapper[4779]: I0320 16:16:01.346242 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567056-tbwhl" event={"ID":"ad13a037-8a3f-41b1-9762-7a3f6defbf78","Type":"ContainerStarted","Data":"e0265b45c1cefe09a9c3eec74b7f6c065740173f840c5f7d757e30fd24cf668b"} Mar 20 16:16:03 crc kubenswrapper[4779]: I0320 16:16:03.365848 4779 generic.go:334] "Generic (PLEG): container finished" podID="ad13a037-8a3f-41b1-9762-7a3f6defbf78" containerID="71e1aaf7ca85e65be539761e70f4d7b4f3af3ad0afdbd518b6839979ab588783" exitCode=0 Mar 20 16:16:03 crc kubenswrapper[4779]: I0320 16:16:03.365951 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567056-tbwhl" event={"ID":"ad13a037-8a3f-41b1-9762-7a3f6defbf78","Type":"ContainerDied","Data":"71e1aaf7ca85e65be539761e70f4d7b4f3af3ad0afdbd518b6839979ab588783"} Mar 20 16:16:04 crc kubenswrapper[4779]: I0320 16:16:04.783595 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567056-tbwhl" Mar 20 16:16:04 crc kubenswrapper[4779]: I0320 16:16:04.799319 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbb9p\" (UniqueName: \"kubernetes.io/projected/ad13a037-8a3f-41b1-9762-7a3f6defbf78-kube-api-access-rbb9p\") pod \"ad13a037-8a3f-41b1-9762-7a3f6defbf78\" (UID: \"ad13a037-8a3f-41b1-9762-7a3f6defbf78\") " Mar 20 16:16:04 crc kubenswrapper[4779]: I0320 16:16:04.813027 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad13a037-8a3f-41b1-9762-7a3f6defbf78-kube-api-access-rbb9p" (OuterVolumeSpecName: "kube-api-access-rbb9p") pod "ad13a037-8a3f-41b1-9762-7a3f6defbf78" (UID: "ad13a037-8a3f-41b1-9762-7a3f6defbf78"). InnerVolumeSpecName "kube-api-access-rbb9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:16:04 crc kubenswrapper[4779]: I0320 16:16:04.901706 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbb9p\" (UniqueName: \"kubernetes.io/projected/ad13a037-8a3f-41b1-9762-7a3f6defbf78-kube-api-access-rbb9p\") on node \"crc\" DevicePath \"\"" Mar 20 16:16:05 crc kubenswrapper[4779]: I0320 16:16:05.381634 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567056-tbwhl" event={"ID":"ad13a037-8a3f-41b1-9762-7a3f6defbf78","Type":"ContainerDied","Data":"e0265b45c1cefe09a9c3eec74b7f6c065740173f840c5f7d757e30fd24cf668b"} Mar 20 16:16:05 crc kubenswrapper[4779]: I0320 16:16:05.382022 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0265b45c1cefe09a9c3eec74b7f6c065740173f840c5f7d757e30fd24cf668b" Mar 20 16:16:05 crc kubenswrapper[4779]: I0320 16:16:05.381943 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567056-tbwhl" Mar 20 16:16:05 crc kubenswrapper[4779]: I0320 16:16:05.868652 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567050-w4ffw"] Mar 20 16:16:05 crc kubenswrapper[4779]: I0320 16:16:05.877707 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567050-w4ffw"] Mar 20 16:16:06 crc kubenswrapper[4779]: I0320 16:16:06.808826 4779 scope.go:117] "RemoveContainer" containerID="4a7db66155ab4c3a0ae7729cfdb28e2b784d98e18fea6cc1043b584edc741c1f" Mar 20 16:16:06 crc kubenswrapper[4779]: E0320 16:16:06.809177 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:16:07 crc kubenswrapper[4779]: I0320 16:16:07.829437 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3ec02b7-0eff-4287-89d6-462fff4f37f5" path="/var/lib/kubelet/pods/b3ec02b7-0eff-4287-89d6-462fff4f37f5/volumes" Mar 20 16:16:18 crc kubenswrapper[4779]: I0320 16:16:18.810268 4779 scope.go:117] "RemoveContainer" containerID="4a7db66155ab4c3a0ae7729cfdb28e2b784d98e18fea6cc1043b584edc741c1f" Mar 20 16:16:18 crc kubenswrapper[4779]: E0320 16:16:18.811063 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" 
podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:16:31 crc kubenswrapper[4779]: I0320 16:16:31.830297 4779 scope.go:117] "RemoveContainer" containerID="4a7db66155ab4c3a0ae7729cfdb28e2b784d98e18fea6cc1043b584edc741c1f" Mar 20 16:16:31 crc kubenswrapper[4779]: E0320 16:16:31.831528 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:16:44 crc kubenswrapper[4779]: I0320 16:16:44.604732 4779 scope.go:117] "RemoveContainer" containerID="5af3d1f7aeffdd0681f58c8567ad72e3db631a389542c85b739b948d69c30c44" Mar 20 16:16:46 crc kubenswrapper[4779]: I0320 16:16:46.809383 4779 scope.go:117] "RemoveContainer" containerID="4a7db66155ab4c3a0ae7729cfdb28e2b784d98e18fea6cc1043b584edc741c1f" Mar 20 16:16:46 crc kubenswrapper[4779]: E0320 16:16:46.810080 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:16:58 crc kubenswrapper[4779]: I0320 16:16:58.809764 4779 scope.go:117] "RemoveContainer" containerID="4a7db66155ab4c3a0ae7729cfdb28e2b784d98e18fea6cc1043b584edc741c1f" Mar 20 16:16:58 crc kubenswrapper[4779]: E0320 16:16:58.810970 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:17:13 crc kubenswrapper[4779]: I0320 16:17:13.819080 4779 scope.go:117] "RemoveContainer" containerID="4a7db66155ab4c3a0ae7729cfdb28e2b784d98e18fea6cc1043b584edc741c1f" Mar 20 16:17:13 crc kubenswrapper[4779]: E0320 16:17:13.819946 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:17:27 crc kubenswrapper[4779]: I0320 16:17:27.812997 4779 scope.go:117] "RemoveContainer" containerID="4a7db66155ab4c3a0ae7729cfdb28e2b784d98e18fea6cc1043b584edc741c1f" Mar 20 16:17:27 crc kubenswrapper[4779]: E0320 16:17:27.813783 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:17:40 crc kubenswrapper[4779]: I0320 16:17:40.809478 4779 scope.go:117] "RemoveContainer" containerID="4a7db66155ab4c3a0ae7729cfdb28e2b784d98e18fea6cc1043b584edc741c1f" Mar 20 16:17:40 crc kubenswrapper[4779]: E0320 16:17:40.810331 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:17:51 crc kubenswrapper[4779]: I0320 16:17:51.809683 4779 scope.go:117] "RemoveContainer" containerID="4a7db66155ab4c3a0ae7729cfdb28e2b784d98e18fea6cc1043b584edc741c1f" Mar 20 16:17:51 crc kubenswrapper[4779]: E0320 16:17:51.810459 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:18:00 crc kubenswrapper[4779]: I0320 16:18:00.176270 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567058-8tgdn"] Mar 20 16:18:00 crc kubenswrapper[4779]: E0320 16:18:00.178030 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad13a037-8a3f-41b1-9762-7a3f6defbf78" containerName="oc" Mar 20 16:18:00 crc kubenswrapper[4779]: I0320 16:18:00.178137 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad13a037-8a3f-41b1-9762-7a3f6defbf78" containerName="oc" Mar 20 16:18:00 crc kubenswrapper[4779]: I0320 16:18:00.178454 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad13a037-8a3f-41b1-9762-7a3f6defbf78" containerName="oc" Mar 20 16:18:00 crc kubenswrapper[4779]: I0320 16:18:00.180207 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567058-8tgdn" Mar 20 16:18:00 crc kubenswrapper[4779]: I0320 16:18:00.182244 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-92t9d" Mar 20 16:18:00 crc kubenswrapper[4779]: I0320 16:18:00.182490 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:18:00 crc kubenswrapper[4779]: I0320 16:18:00.183827 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:18:00 crc kubenswrapper[4779]: I0320 16:18:00.194007 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567058-8tgdn"] Mar 20 16:18:00 crc kubenswrapper[4779]: I0320 16:18:00.236998 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qrxw\" (UniqueName: \"kubernetes.io/projected/ea1dfcb0-6dfd-45f1-babd-e3cb75577774-kube-api-access-5qrxw\") pod \"auto-csr-approver-29567058-8tgdn\" (UID: \"ea1dfcb0-6dfd-45f1-babd-e3cb75577774\") " pod="openshift-infra/auto-csr-approver-29567058-8tgdn" Mar 20 16:18:00 crc kubenswrapper[4779]: I0320 16:18:00.339120 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qrxw\" (UniqueName: \"kubernetes.io/projected/ea1dfcb0-6dfd-45f1-babd-e3cb75577774-kube-api-access-5qrxw\") pod \"auto-csr-approver-29567058-8tgdn\" (UID: \"ea1dfcb0-6dfd-45f1-babd-e3cb75577774\") " pod="openshift-infra/auto-csr-approver-29567058-8tgdn" Mar 20 16:18:00 crc kubenswrapper[4779]: I0320 16:18:00.358316 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qrxw\" (UniqueName: \"kubernetes.io/projected/ea1dfcb0-6dfd-45f1-babd-e3cb75577774-kube-api-access-5qrxw\") pod \"auto-csr-approver-29567058-8tgdn\" (UID: \"ea1dfcb0-6dfd-45f1-babd-e3cb75577774\") " 
pod="openshift-infra/auto-csr-approver-29567058-8tgdn" Mar 20 16:18:00 crc kubenswrapper[4779]: I0320 16:18:00.499893 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567058-8tgdn" Mar 20 16:18:00 crc kubenswrapper[4779]: I0320 16:18:00.953595 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567058-8tgdn"] Mar 20 16:18:01 crc kubenswrapper[4779]: I0320 16:18:01.400022 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567058-8tgdn" event={"ID":"ea1dfcb0-6dfd-45f1-babd-e3cb75577774","Type":"ContainerStarted","Data":"69716d79119f8e4701de4fbf7324e472fc22a9272552b13a2e5d005ce2ea9079"} Mar 20 16:18:03 crc kubenswrapper[4779]: I0320 16:18:03.417123 4779 generic.go:334] "Generic (PLEG): container finished" podID="ea1dfcb0-6dfd-45f1-babd-e3cb75577774" containerID="1b23a40afc236e0483a4ee53b47415d85ce4c33381d22bacbb71add04cfe2b3c" exitCode=0 Mar 20 16:18:03 crc kubenswrapper[4779]: I0320 16:18:03.417184 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567058-8tgdn" event={"ID":"ea1dfcb0-6dfd-45f1-babd-e3cb75577774","Type":"ContainerDied","Data":"1b23a40afc236e0483a4ee53b47415d85ce4c33381d22bacbb71add04cfe2b3c"} Mar 20 16:18:04 crc kubenswrapper[4779]: I0320 16:18:04.774723 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567058-8tgdn" Mar 20 16:18:04 crc kubenswrapper[4779]: I0320 16:18:04.809335 4779 scope.go:117] "RemoveContainer" containerID="4a7db66155ab4c3a0ae7729cfdb28e2b784d98e18fea6cc1043b584edc741c1f" Mar 20 16:18:04 crc kubenswrapper[4779]: E0320 16:18:04.809636 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:18:04 crc kubenswrapper[4779]: I0320 16:18:04.933812 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qrxw\" (UniqueName: \"kubernetes.io/projected/ea1dfcb0-6dfd-45f1-babd-e3cb75577774-kube-api-access-5qrxw\") pod \"ea1dfcb0-6dfd-45f1-babd-e3cb75577774\" (UID: \"ea1dfcb0-6dfd-45f1-babd-e3cb75577774\") " Mar 20 16:18:04 crc kubenswrapper[4779]: I0320 16:18:04.939626 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea1dfcb0-6dfd-45f1-babd-e3cb75577774-kube-api-access-5qrxw" (OuterVolumeSpecName: "kube-api-access-5qrxw") pod "ea1dfcb0-6dfd-45f1-babd-e3cb75577774" (UID: "ea1dfcb0-6dfd-45f1-babd-e3cb75577774"). InnerVolumeSpecName "kube-api-access-5qrxw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:18:05 crc kubenswrapper[4779]: I0320 16:18:05.037100 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qrxw\" (UniqueName: \"kubernetes.io/projected/ea1dfcb0-6dfd-45f1-babd-e3cb75577774-kube-api-access-5qrxw\") on node \"crc\" DevicePath \"\"" Mar 20 16:18:05 crc kubenswrapper[4779]: I0320 16:18:05.439903 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567058-8tgdn" event={"ID":"ea1dfcb0-6dfd-45f1-babd-e3cb75577774","Type":"ContainerDied","Data":"69716d79119f8e4701de4fbf7324e472fc22a9272552b13a2e5d005ce2ea9079"} Mar 20 16:18:05 crc kubenswrapper[4779]: I0320 16:18:05.439948 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69716d79119f8e4701de4fbf7324e472fc22a9272552b13a2e5d005ce2ea9079" Mar 20 16:18:05 crc kubenswrapper[4779]: I0320 16:18:05.440017 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567058-8tgdn" Mar 20 16:18:05 crc kubenswrapper[4779]: I0320 16:18:05.849922 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567052-fnhnz"] Mar 20 16:18:05 crc kubenswrapper[4779]: I0320 16:18:05.861038 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567052-fnhnz"] Mar 20 16:18:07 crc kubenswrapper[4779]: I0320 16:18:07.819507 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f4e99dc-e4a7-4bbe-ae42-7b46c0875bc9" path="/var/lib/kubelet/pods/3f4e99dc-e4a7-4bbe-ae42-7b46c0875bc9/volumes" Mar 20 16:18:19 crc kubenswrapper[4779]: I0320 16:18:19.809087 4779 scope.go:117] "RemoveContainer" containerID="4a7db66155ab4c3a0ae7729cfdb28e2b784d98e18fea6cc1043b584edc741c1f" Mar 20 16:18:19 crc kubenswrapper[4779]: E0320 16:18:19.809980 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:18:31 crc kubenswrapper[4779]: I0320 16:18:31.426942 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4rh2q"] Mar 20 16:18:31 crc kubenswrapper[4779]: E0320 16:18:31.428186 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea1dfcb0-6dfd-45f1-babd-e3cb75577774" containerName="oc" Mar 20 16:18:31 crc kubenswrapper[4779]: I0320 16:18:31.428203 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea1dfcb0-6dfd-45f1-babd-e3cb75577774" containerName="oc" Mar 20 16:18:31 crc kubenswrapper[4779]: I0320 16:18:31.428484 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea1dfcb0-6dfd-45f1-babd-e3cb75577774" containerName="oc" Mar 20 16:18:31 crc kubenswrapper[4779]: I0320 16:18:31.430280 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4rh2q" Mar 20 16:18:31 crc kubenswrapper[4779]: I0320 16:18:31.439372 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rh2q"] Mar 20 16:18:31 crc kubenswrapper[4779]: I0320 16:18:31.451450 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szktl\" (UniqueName: \"kubernetes.io/projected/02776554-42a1-498a-afd0-b5484bc05631-kube-api-access-szktl\") pod \"redhat-marketplace-4rh2q\" (UID: \"02776554-42a1-498a-afd0-b5484bc05631\") " pod="openshift-marketplace/redhat-marketplace-4rh2q" Mar 20 16:18:31 crc kubenswrapper[4779]: I0320 16:18:31.451500 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02776554-42a1-498a-afd0-b5484bc05631-catalog-content\") pod \"redhat-marketplace-4rh2q\" (UID: \"02776554-42a1-498a-afd0-b5484bc05631\") " pod="openshift-marketplace/redhat-marketplace-4rh2q" Mar 20 16:18:31 crc kubenswrapper[4779]: I0320 16:18:31.451599 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02776554-42a1-498a-afd0-b5484bc05631-utilities\") pod \"redhat-marketplace-4rh2q\" (UID: \"02776554-42a1-498a-afd0-b5484bc05631\") " pod="openshift-marketplace/redhat-marketplace-4rh2q" Mar 20 16:18:31 crc kubenswrapper[4779]: I0320 16:18:31.554133 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szktl\" (UniqueName: \"kubernetes.io/projected/02776554-42a1-498a-afd0-b5484bc05631-kube-api-access-szktl\") pod \"redhat-marketplace-4rh2q\" (UID: \"02776554-42a1-498a-afd0-b5484bc05631\") " pod="openshift-marketplace/redhat-marketplace-4rh2q" Mar 20 16:18:31 crc kubenswrapper[4779]: I0320 16:18:31.554193 4779 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02776554-42a1-498a-afd0-b5484bc05631-catalog-content\") pod \"redhat-marketplace-4rh2q\" (UID: \"02776554-42a1-498a-afd0-b5484bc05631\") " pod="openshift-marketplace/redhat-marketplace-4rh2q" Mar 20 16:18:31 crc kubenswrapper[4779]: I0320 16:18:31.554278 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02776554-42a1-498a-afd0-b5484bc05631-utilities\") pod \"redhat-marketplace-4rh2q\" (UID: \"02776554-42a1-498a-afd0-b5484bc05631\") " pod="openshift-marketplace/redhat-marketplace-4rh2q" Mar 20 16:18:31 crc kubenswrapper[4779]: I0320 16:18:31.554720 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02776554-42a1-498a-afd0-b5484bc05631-catalog-content\") pod \"redhat-marketplace-4rh2q\" (UID: \"02776554-42a1-498a-afd0-b5484bc05631\") " pod="openshift-marketplace/redhat-marketplace-4rh2q" Mar 20 16:18:31 crc kubenswrapper[4779]: I0320 16:18:31.554743 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02776554-42a1-498a-afd0-b5484bc05631-utilities\") pod \"redhat-marketplace-4rh2q\" (UID: \"02776554-42a1-498a-afd0-b5484bc05631\") " pod="openshift-marketplace/redhat-marketplace-4rh2q" Mar 20 16:18:31 crc kubenswrapper[4779]: I0320 16:18:31.575329 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szktl\" (UniqueName: \"kubernetes.io/projected/02776554-42a1-498a-afd0-b5484bc05631-kube-api-access-szktl\") pod \"redhat-marketplace-4rh2q\" (UID: \"02776554-42a1-498a-afd0-b5484bc05631\") " pod="openshift-marketplace/redhat-marketplace-4rh2q" Mar 20 16:18:31 crc kubenswrapper[4779]: I0320 16:18:31.632042 4779 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-ghk4r"] Mar 20 16:18:31 crc kubenswrapper[4779]: I0320 16:18:31.634186 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ghk4r" Mar 20 16:18:31 crc kubenswrapper[4779]: I0320 16:18:31.647444 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ghk4r"] Mar 20 16:18:31 crc kubenswrapper[4779]: I0320 16:18:31.656139 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a3dfa7b-74de-49b6-b47b-bd6492a35db6-utilities\") pod \"redhat-operators-ghk4r\" (UID: \"6a3dfa7b-74de-49b6-b47b-bd6492a35db6\") " pod="openshift-marketplace/redhat-operators-ghk4r" Mar 20 16:18:31 crc kubenswrapper[4779]: I0320 16:18:31.656183 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8z6f\" (UniqueName: \"kubernetes.io/projected/6a3dfa7b-74de-49b6-b47b-bd6492a35db6-kube-api-access-d8z6f\") pod \"redhat-operators-ghk4r\" (UID: \"6a3dfa7b-74de-49b6-b47b-bd6492a35db6\") " pod="openshift-marketplace/redhat-operators-ghk4r" Mar 20 16:18:31 crc kubenswrapper[4779]: I0320 16:18:31.656220 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a3dfa7b-74de-49b6-b47b-bd6492a35db6-catalog-content\") pod \"redhat-operators-ghk4r\" (UID: \"6a3dfa7b-74de-49b6-b47b-bd6492a35db6\") " pod="openshift-marketplace/redhat-operators-ghk4r" Mar 20 16:18:31 crc kubenswrapper[4779]: I0320 16:18:31.756506 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4rh2q" Mar 20 16:18:31 crc kubenswrapper[4779]: I0320 16:18:31.758977 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a3dfa7b-74de-49b6-b47b-bd6492a35db6-utilities\") pod \"redhat-operators-ghk4r\" (UID: \"6a3dfa7b-74de-49b6-b47b-bd6492a35db6\") " pod="openshift-marketplace/redhat-operators-ghk4r" Mar 20 16:18:31 crc kubenswrapper[4779]: I0320 16:18:31.759027 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8z6f\" (UniqueName: \"kubernetes.io/projected/6a3dfa7b-74de-49b6-b47b-bd6492a35db6-kube-api-access-d8z6f\") pod \"redhat-operators-ghk4r\" (UID: \"6a3dfa7b-74de-49b6-b47b-bd6492a35db6\") " pod="openshift-marketplace/redhat-operators-ghk4r" Mar 20 16:18:31 crc kubenswrapper[4779]: I0320 16:18:31.759066 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a3dfa7b-74de-49b6-b47b-bd6492a35db6-catalog-content\") pod \"redhat-operators-ghk4r\" (UID: \"6a3dfa7b-74de-49b6-b47b-bd6492a35db6\") " pod="openshift-marketplace/redhat-operators-ghk4r" Mar 20 16:18:31 crc kubenswrapper[4779]: I0320 16:18:31.759911 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a3dfa7b-74de-49b6-b47b-bd6492a35db6-utilities\") pod \"redhat-operators-ghk4r\" (UID: \"6a3dfa7b-74de-49b6-b47b-bd6492a35db6\") " pod="openshift-marketplace/redhat-operators-ghk4r" Mar 20 16:18:31 crc kubenswrapper[4779]: I0320 16:18:31.760015 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a3dfa7b-74de-49b6-b47b-bd6492a35db6-catalog-content\") pod \"redhat-operators-ghk4r\" (UID: \"6a3dfa7b-74de-49b6-b47b-bd6492a35db6\") " 
pod="openshift-marketplace/redhat-operators-ghk4r" Mar 20 16:18:31 crc kubenswrapper[4779]: I0320 16:18:31.789555 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8z6f\" (UniqueName: \"kubernetes.io/projected/6a3dfa7b-74de-49b6-b47b-bd6492a35db6-kube-api-access-d8z6f\") pod \"redhat-operators-ghk4r\" (UID: \"6a3dfa7b-74de-49b6-b47b-bd6492a35db6\") " pod="openshift-marketplace/redhat-operators-ghk4r" Mar 20 16:18:31 crc kubenswrapper[4779]: I0320 16:18:31.966574 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ghk4r" Mar 20 16:18:32 crc kubenswrapper[4779]: I0320 16:18:32.305729 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rh2q"] Mar 20 16:18:32 crc kubenswrapper[4779]: I0320 16:18:32.567372 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ghk4r"] Mar 20 16:18:32 crc kubenswrapper[4779]: W0320 16:18:32.571527 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a3dfa7b_74de_49b6_b47b_bd6492a35db6.slice/crio-fac9c7a277d19a1f6fc9cbf06b4f491305a5faf5d325e9f63dc1c4743ca27342 WatchSource:0}: Error finding container fac9c7a277d19a1f6fc9cbf06b4f491305a5faf5d325e9f63dc1c4743ca27342: Status 404 returned error can't find the container with id fac9c7a277d19a1f6fc9cbf06b4f491305a5faf5d325e9f63dc1c4743ca27342 Mar 20 16:18:32 crc kubenswrapper[4779]: I0320 16:18:32.688472 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ghk4r" event={"ID":"6a3dfa7b-74de-49b6-b47b-bd6492a35db6","Type":"ContainerStarted","Data":"fac9c7a277d19a1f6fc9cbf06b4f491305a5faf5d325e9f63dc1c4743ca27342"} Mar 20 16:18:32 crc kubenswrapper[4779]: I0320 16:18:32.690122 4779 generic.go:334] "Generic (PLEG): container finished" 
podID="02776554-42a1-498a-afd0-b5484bc05631" containerID="b6150b3265238858a2ee80a22e6f51a2c6f836b69bd8180816170038652c4b14" exitCode=0 Mar 20 16:18:32 crc kubenswrapper[4779]: I0320 16:18:32.690167 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rh2q" event={"ID":"02776554-42a1-498a-afd0-b5484bc05631","Type":"ContainerDied","Data":"b6150b3265238858a2ee80a22e6f51a2c6f836b69bd8180816170038652c4b14"} Mar 20 16:18:32 crc kubenswrapper[4779]: I0320 16:18:32.690194 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rh2q" event={"ID":"02776554-42a1-498a-afd0-b5484bc05631","Type":"ContainerStarted","Data":"40910c54130fc1ee579c533bb84277aabf6207d2d220ab9a41dc66dbd07ed3b5"} Mar 20 16:18:33 crc kubenswrapper[4779]: I0320 16:18:33.700624 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rh2q" event={"ID":"02776554-42a1-498a-afd0-b5484bc05631","Type":"ContainerStarted","Data":"8e7a2f5231d78b20c7db723d2af0e5d753dc6ce8e21675203ba5c25896b1f95e"} Mar 20 16:18:33 crc kubenswrapper[4779]: I0320 16:18:33.702196 4779 generic.go:334] "Generic (PLEG): container finished" podID="6a3dfa7b-74de-49b6-b47b-bd6492a35db6" containerID="2291acd6c62b5229ae9cf4d0bed0695628a42f8331336264d0bbe871c3027665" exitCode=0 Mar 20 16:18:33 crc kubenswrapper[4779]: I0320 16:18:33.702256 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ghk4r" event={"ID":"6a3dfa7b-74de-49b6-b47b-bd6492a35db6","Type":"ContainerDied","Data":"2291acd6c62b5229ae9cf4d0bed0695628a42f8331336264d0bbe871c3027665"} Mar 20 16:18:33 crc kubenswrapper[4779]: I0320 16:18:33.818049 4779 scope.go:117] "RemoveContainer" containerID="4a7db66155ab4c3a0ae7729cfdb28e2b784d98e18fea6cc1043b584edc741c1f" Mar 20 16:18:33 crc kubenswrapper[4779]: E0320 16:18:33.818323 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:18:35 crc kubenswrapper[4779]: I0320 16:18:35.728073 4779 generic.go:334] "Generic (PLEG): container finished" podID="02776554-42a1-498a-afd0-b5484bc05631" containerID="8e7a2f5231d78b20c7db723d2af0e5d753dc6ce8e21675203ba5c25896b1f95e" exitCode=0 Mar 20 16:18:35 crc kubenswrapper[4779]: I0320 16:18:35.728161 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rh2q" event={"ID":"02776554-42a1-498a-afd0-b5484bc05631","Type":"ContainerDied","Data":"8e7a2f5231d78b20c7db723d2af0e5d753dc6ce8e21675203ba5c25896b1f95e"} Mar 20 16:18:35 crc kubenswrapper[4779]: I0320 16:18:35.735976 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ghk4r" event={"ID":"6a3dfa7b-74de-49b6-b47b-bd6492a35db6","Type":"ContainerStarted","Data":"7340ce2131a52364da6fabd5a8e23d0e4bdd06b3a4e5b34acb052719f0db2cfe"} Mar 20 16:18:36 crc kubenswrapper[4779]: I0320 16:18:36.746147 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rh2q" event={"ID":"02776554-42a1-498a-afd0-b5484bc05631","Type":"ContainerStarted","Data":"83f742da0dd1ffb47cf8580bb7921fbc0aed045b0939eede766723f2015e1058"} Mar 20 16:18:36 crc kubenswrapper[4779]: I0320 16:18:36.768457 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4rh2q" podStartSLOduration=2.054924575 podStartE2EDuration="5.768438934s" podCreationTimestamp="2026-03-20 16:18:31 +0000 UTC" firstStartedPulling="2026-03-20 16:18:32.692663974 +0000 UTC m=+3329.655179774" lastFinishedPulling="2026-03-20 
16:18:36.406178333 +0000 UTC m=+3333.368694133" observedRunningTime="2026-03-20 16:18:36.761558677 +0000 UTC m=+3333.724074467" watchObservedRunningTime="2026-03-20 16:18:36.768438934 +0000 UTC m=+3333.730954734" Mar 20 16:18:41 crc kubenswrapper[4779]: I0320 16:18:41.758162 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4rh2q" Mar 20 16:18:41 crc kubenswrapper[4779]: I0320 16:18:41.758814 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4rh2q" Mar 20 16:18:41 crc kubenswrapper[4779]: I0320 16:18:41.789703 4779 generic.go:334] "Generic (PLEG): container finished" podID="6a3dfa7b-74de-49b6-b47b-bd6492a35db6" containerID="7340ce2131a52364da6fabd5a8e23d0e4bdd06b3a4e5b34acb052719f0db2cfe" exitCode=0 Mar 20 16:18:41 crc kubenswrapper[4779]: I0320 16:18:41.789751 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ghk4r" event={"ID":"6a3dfa7b-74de-49b6-b47b-bd6492a35db6","Type":"ContainerDied","Data":"7340ce2131a52364da6fabd5a8e23d0e4bdd06b3a4e5b34acb052719f0db2cfe"} Mar 20 16:18:42 crc kubenswrapper[4779]: I0320 16:18:42.799838 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ghk4r" event={"ID":"6a3dfa7b-74de-49b6-b47b-bd6492a35db6","Type":"ContainerStarted","Data":"7ff053040ef74099320579d75388ccb4a26a0bd43fa4fbc3d2e63759bfdc95b3"} Mar 20 16:18:42 crc kubenswrapper[4779]: I0320 16:18:42.816326 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-4rh2q" podUID="02776554-42a1-498a-afd0-b5484bc05631" containerName="registry-server" probeResult="failure" output=< Mar 20 16:18:42 crc kubenswrapper[4779]: timeout: failed to connect service ":50051" within 1s Mar 20 16:18:42 crc kubenswrapper[4779]: > Mar 20 16:18:42 crc kubenswrapper[4779]: I0320 16:18:42.826822 4779 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ghk4r" podStartSLOduration=3.340728709 podStartE2EDuration="11.826800867s" podCreationTimestamp="2026-03-20 16:18:31 +0000 UTC" firstStartedPulling="2026-03-20 16:18:33.70396844 +0000 UTC m=+3330.666484240" lastFinishedPulling="2026-03-20 16:18:42.190040598 +0000 UTC m=+3339.152556398" observedRunningTime="2026-03-20 16:18:42.821401805 +0000 UTC m=+3339.783917595" watchObservedRunningTime="2026-03-20 16:18:42.826800867 +0000 UTC m=+3339.789316667" Mar 20 16:18:44 crc kubenswrapper[4779]: I0320 16:18:44.698361 4779 scope.go:117] "RemoveContainer" containerID="c591d48bb59c22bdf6540d24d359476316163842c02a7e45a6925091c7c19144" Mar 20 16:18:47 crc kubenswrapper[4779]: I0320 16:18:47.809909 4779 scope.go:117] "RemoveContainer" containerID="4a7db66155ab4c3a0ae7729cfdb28e2b784d98e18fea6cc1043b584edc741c1f" Mar 20 16:18:47 crc kubenswrapper[4779]: E0320 16:18:47.811384 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:18:51 crc kubenswrapper[4779]: I0320 16:18:51.807952 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4rh2q" Mar 20 16:18:51 crc kubenswrapper[4779]: I0320 16:18:51.853054 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4rh2q" Mar 20 16:18:51 crc kubenswrapper[4779]: I0320 16:18:51.967257 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ghk4r" Mar 20 
16:18:51 crc kubenswrapper[4779]: I0320 16:18:51.967291 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ghk4r" Mar 20 16:18:52 crc kubenswrapper[4779]: I0320 16:18:52.011602 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ghk4r" Mar 20 16:18:52 crc kubenswrapper[4779]: I0320 16:18:52.048796 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rh2q"] Mar 20 16:18:52 crc kubenswrapper[4779]: I0320 16:18:52.878647 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4rh2q" podUID="02776554-42a1-498a-afd0-b5484bc05631" containerName="registry-server" containerID="cri-o://83f742da0dd1ffb47cf8580bb7921fbc0aed045b0939eede766723f2015e1058" gracePeriod=2 Mar 20 16:18:52 crc kubenswrapper[4779]: I0320 16:18:52.931535 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ghk4r" Mar 20 16:18:53 crc kubenswrapper[4779]: I0320 16:18:53.533625 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4rh2q" Mar 20 16:18:53 crc kubenswrapper[4779]: I0320 16:18:53.712987 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02776554-42a1-498a-afd0-b5484bc05631-utilities\") pod \"02776554-42a1-498a-afd0-b5484bc05631\" (UID: \"02776554-42a1-498a-afd0-b5484bc05631\") " Mar 20 16:18:53 crc kubenswrapper[4779]: I0320 16:18:53.713039 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szktl\" (UniqueName: \"kubernetes.io/projected/02776554-42a1-498a-afd0-b5484bc05631-kube-api-access-szktl\") pod \"02776554-42a1-498a-afd0-b5484bc05631\" (UID: \"02776554-42a1-498a-afd0-b5484bc05631\") " Mar 20 16:18:53 crc kubenswrapper[4779]: I0320 16:18:53.713340 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02776554-42a1-498a-afd0-b5484bc05631-catalog-content\") pod \"02776554-42a1-498a-afd0-b5484bc05631\" (UID: \"02776554-42a1-498a-afd0-b5484bc05631\") " Mar 20 16:18:53 crc kubenswrapper[4779]: I0320 16:18:53.714342 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02776554-42a1-498a-afd0-b5484bc05631-utilities" (OuterVolumeSpecName: "utilities") pod "02776554-42a1-498a-afd0-b5484bc05631" (UID: "02776554-42a1-498a-afd0-b5484bc05631"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:18:53 crc kubenswrapper[4779]: I0320 16:18:53.723760 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02776554-42a1-498a-afd0-b5484bc05631-kube-api-access-szktl" (OuterVolumeSpecName: "kube-api-access-szktl") pod "02776554-42a1-498a-afd0-b5484bc05631" (UID: "02776554-42a1-498a-afd0-b5484bc05631"). InnerVolumeSpecName "kube-api-access-szktl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:18:53 crc kubenswrapper[4779]: I0320 16:18:53.740749 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02776554-42a1-498a-afd0-b5484bc05631-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "02776554-42a1-498a-afd0-b5484bc05631" (UID: "02776554-42a1-498a-afd0-b5484bc05631"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:18:53 crc kubenswrapper[4779]: I0320 16:18:53.817937 4779 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02776554-42a1-498a-afd0-b5484bc05631-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:18:53 crc kubenswrapper[4779]: I0320 16:18:53.817971 4779 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02776554-42a1-498a-afd0-b5484bc05631-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:18:53 crc kubenswrapper[4779]: I0320 16:18:53.817981 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szktl\" (UniqueName: \"kubernetes.io/projected/02776554-42a1-498a-afd0-b5484bc05631-kube-api-access-szktl\") on node \"crc\" DevicePath \"\"" Mar 20 16:18:53 crc kubenswrapper[4779]: I0320 16:18:53.887682 4779 generic.go:334] "Generic (PLEG): container finished" podID="02776554-42a1-498a-afd0-b5484bc05631" containerID="83f742da0dd1ffb47cf8580bb7921fbc0aed045b0939eede766723f2015e1058" exitCode=0 Mar 20 16:18:53 crc kubenswrapper[4779]: I0320 16:18:53.887796 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4rh2q" Mar 20 16:18:53 crc kubenswrapper[4779]: I0320 16:18:53.887841 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rh2q" event={"ID":"02776554-42a1-498a-afd0-b5484bc05631","Type":"ContainerDied","Data":"83f742da0dd1ffb47cf8580bb7921fbc0aed045b0939eede766723f2015e1058"} Mar 20 16:18:53 crc kubenswrapper[4779]: I0320 16:18:53.888546 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rh2q" event={"ID":"02776554-42a1-498a-afd0-b5484bc05631","Type":"ContainerDied","Data":"40910c54130fc1ee579c533bb84277aabf6207d2d220ab9a41dc66dbd07ed3b5"} Mar 20 16:18:53 crc kubenswrapper[4779]: I0320 16:18:53.888567 4779 scope.go:117] "RemoveContainer" containerID="83f742da0dd1ffb47cf8580bb7921fbc0aed045b0939eede766723f2015e1058" Mar 20 16:18:53 crc kubenswrapper[4779]: I0320 16:18:53.911371 4779 scope.go:117] "RemoveContainer" containerID="8e7a2f5231d78b20c7db723d2af0e5d753dc6ce8e21675203ba5c25896b1f95e" Mar 20 16:18:53 crc kubenswrapper[4779]: I0320 16:18:53.916312 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rh2q"] Mar 20 16:18:53 crc kubenswrapper[4779]: I0320 16:18:53.926823 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rh2q"] Mar 20 16:18:53 crc kubenswrapper[4779]: I0320 16:18:53.934466 4779 scope.go:117] "RemoveContainer" containerID="b6150b3265238858a2ee80a22e6f51a2c6f836b69bd8180816170038652c4b14" Mar 20 16:18:53 crc kubenswrapper[4779]: I0320 16:18:53.981364 4779 scope.go:117] "RemoveContainer" containerID="83f742da0dd1ffb47cf8580bb7921fbc0aed045b0939eede766723f2015e1058" Mar 20 16:18:53 crc kubenswrapper[4779]: E0320 16:18:53.982427 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"83f742da0dd1ffb47cf8580bb7921fbc0aed045b0939eede766723f2015e1058\": container with ID starting with 83f742da0dd1ffb47cf8580bb7921fbc0aed045b0939eede766723f2015e1058 not found: ID does not exist" containerID="83f742da0dd1ffb47cf8580bb7921fbc0aed045b0939eede766723f2015e1058" Mar 20 16:18:53 crc kubenswrapper[4779]: I0320 16:18:53.982471 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83f742da0dd1ffb47cf8580bb7921fbc0aed045b0939eede766723f2015e1058"} err="failed to get container status \"83f742da0dd1ffb47cf8580bb7921fbc0aed045b0939eede766723f2015e1058\": rpc error: code = NotFound desc = could not find container \"83f742da0dd1ffb47cf8580bb7921fbc0aed045b0939eede766723f2015e1058\": container with ID starting with 83f742da0dd1ffb47cf8580bb7921fbc0aed045b0939eede766723f2015e1058 not found: ID does not exist" Mar 20 16:18:53 crc kubenswrapper[4779]: I0320 16:18:53.982524 4779 scope.go:117] "RemoveContainer" containerID="8e7a2f5231d78b20c7db723d2af0e5d753dc6ce8e21675203ba5c25896b1f95e" Mar 20 16:18:53 crc kubenswrapper[4779]: E0320 16:18:53.982922 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e7a2f5231d78b20c7db723d2af0e5d753dc6ce8e21675203ba5c25896b1f95e\": container with ID starting with 8e7a2f5231d78b20c7db723d2af0e5d753dc6ce8e21675203ba5c25896b1f95e not found: ID does not exist" containerID="8e7a2f5231d78b20c7db723d2af0e5d753dc6ce8e21675203ba5c25896b1f95e" Mar 20 16:18:53 crc kubenswrapper[4779]: I0320 16:18:53.982943 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e7a2f5231d78b20c7db723d2af0e5d753dc6ce8e21675203ba5c25896b1f95e"} err="failed to get container status \"8e7a2f5231d78b20c7db723d2af0e5d753dc6ce8e21675203ba5c25896b1f95e\": rpc error: code = NotFound desc = could not find container \"8e7a2f5231d78b20c7db723d2af0e5d753dc6ce8e21675203ba5c25896b1f95e\": container with ID 
starting with 8e7a2f5231d78b20c7db723d2af0e5d753dc6ce8e21675203ba5c25896b1f95e not found: ID does not exist" Mar 20 16:18:53 crc kubenswrapper[4779]: I0320 16:18:53.982955 4779 scope.go:117] "RemoveContainer" containerID="b6150b3265238858a2ee80a22e6f51a2c6f836b69bd8180816170038652c4b14" Mar 20 16:18:53 crc kubenswrapper[4779]: E0320 16:18:53.983210 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6150b3265238858a2ee80a22e6f51a2c6f836b69bd8180816170038652c4b14\": container with ID starting with b6150b3265238858a2ee80a22e6f51a2c6f836b69bd8180816170038652c4b14 not found: ID does not exist" containerID="b6150b3265238858a2ee80a22e6f51a2c6f836b69bd8180816170038652c4b14" Mar 20 16:18:53 crc kubenswrapper[4779]: I0320 16:18:53.983252 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6150b3265238858a2ee80a22e6f51a2c6f836b69bd8180816170038652c4b14"} err="failed to get container status \"b6150b3265238858a2ee80a22e6f51a2c6f836b69bd8180816170038652c4b14\": rpc error: code = NotFound desc = could not find container \"b6150b3265238858a2ee80a22e6f51a2c6f836b69bd8180816170038652c4b14\": container with ID starting with b6150b3265238858a2ee80a22e6f51a2c6f836b69bd8180816170038652c4b14 not found: ID does not exist" Mar 20 16:18:54 crc kubenswrapper[4779]: I0320 16:18:54.240652 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ghk4r"] Mar 20 16:18:54 crc kubenswrapper[4779]: I0320 16:18:54.900311 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ghk4r" podUID="6a3dfa7b-74de-49b6-b47b-bd6492a35db6" containerName="registry-server" containerID="cri-o://7ff053040ef74099320579d75388ccb4a26a0bd43fa4fbc3d2e63759bfdc95b3" gracePeriod=2 Mar 20 16:18:55 crc kubenswrapper[4779]: I0320 16:18:55.388522 4779 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ghk4r" Mar 20 16:18:55 crc kubenswrapper[4779]: I0320 16:18:55.448767 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a3dfa7b-74de-49b6-b47b-bd6492a35db6-catalog-content\") pod \"6a3dfa7b-74de-49b6-b47b-bd6492a35db6\" (UID: \"6a3dfa7b-74de-49b6-b47b-bd6492a35db6\") " Mar 20 16:18:55 crc kubenswrapper[4779]: I0320 16:18:55.448932 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a3dfa7b-74de-49b6-b47b-bd6492a35db6-utilities\") pod \"6a3dfa7b-74de-49b6-b47b-bd6492a35db6\" (UID: \"6a3dfa7b-74de-49b6-b47b-bd6492a35db6\") " Mar 20 16:18:55 crc kubenswrapper[4779]: I0320 16:18:55.449132 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8z6f\" (UniqueName: \"kubernetes.io/projected/6a3dfa7b-74de-49b6-b47b-bd6492a35db6-kube-api-access-d8z6f\") pod \"6a3dfa7b-74de-49b6-b47b-bd6492a35db6\" (UID: \"6a3dfa7b-74de-49b6-b47b-bd6492a35db6\") " Mar 20 16:18:55 crc kubenswrapper[4779]: I0320 16:18:55.449561 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a3dfa7b-74de-49b6-b47b-bd6492a35db6-utilities" (OuterVolumeSpecName: "utilities") pod "6a3dfa7b-74de-49b6-b47b-bd6492a35db6" (UID: "6a3dfa7b-74de-49b6-b47b-bd6492a35db6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:18:55 crc kubenswrapper[4779]: I0320 16:18:55.449923 4779 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a3dfa7b-74de-49b6-b47b-bd6492a35db6-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:18:55 crc kubenswrapper[4779]: I0320 16:18:55.454215 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a3dfa7b-74de-49b6-b47b-bd6492a35db6-kube-api-access-d8z6f" (OuterVolumeSpecName: "kube-api-access-d8z6f") pod "6a3dfa7b-74de-49b6-b47b-bd6492a35db6" (UID: "6a3dfa7b-74de-49b6-b47b-bd6492a35db6"). InnerVolumeSpecName "kube-api-access-d8z6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:18:55 crc kubenswrapper[4779]: I0320 16:18:55.551518 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8z6f\" (UniqueName: \"kubernetes.io/projected/6a3dfa7b-74de-49b6-b47b-bd6492a35db6-kube-api-access-d8z6f\") on node \"crc\" DevicePath \"\"" Mar 20 16:18:55 crc kubenswrapper[4779]: I0320 16:18:55.603237 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a3dfa7b-74de-49b6-b47b-bd6492a35db6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a3dfa7b-74de-49b6-b47b-bd6492a35db6" (UID: "6a3dfa7b-74de-49b6-b47b-bd6492a35db6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:18:55 crc kubenswrapper[4779]: I0320 16:18:55.652990 4779 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a3dfa7b-74de-49b6-b47b-bd6492a35db6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:18:55 crc kubenswrapper[4779]: I0320 16:18:55.822148 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02776554-42a1-498a-afd0-b5484bc05631" path="/var/lib/kubelet/pods/02776554-42a1-498a-afd0-b5484bc05631/volumes" Mar 20 16:18:55 crc kubenswrapper[4779]: I0320 16:18:55.915256 4779 generic.go:334] "Generic (PLEG): container finished" podID="6a3dfa7b-74de-49b6-b47b-bd6492a35db6" containerID="7ff053040ef74099320579d75388ccb4a26a0bd43fa4fbc3d2e63759bfdc95b3" exitCode=0 Mar 20 16:18:55 crc kubenswrapper[4779]: I0320 16:18:55.915295 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ghk4r" event={"ID":"6a3dfa7b-74de-49b6-b47b-bd6492a35db6","Type":"ContainerDied","Data":"7ff053040ef74099320579d75388ccb4a26a0bd43fa4fbc3d2e63759bfdc95b3"} Mar 20 16:18:55 crc kubenswrapper[4779]: I0320 16:18:55.915573 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ghk4r" event={"ID":"6a3dfa7b-74de-49b6-b47b-bd6492a35db6","Type":"ContainerDied","Data":"fac9c7a277d19a1f6fc9cbf06b4f491305a5faf5d325e9f63dc1c4743ca27342"} Mar 20 16:18:55 crc kubenswrapper[4779]: I0320 16:18:55.915597 4779 scope.go:117] "RemoveContainer" containerID="7ff053040ef74099320579d75388ccb4a26a0bd43fa4fbc3d2e63759bfdc95b3" Mar 20 16:18:55 crc kubenswrapper[4779]: I0320 16:18:55.915363 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ghk4r" Mar 20 16:18:55 crc kubenswrapper[4779]: I0320 16:18:55.937867 4779 scope.go:117] "RemoveContainer" containerID="7340ce2131a52364da6fabd5a8e23d0e4bdd06b3a4e5b34acb052719f0db2cfe" Mar 20 16:18:55 crc kubenswrapper[4779]: I0320 16:18:55.944256 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ghk4r"] Mar 20 16:18:55 crc kubenswrapper[4779]: I0320 16:18:55.955515 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ghk4r"] Mar 20 16:18:55 crc kubenswrapper[4779]: I0320 16:18:55.963942 4779 scope.go:117] "RemoveContainer" containerID="2291acd6c62b5229ae9cf4d0bed0695628a42f8331336264d0bbe871c3027665" Mar 20 16:18:56 crc kubenswrapper[4779]: I0320 16:18:56.028563 4779 scope.go:117] "RemoveContainer" containerID="7ff053040ef74099320579d75388ccb4a26a0bd43fa4fbc3d2e63759bfdc95b3" Mar 20 16:18:56 crc kubenswrapper[4779]: E0320 16:18:56.029036 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ff053040ef74099320579d75388ccb4a26a0bd43fa4fbc3d2e63759bfdc95b3\": container with ID starting with 7ff053040ef74099320579d75388ccb4a26a0bd43fa4fbc3d2e63759bfdc95b3 not found: ID does not exist" containerID="7ff053040ef74099320579d75388ccb4a26a0bd43fa4fbc3d2e63759bfdc95b3" Mar 20 16:18:56 crc kubenswrapper[4779]: I0320 16:18:56.029075 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ff053040ef74099320579d75388ccb4a26a0bd43fa4fbc3d2e63759bfdc95b3"} err="failed to get container status \"7ff053040ef74099320579d75388ccb4a26a0bd43fa4fbc3d2e63759bfdc95b3\": rpc error: code = NotFound desc = could not find container \"7ff053040ef74099320579d75388ccb4a26a0bd43fa4fbc3d2e63759bfdc95b3\": container with ID starting with 7ff053040ef74099320579d75388ccb4a26a0bd43fa4fbc3d2e63759bfdc95b3 not found: ID does 
not exist" Mar 20 16:18:56 crc kubenswrapper[4779]: I0320 16:18:56.029099 4779 scope.go:117] "RemoveContainer" containerID="7340ce2131a52364da6fabd5a8e23d0e4bdd06b3a4e5b34acb052719f0db2cfe" Mar 20 16:18:56 crc kubenswrapper[4779]: E0320 16:18:56.029533 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7340ce2131a52364da6fabd5a8e23d0e4bdd06b3a4e5b34acb052719f0db2cfe\": container with ID starting with 7340ce2131a52364da6fabd5a8e23d0e4bdd06b3a4e5b34acb052719f0db2cfe not found: ID does not exist" containerID="7340ce2131a52364da6fabd5a8e23d0e4bdd06b3a4e5b34acb052719f0db2cfe" Mar 20 16:18:56 crc kubenswrapper[4779]: I0320 16:18:56.029567 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7340ce2131a52364da6fabd5a8e23d0e4bdd06b3a4e5b34acb052719f0db2cfe"} err="failed to get container status \"7340ce2131a52364da6fabd5a8e23d0e4bdd06b3a4e5b34acb052719f0db2cfe\": rpc error: code = NotFound desc = could not find container \"7340ce2131a52364da6fabd5a8e23d0e4bdd06b3a4e5b34acb052719f0db2cfe\": container with ID starting with 7340ce2131a52364da6fabd5a8e23d0e4bdd06b3a4e5b34acb052719f0db2cfe not found: ID does not exist" Mar 20 16:18:56 crc kubenswrapper[4779]: I0320 16:18:56.029584 4779 scope.go:117] "RemoveContainer" containerID="2291acd6c62b5229ae9cf4d0bed0695628a42f8331336264d0bbe871c3027665" Mar 20 16:18:56 crc kubenswrapper[4779]: E0320 16:18:56.029853 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2291acd6c62b5229ae9cf4d0bed0695628a42f8331336264d0bbe871c3027665\": container with ID starting with 2291acd6c62b5229ae9cf4d0bed0695628a42f8331336264d0bbe871c3027665 not found: ID does not exist" containerID="2291acd6c62b5229ae9cf4d0bed0695628a42f8331336264d0bbe871c3027665" Mar 20 16:18:56 crc kubenswrapper[4779]: I0320 16:18:56.029878 4779 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2291acd6c62b5229ae9cf4d0bed0695628a42f8331336264d0bbe871c3027665"} err="failed to get container status \"2291acd6c62b5229ae9cf4d0bed0695628a42f8331336264d0bbe871c3027665\": rpc error: code = NotFound desc = could not find container \"2291acd6c62b5229ae9cf4d0bed0695628a42f8331336264d0bbe871c3027665\": container with ID starting with 2291acd6c62b5229ae9cf4d0bed0695628a42f8331336264d0bbe871c3027665 not found: ID does not exist" Mar 20 16:18:57 crc kubenswrapper[4779]: I0320 16:18:57.819100 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a3dfa7b-74de-49b6-b47b-bd6492a35db6" path="/var/lib/kubelet/pods/6a3dfa7b-74de-49b6-b47b-bd6492a35db6/volumes" Mar 20 16:18:59 crc kubenswrapper[4779]: I0320 16:18:59.809600 4779 scope.go:117] "RemoveContainer" containerID="4a7db66155ab4c3a0ae7729cfdb28e2b784d98e18fea6cc1043b584edc741c1f" Mar 20 16:18:59 crc kubenswrapper[4779]: E0320 16:18:59.810769 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:19:11 crc kubenswrapper[4779]: I0320 16:19:11.808653 4779 scope.go:117] "RemoveContainer" containerID="4a7db66155ab4c3a0ae7729cfdb28e2b784d98e18fea6cc1043b584edc741c1f" Mar 20 16:19:11 crc kubenswrapper[4779]: E0320 16:19:11.809440 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:19:26 crc kubenswrapper[4779]: I0320 16:19:26.809057 4779 scope.go:117] "RemoveContainer" containerID="4a7db66155ab4c3a0ae7729cfdb28e2b784d98e18fea6cc1043b584edc741c1f" Mar 20 16:19:27 crc kubenswrapper[4779]: I0320 16:19:27.176138 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" event={"ID":"451fc579-db57-4b36-a775-6d2986de3efc","Type":"ContainerStarted","Data":"69f71fe86c460e6ddfed9470ce53b0c78d6258579872b569994718a9cc92ee8d"} Mar 20 16:20:00 crc kubenswrapper[4779]: I0320 16:20:00.141671 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567060-plhzb"] Mar 20 16:20:00 crc kubenswrapper[4779]: E0320 16:20:00.142588 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02776554-42a1-498a-afd0-b5484bc05631" containerName="extract-utilities" Mar 20 16:20:00 crc kubenswrapper[4779]: I0320 16:20:00.142600 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="02776554-42a1-498a-afd0-b5484bc05631" containerName="extract-utilities" Mar 20 16:20:00 crc kubenswrapper[4779]: E0320 16:20:00.142618 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a3dfa7b-74de-49b6-b47b-bd6492a35db6" containerName="extract-content" Mar 20 16:20:00 crc kubenswrapper[4779]: I0320 16:20:00.142624 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a3dfa7b-74de-49b6-b47b-bd6492a35db6" containerName="extract-content" Mar 20 16:20:00 crc kubenswrapper[4779]: E0320 16:20:00.142637 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02776554-42a1-498a-afd0-b5484bc05631" containerName="registry-server" Mar 20 16:20:00 crc kubenswrapper[4779]: I0320 16:20:00.142643 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="02776554-42a1-498a-afd0-b5484bc05631" containerName="registry-server" 
Mar 20 16:20:00 crc kubenswrapper[4779]: E0320 16:20:00.142656 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a3dfa7b-74de-49b6-b47b-bd6492a35db6" containerName="extract-utilities" Mar 20 16:20:00 crc kubenswrapper[4779]: I0320 16:20:00.142662 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a3dfa7b-74de-49b6-b47b-bd6492a35db6" containerName="extract-utilities" Mar 20 16:20:00 crc kubenswrapper[4779]: E0320 16:20:00.142682 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02776554-42a1-498a-afd0-b5484bc05631" containerName="extract-content" Mar 20 16:20:00 crc kubenswrapper[4779]: I0320 16:20:00.142687 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="02776554-42a1-498a-afd0-b5484bc05631" containerName="extract-content" Mar 20 16:20:00 crc kubenswrapper[4779]: E0320 16:20:00.142696 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a3dfa7b-74de-49b6-b47b-bd6492a35db6" containerName="registry-server" Mar 20 16:20:00 crc kubenswrapper[4779]: I0320 16:20:00.142702 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a3dfa7b-74de-49b6-b47b-bd6492a35db6" containerName="registry-server" Mar 20 16:20:00 crc kubenswrapper[4779]: I0320 16:20:00.142963 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a3dfa7b-74de-49b6-b47b-bd6492a35db6" containerName="registry-server" Mar 20 16:20:00 crc kubenswrapper[4779]: I0320 16:20:00.142989 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="02776554-42a1-498a-afd0-b5484bc05631" containerName="registry-server" Mar 20 16:20:00 crc kubenswrapper[4779]: I0320 16:20:00.143738 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567060-plhzb" Mar 20 16:20:00 crc kubenswrapper[4779]: I0320 16:20:00.147577 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:20:00 crc kubenswrapper[4779]: I0320 16:20:00.148151 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-92t9d" Mar 20 16:20:00 crc kubenswrapper[4779]: I0320 16:20:00.149064 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:20:00 crc kubenswrapper[4779]: I0320 16:20:00.155603 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567060-plhzb"] Mar 20 16:20:00 crc kubenswrapper[4779]: I0320 16:20:00.242425 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm496\" (UniqueName: \"kubernetes.io/projected/f7bca85e-0c5f-40fa-9d27-291671d6ca7f-kube-api-access-pm496\") pod \"auto-csr-approver-29567060-plhzb\" (UID: \"f7bca85e-0c5f-40fa-9d27-291671d6ca7f\") " pod="openshift-infra/auto-csr-approver-29567060-plhzb" Mar 20 16:20:00 crc kubenswrapper[4779]: I0320 16:20:00.345227 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm496\" (UniqueName: \"kubernetes.io/projected/f7bca85e-0c5f-40fa-9d27-291671d6ca7f-kube-api-access-pm496\") pod \"auto-csr-approver-29567060-plhzb\" (UID: \"f7bca85e-0c5f-40fa-9d27-291671d6ca7f\") " pod="openshift-infra/auto-csr-approver-29567060-plhzb" Mar 20 16:20:00 crc kubenswrapper[4779]: I0320 16:20:00.363596 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm496\" (UniqueName: \"kubernetes.io/projected/f7bca85e-0c5f-40fa-9d27-291671d6ca7f-kube-api-access-pm496\") pod \"auto-csr-approver-29567060-plhzb\" (UID: \"f7bca85e-0c5f-40fa-9d27-291671d6ca7f\") " 
pod="openshift-infra/auto-csr-approver-29567060-plhzb" Mar 20 16:20:00 crc kubenswrapper[4779]: I0320 16:20:00.464045 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567060-plhzb" Mar 20 16:20:00 crc kubenswrapper[4779]: I0320 16:20:00.918768 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567060-plhzb"] Mar 20 16:20:01 crc kubenswrapper[4779]: I0320 16:20:01.482666 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567060-plhzb" event={"ID":"f7bca85e-0c5f-40fa-9d27-291671d6ca7f","Type":"ContainerStarted","Data":"3a73a80a65b8019178bf427b87a070f84c65e505adbc9aaba43317e9eecf5745"} Mar 20 16:20:03 crc kubenswrapper[4779]: I0320 16:20:03.508531 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567060-plhzb" event={"ID":"f7bca85e-0c5f-40fa-9d27-291671d6ca7f","Type":"ContainerStarted","Data":"fae44409707f43a0ddb78f7126a653009e991b10ad77afa4c9754fc964db7ac5"} Mar 20 16:20:03 crc kubenswrapper[4779]: I0320 16:20:03.527339 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567060-plhzb" podStartSLOduration=1.380614956 podStartE2EDuration="3.527319544s" podCreationTimestamp="2026-03-20 16:20:00 +0000 UTC" firstStartedPulling="2026-03-20 16:20:00.92136925 +0000 UTC m=+3417.883885150" lastFinishedPulling="2026-03-20 16:20:03.068073938 +0000 UTC m=+3420.030589738" observedRunningTime="2026-03-20 16:20:03.519476434 +0000 UTC m=+3420.481992234" watchObservedRunningTime="2026-03-20 16:20:03.527319544 +0000 UTC m=+3420.489835344" Mar 20 16:20:04 crc kubenswrapper[4779]: I0320 16:20:04.517523 4779 generic.go:334] "Generic (PLEG): container finished" podID="f7bca85e-0c5f-40fa-9d27-291671d6ca7f" containerID="fae44409707f43a0ddb78f7126a653009e991b10ad77afa4c9754fc964db7ac5" exitCode=0 Mar 20 16:20:04 crc 
kubenswrapper[4779]: I0320 16:20:04.517568 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567060-plhzb" event={"ID":"f7bca85e-0c5f-40fa-9d27-291671d6ca7f","Type":"ContainerDied","Data":"fae44409707f43a0ddb78f7126a653009e991b10ad77afa4c9754fc964db7ac5"} Mar 20 16:20:05 crc kubenswrapper[4779]: I0320 16:20:05.946307 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567060-plhzb" Mar 20 16:20:06 crc kubenswrapper[4779]: I0320 16:20:06.071429 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm496\" (UniqueName: \"kubernetes.io/projected/f7bca85e-0c5f-40fa-9d27-291671d6ca7f-kube-api-access-pm496\") pod \"f7bca85e-0c5f-40fa-9d27-291671d6ca7f\" (UID: \"f7bca85e-0c5f-40fa-9d27-291671d6ca7f\") " Mar 20 16:20:06 crc kubenswrapper[4779]: I0320 16:20:06.077654 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7bca85e-0c5f-40fa-9d27-291671d6ca7f-kube-api-access-pm496" (OuterVolumeSpecName: "kube-api-access-pm496") pod "f7bca85e-0c5f-40fa-9d27-291671d6ca7f" (UID: "f7bca85e-0c5f-40fa-9d27-291671d6ca7f"). InnerVolumeSpecName "kube-api-access-pm496". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:20:06 crc kubenswrapper[4779]: I0320 16:20:06.174525 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pm496\" (UniqueName: \"kubernetes.io/projected/f7bca85e-0c5f-40fa-9d27-291671d6ca7f-kube-api-access-pm496\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:06 crc kubenswrapper[4779]: I0320 16:20:06.537570 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567060-plhzb" event={"ID":"f7bca85e-0c5f-40fa-9d27-291671d6ca7f","Type":"ContainerDied","Data":"3a73a80a65b8019178bf427b87a070f84c65e505adbc9aaba43317e9eecf5745"} Mar 20 16:20:06 crc kubenswrapper[4779]: I0320 16:20:06.537612 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a73a80a65b8019178bf427b87a070f84c65e505adbc9aaba43317e9eecf5745" Mar 20 16:20:06 crc kubenswrapper[4779]: I0320 16:20:06.537654 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567060-plhzb" Mar 20 16:20:06 crc kubenswrapper[4779]: I0320 16:20:06.587338 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567054-2s45s"] Mar 20 16:20:06 crc kubenswrapper[4779]: I0320 16:20:06.595196 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567054-2s45s"] Mar 20 16:20:07 crc kubenswrapper[4779]: I0320 16:20:07.818130 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7bfb129-2346-4e46-afb8-81c888be46dc" path="/var/lib/kubelet/pods/a7bfb129-2346-4e46-afb8-81c888be46dc/volumes" Mar 20 16:20:44 crc kubenswrapper[4779]: I0320 16:20:44.813297 4779 scope.go:117] "RemoveContainer" containerID="b329ddf0121ebabeb736a53e44487aaf5f8f3b3e28fb7b6ef0e7b4ff756d891e" Mar 20 16:21:55 crc kubenswrapper[4779]: I0320 16:21:55.150543 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:21:55 crc kubenswrapper[4779]: I0320 16:21:55.151029 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:21:59 crc kubenswrapper[4779]: I0320 16:21:59.284222 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4s5h2"] Mar 20 16:21:59 crc kubenswrapper[4779]: E0320 16:21:59.285507 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7bca85e-0c5f-40fa-9d27-291671d6ca7f" containerName="oc" Mar 20 16:21:59 crc kubenswrapper[4779]: I0320 16:21:59.285527 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7bca85e-0c5f-40fa-9d27-291671d6ca7f" containerName="oc" Mar 20 16:21:59 crc kubenswrapper[4779]: I0320 16:21:59.285785 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7bca85e-0c5f-40fa-9d27-291671d6ca7f" containerName="oc" Mar 20 16:21:59 crc kubenswrapper[4779]: I0320 16:21:59.288136 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4s5h2" Mar 20 16:21:59 crc kubenswrapper[4779]: I0320 16:21:59.298178 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4s5h2"] Mar 20 16:21:59 crc kubenswrapper[4779]: I0320 16:21:59.324092 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4-catalog-content\") pod \"community-operators-4s5h2\" (UID: \"1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4\") " pod="openshift-marketplace/community-operators-4s5h2" Mar 20 16:21:59 crc kubenswrapper[4779]: I0320 16:21:59.324500 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjvtz\" (UniqueName: \"kubernetes.io/projected/1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4-kube-api-access-cjvtz\") pod \"community-operators-4s5h2\" (UID: \"1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4\") " pod="openshift-marketplace/community-operators-4s5h2" Mar 20 16:21:59 crc kubenswrapper[4779]: I0320 16:21:59.324609 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4-utilities\") pod \"community-operators-4s5h2\" (UID: \"1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4\") " pod="openshift-marketplace/community-operators-4s5h2" Mar 20 16:21:59 crc kubenswrapper[4779]: I0320 16:21:59.426118 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4-utilities\") pod \"community-operators-4s5h2\" (UID: \"1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4\") " pod="openshift-marketplace/community-operators-4s5h2" Mar 20 16:21:59 crc kubenswrapper[4779]: I0320 16:21:59.426252 4779 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4-catalog-content\") pod \"community-operators-4s5h2\" (UID: \"1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4\") " pod="openshift-marketplace/community-operators-4s5h2" Mar 20 16:21:59 crc kubenswrapper[4779]: I0320 16:21:59.426297 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjvtz\" (UniqueName: \"kubernetes.io/projected/1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4-kube-api-access-cjvtz\") pod \"community-operators-4s5h2\" (UID: \"1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4\") " pod="openshift-marketplace/community-operators-4s5h2" Mar 20 16:21:59 crc kubenswrapper[4779]: I0320 16:21:59.426691 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4-utilities\") pod \"community-operators-4s5h2\" (UID: \"1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4\") " pod="openshift-marketplace/community-operators-4s5h2" Mar 20 16:21:59 crc kubenswrapper[4779]: I0320 16:21:59.426798 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4-catalog-content\") pod \"community-operators-4s5h2\" (UID: \"1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4\") " pod="openshift-marketplace/community-operators-4s5h2" Mar 20 16:21:59 crc kubenswrapper[4779]: I0320 16:21:59.450052 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjvtz\" (UniqueName: \"kubernetes.io/projected/1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4-kube-api-access-cjvtz\") pod \"community-operators-4s5h2\" (UID: \"1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4\") " pod="openshift-marketplace/community-operators-4s5h2" Mar 20 16:21:59 crc kubenswrapper[4779]: I0320 16:21:59.617996 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4s5h2" Mar 20 16:22:00 crc kubenswrapper[4779]: I0320 16:22:00.152333 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567062-mcj2h"] Mar 20 16:22:00 crc kubenswrapper[4779]: I0320 16:22:00.154368 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567062-mcj2h" Mar 20 16:22:00 crc kubenswrapper[4779]: I0320 16:22:00.156883 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:22:00 crc kubenswrapper[4779]: I0320 16:22:00.156883 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:22:00 crc kubenswrapper[4779]: I0320 16:22:00.156969 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-92t9d" Mar 20 16:22:00 crc kubenswrapper[4779]: I0320 16:22:00.163077 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567062-mcj2h"] Mar 20 16:22:00 crc kubenswrapper[4779]: I0320 16:22:00.177466 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4s5h2"] Mar 20 16:22:00 crc kubenswrapper[4779]: I0320 16:22:00.238962 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nbrs\" (UniqueName: \"kubernetes.io/projected/b681c5dc-c089-4921-b334-1817f709ca6f-kube-api-access-5nbrs\") pod \"auto-csr-approver-29567062-mcj2h\" (UID: \"b681c5dc-c089-4921-b334-1817f709ca6f\") " pod="openshift-infra/auto-csr-approver-29567062-mcj2h" Mar 20 16:22:00 crc kubenswrapper[4779]: I0320 16:22:00.340842 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nbrs\" (UniqueName: 
\"kubernetes.io/projected/b681c5dc-c089-4921-b334-1817f709ca6f-kube-api-access-5nbrs\") pod \"auto-csr-approver-29567062-mcj2h\" (UID: \"b681c5dc-c089-4921-b334-1817f709ca6f\") " pod="openshift-infra/auto-csr-approver-29567062-mcj2h" Mar 20 16:22:00 crc kubenswrapper[4779]: I0320 16:22:00.369269 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nbrs\" (UniqueName: \"kubernetes.io/projected/b681c5dc-c089-4921-b334-1817f709ca6f-kube-api-access-5nbrs\") pod \"auto-csr-approver-29567062-mcj2h\" (UID: \"b681c5dc-c089-4921-b334-1817f709ca6f\") " pod="openshift-infra/auto-csr-approver-29567062-mcj2h" Mar 20 16:22:00 crc kubenswrapper[4779]: I0320 16:22:00.476577 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567062-mcj2h" Mar 20 16:22:00 crc kubenswrapper[4779]: I0320 16:22:00.517197 4779 generic.go:334] "Generic (PLEG): container finished" podID="1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4" containerID="1c8402957ce7c05762379c8b67b677234b132062169d6adeaee87765509beaa4" exitCode=0 Mar 20 16:22:00 crc kubenswrapper[4779]: I0320 16:22:00.517245 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4s5h2" event={"ID":"1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4","Type":"ContainerDied","Data":"1c8402957ce7c05762379c8b67b677234b132062169d6adeaee87765509beaa4"} Mar 20 16:22:00 crc kubenswrapper[4779]: I0320 16:22:00.517278 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4s5h2" event={"ID":"1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4","Type":"ContainerStarted","Data":"54ce34c6c104b90470fdb9e3a6a1a74bec6e7c2e5382783df9571936f5afee6b"} Mar 20 16:22:00 crc kubenswrapper[4779]: I0320 16:22:00.520473 4779 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 16:22:00 crc kubenswrapper[4779]: I0320 16:22:00.937231 4779 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567062-mcj2h"] Mar 20 16:22:01 crc kubenswrapper[4779]: I0320 16:22:01.528724 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4s5h2" event={"ID":"1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4","Type":"ContainerStarted","Data":"3ea4a9a606dd1edce0f5b9dfe7314b108ace46adee98ff0efcad3757ae6914b3"} Mar 20 16:22:01 crc kubenswrapper[4779]: I0320 16:22:01.531737 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567062-mcj2h" event={"ID":"b681c5dc-c089-4921-b334-1817f709ca6f","Type":"ContainerStarted","Data":"2cbfb46c470bc9200450b4bf4c16f64277bbe0ba518fed32a900bdb19e8123c1"} Mar 20 16:22:02 crc kubenswrapper[4779]: I0320 16:22:02.541573 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567062-mcj2h" event={"ID":"b681c5dc-c089-4921-b334-1817f709ca6f","Type":"ContainerStarted","Data":"390821a777ce4156d0e9577380d9870d306381cecad54f94f96d7332023a12d5"} Mar 20 16:22:02 crc kubenswrapper[4779]: I0320 16:22:02.544019 4779 generic.go:334] "Generic (PLEG): container finished" podID="1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4" containerID="3ea4a9a606dd1edce0f5b9dfe7314b108ace46adee98ff0efcad3757ae6914b3" exitCode=0 Mar 20 16:22:02 crc kubenswrapper[4779]: I0320 16:22:02.544066 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4s5h2" event={"ID":"1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4","Type":"ContainerDied","Data":"3ea4a9a606dd1edce0f5b9dfe7314b108ace46adee98ff0efcad3757ae6914b3"} Mar 20 16:22:02 crc kubenswrapper[4779]: I0320 16:22:02.590009 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567062-mcj2h" podStartSLOduration=1.491149741 podStartE2EDuration="2.589986655s" podCreationTimestamp="2026-03-20 16:22:00 +0000 UTC" firstStartedPulling="2026-03-20 16:22:00.947012382 +0000 
UTC m=+3537.909528182" lastFinishedPulling="2026-03-20 16:22:02.045849296 +0000 UTC m=+3539.008365096" observedRunningTime="2026-03-20 16:22:02.560427806 +0000 UTC m=+3539.522943606" watchObservedRunningTime="2026-03-20 16:22:02.589986655 +0000 UTC m=+3539.552502455" Mar 20 16:22:03 crc kubenswrapper[4779]: I0320 16:22:03.555166 4779 generic.go:334] "Generic (PLEG): container finished" podID="b681c5dc-c089-4921-b334-1817f709ca6f" containerID="390821a777ce4156d0e9577380d9870d306381cecad54f94f96d7332023a12d5" exitCode=0 Mar 20 16:22:03 crc kubenswrapper[4779]: I0320 16:22:03.555223 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567062-mcj2h" event={"ID":"b681c5dc-c089-4921-b334-1817f709ca6f","Type":"ContainerDied","Data":"390821a777ce4156d0e9577380d9870d306381cecad54f94f96d7332023a12d5"} Mar 20 16:22:03 crc kubenswrapper[4779]: I0320 16:22:03.558649 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4s5h2" event={"ID":"1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4","Type":"ContainerStarted","Data":"4734239caed2a9cd0c0c3d349a8426b27bcf1e8fa7de87d8a8ea9286c671e3b0"} Mar 20 16:22:03 crc kubenswrapper[4779]: I0320 16:22:03.599501 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4s5h2" podStartSLOduration=2.211210651 podStartE2EDuration="4.599484008s" podCreationTimestamp="2026-03-20 16:21:59 +0000 UTC" firstStartedPulling="2026-03-20 16:22:00.520207684 +0000 UTC m=+3537.482723484" lastFinishedPulling="2026-03-20 16:22:02.908481041 +0000 UTC m=+3539.870996841" observedRunningTime="2026-03-20 16:22:03.595190993 +0000 UTC m=+3540.557706793" watchObservedRunningTime="2026-03-20 16:22:03.599484008 +0000 UTC m=+3540.561999808" Mar 20 16:22:04 crc kubenswrapper[4779]: I0320 16:22:04.961718 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567062-mcj2h" Mar 20 16:22:05 crc kubenswrapper[4779]: I0320 16:22:05.044150 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nbrs\" (UniqueName: \"kubernetes.io/projected/b681c5dc-c089-4921-b334-1817f709ca6f-kube-api-access-5nbrs\") pod \"b681c5dc-c089-4921-b334-1817f709ca6f\" (UID: \"b681c5dc-c089-4921-b334-1817f709ca6f\") " Mar 20 16:22:05 crc kubenswrapper[4779]: I0320 16:22:05.049829 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b681c5dc-c089-4921-b334-1817f709ca6f-kube-api-access-5nbrs" (OuterVolumeSpecName: "kube-api-access-5nbrs") pod "b681c5dc-c089-4921-b334-1817f709ca6f" (UID: "b681c5dc-c089-4921-b334-1817f709ca6f"). InnerVolumeSpecName "kube-api-access-5nbrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:05 crc kubenswrapper[4779]: I0320 16:22:05.146525 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nbrs\" (UniqueName: \"kubernetes.io/projected/b681c5dc-c089-4921-b334-1817f709ca6f-kube-api-access-5nbrs\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:05 crc kubenswrapper[4779]: I0320 16:22:05.583672 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567062-mcj2h" event={"ID":"b681c5dc-c089-4921-b334-1817f709ca6f","Type":"ContainerDied","Data":"2cbfb46c470bc9200450b4bf4c16f64277bbe0ba518fed32a900bdb19e8123c1"} Mar 20 16:22:05 crc kubenswrapper[4779]: I0320 16:22:05.584063 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cbfb46c470bc9200450b4bf4c16f64277bbe0ba518fed32a900bdb19e8123c1" Mar 20 16:22:05 crc kubenswrapper[4779]: I0320 16:22:05.583776 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567062-mcj2h" Mar 20 16:22:05 crc kubenswrapper[4779]: I0320 16:22:05.660351 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567056-tbwhl"] Mar 20 16:22:05 crc kubenswrapper[4779]: I0320 16:22:05.672753 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567056-tbwhl"] Mar 20 16:22:05 crc kubenswrapper[4779]: E0320 16:22:05.804885 4779 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb681c5dc_c089_4921_b334_1817f709ca6f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb681c5dc_c089_4921_b334_1817f709ca6f.slice/crio-2cbfb46c470bc9200450b4bf4c16f64277bbe0ba518fed32a900bdb19e8123c1\": RecentStats: unable to find data in memory cache]" Mar 20 16:22:05 crc kubenswrapper[4779]: I0320 16:22:05.821734 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad13a037-8a3f-41b1-9762-7a3f6defbf78" path="/var/lib/kubelet/pods/ad13a037-8a3f-41b1-9762-7a3f6defbf78/volumes" Mar 20 16:22:09 crc kubenswrapper[4779]: I0320 16:22:09.618657 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4s5h2" Mar 20 16:22:09 crc kubenswrapper[4779]: I0320 16:22:09.619003 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4s5h2" Mar 20 16:22:09 crc kubenswrapper[4779]: I0320 16:22:09.663190 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4s5h2" Mar 20 16:22:10 crc kubenswrapper[4779]: I0320 16:22:10.668004 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-4s5h2" Mar 20 16:22:10 crc kubenswrapper[4779]: I0320 16:22:10.735139 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4s5h2"] Mar 20 16:22:12 crc kubenswrapper[4779]: I0320 16:22:12.640676 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4s5h2" podUID="1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4" containerName="registry-server" containerID="cri-o://4734239caed2a9cd0c0c3d349a8426b27bcf1e8fa7de87d8a8ea9286c671e3b0" gracePeriod=2 Mar 20 16:22:13 crc kubenswrapper[4779]: I0320 16:22:13.117768 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4s5h2" Mar 20 16:22:13 crc kubenswrapper[4779]: I0320 16:22:13.208147 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjvtz\" (UniqueName: \"kubernetes.io/projected/1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4-kube-api-access-cjvtz\") pod \"1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4\" (UID: \"1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4\") " Mar 20 16:22:13 crc kubenswrapper[4779]: I0320 16:22:13.208557 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4-utilities\") pod \"1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4\" (UID: \"1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4\") " Mar 20 16:22:13 crc kubenswrapper[4779]: I0320 16:22:13.208641 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4-catalog-content\") pod \"1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4\" (UID: \"1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4\") " Mar 20 16:22:13 crc kubenswrapper[4779]: I0320 16:22:13.209663 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4-utilities" (OuterVolumeSpecName: "utilities") pod "1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4" (UID: "1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:22:13 crc kubenswrapper[4779]: I0320 16:22:13.214398 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4-kube-api-access-cjvtz" (OuterVolumeSpecName: "kube-api-access-cjvtz") pod "1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4" (UID: "1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4"). InnerVolumeSpecName "kube-api-access-cjvtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:13 crc kubenswrapper[4779]: I0320 16:22:13.258567 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4" (UID: "1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:22:13 crc kubenswrapper[4779]: I0320 16:22:13.311795 4779 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:13 crc kubenswrapper[4779]: I0320 16:22:13.311835 4779 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:13 crc kubenswrapper[4779]: I0320 16:22:13.311849 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjvtz\" (UniqueName: \"kubernetes.io/projected/1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4-kube-api-access-cjvtz\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:13 crc kubenswrapper[4779]: I0320 16:22:13.652298 4779 generic.go:334] "Generic (PLEG): container finished" podID="1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4" containerID="4734239caed2a9cd0c0c3d349a8426b27bcf1e8fa7de87d8a8ea9286c671e3b0" exitCode=0 Mar 20 16:22:13 crc kubenswrapper[4779]: I0320 16:22:13.652345 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4s5h2" event={"ID":"1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4","Type":"ContainerDied","Data":"4734239caed2a9cd0c0c3d349a8426b27bcf1e8fa7de87d8a8ea9286c671e3b0"} Mar 20 16:22:13 crc kubenswrapper[4779]: I0320 16:22:13.652377 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4s5h2" Mar 20 16:22:13 crc kubenswrapper[4779]: I0320 16:22:13.652445 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4s5h2" event={"ID":"1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4","Type":"ContainerDied","Data":"54ce34c6c104b90470fdb9e3a6a1a74bec6e7c2e5382783df9571936f5afee6b"} Mar 20 16:22:13 crc kubenswrapper[4779]: I0320 16:22:13.652467 4779 scope.go:117] "RemoveContainer" containerID="4734239caed2a9cd0c0c3d349a8426b27bcf1e8fa7de87d8a8ea9286c671e3b0" Mar 20 16:22:13 crc kubenswrapper[4779]: I0320 16:22:13.682576 4779 scope.go:117] "RemoveContainer" containerID="3ea4a9a606dd1edce0f5b9dfe7314b108ace46adee98ff0efcad3757ae6914b3" Mar 20 16:22:13 crc kubenswrapper[4779]: I0320 16:22:13.687556 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4s5h2"] Mar 20 16:22:13 crc kubenswrapper[4779]: I0320 16:22:13.698588 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4s5h2"] Mar 20 16:22:13 crc kubenswrapper[4779]: I0320 16:22:13.704804 4779 scope.go:117] "RemoveContainer" containerID="1c8402957ce7c05762379c8b67b677234b132062169d6adeaee87765509beaa4" Mar 20 16:22:13 crc kubenswrapper[4779]: I0320 16:22:13.748366 4779 scope.go:117] "RemoveContainer" containerID="4734239caed2a9cd0c0c3d349a8426b27bcf1e8fa7de87d8a8ea9286c671e3b0" Mar 20 16:22:13 crc kubenswrapper[4779]: E0320 16:22:13.748789 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4734239caed2a9cd0c0c3d349a8426b27bcf1e8fa7de87d8a8ea9286c671e3b0\": container with ID starting with 4734239caed2a9cd0c0c3d349a8426b27bcf1e8fa7de87d8a8ea9286c671e3b0 not found: ID does not exist" containerID="4734239caed2a9cd0c0c3d349a8426b27bcf1e8fa7de87d8a8ea9286c671e3b0" Mar 20 16:22:13 crc kubenswrapper[4779]: I0320 16:22:13.748823 4779 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4734239caed2a9cd0c0c3d349a8426b27bcf1e8fa7de87d8a8ea9286c671e3b0"} err="failed to get container status \"4734239caed2a9cd0c0c3d349a8426b27bcf1e8fa7de87d8a8ea9286c671e3b0\": rpc error: code = NotFound desc = could not find container \"4734239caed2a9cd0c0c3d349a8426b27bcf1e8fa7de87d8a8ea9286c671e3b0\": container with ID starting with 4734239caed2a9cd0c0c3d349a8426b27bcf1e8fa7de87d8a8ea9286c671e3b0 not found: ID does not exist" Mar 20 16:22:13 crc kubenswrapper[4779]: I0320 16:22:13.748846 4779 scope.go:117] "RemoveContainer" containerID="3ea4a9a606dd1edce0f5b9dfe7314b108ace46adee98ff0efcad3757ae6914b3" Mar 20 16:22:13 crc kubenswrapper[4779]: E0320 16:22:13.749062 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ea4a9a606dd1edce0f5b9dfe7314b108ace46adee98ff0efcad3757ae6914b3\": container with ID starting with 3ea4a9a606dd1edce0f5b9dfe7314b108ace46adee98ff0efcad3757ae6914b3 not found: ID does not exist" containerID="3ea4a9a606dd1edce0f5b9dfe7314b108ace46adee98ff0efcad3757ae6914b3" Mar 20 16:22:13 crc kubenswrapper[4779]: I0320 16:22:13.749120 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ea4a9a606dd1edce0f5b9dfe7314b108ace46adee98ff0efcad3757ae6914b3"} err="failed to get container status \"3ea4a9a606dd1edce0f5b9dfe7314b108ace46adee98ff0efcad3757ae6914b3\": rpc error: code = NotFound desc = could not find container \"3ea4a9a606dd1edce0f5b9dfe7314b108ace46adee98ff0efcad3757ae6914b3\": container with ID starting with 3ea4a9a606dd1edce0f5b9dfe7314b108ace46adee98ff0efcad3757ae6914b3 not found: ID does not exist" Mar 20 16:22:13 crc kubenswrapper[4779]: I0320 16:22:13.749138 4779 scope.go:117] "RemoveContainer" containerID="1c8402957ce7c05762379c8b67b677234b132062169d6adeaee87765509beaa4" Mar 20 16:22:13 crc kubenswrapper[4779]: E0320 
16:22:13.749430 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c8402957ce7c05762379c8b67b677234b132062169d6adeaee87765509beaa4\": container with ID starting with 1c8402957ce7c05762379c8b67b677234b132062169d6adeaee87765509beaa4 not found: ID does not exist" containerID="1c8402957ce7c05762379c8b67b677234b132062169d6adeaee87765509beaa4" Mar 20 16:22:13 crc kubenswrapper[4779]: I0320 16:22:13.749484 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c8402957ce7c05762379c8b67b677234b132062169d6adeaee87765509beaa4"} err="failed to get container status \"1c8402957ce7c05762379c8b67b677234b132062169d6adeaee87765509beaa4\": rpc error: code = NotFound desc = could not find container \"1c8402957ce7c05762379c8b67b677234b132062169d6adeaee87765509beaa4\": container with ID starting with 1c8402957ce7c05762379c8b67b677234b132062169d6adeaee87765509beaa4 not found: ID does not exist" Mar 20 16:22:13 crc kubenswrapper[4779]: I0320 16:22:13.825543 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4" path="/var/lib/kubelet/pods/1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4/volumes" Mar 20 16:22:25 crc kubenswrapper[4779]: I0320 16:22:25.150033 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:22:25 crc kubenswrapper[4779]: I0320 16:22:25.150639 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 20 16:22:44 crc kubenswrapper[4779]: I0320 16:22:44.915168 4779 scope.go:117] "RemoveContainer" containerID="71e1aaf7ca85e65be539761e70f4d7b4f3af3ad0afdbd518b6839979ab588783" Mar 20 16:22:55 crc kubenswrapper[4779]: I0320 16:22:55.149960 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:22:55 crc kubenswrapper[4779]: I0320 16:22:55.150608 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:22:55 crc kubenswrapper[4779]: I0320 16:22:55.150653 4779 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" Mar 20 16:22:55 crc kubenswrapper[4779]: I0320 16:22:55.151414 4779 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"69f71fe86c460e6ddfed9470ce53b0c78d6258579872b569994718a9cc92ee8d"} pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 16:22:55 crc kubenswrapper[4779]: I0320 16:22:55.151466 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" containerID="cri-o://69f71fe86c460e6ddfed9470ce53b0c78d6258579872b569994718a9cc92ee8d" gracePeriod=600 Mar 20 16:22:56 crc 
kubenswrapper[4779]: I0320 16:22:56.199948 4779 generic.go:334] "Generic (PLEG): container finished" podID="451fc579-db57-4b36-a775-6d2986de3efc" containerID="69f71fe86c460e6ddfed9470ce53b0c78d6258579872b569994718a9cc92ee8d" exitCode=0 Mar 20 16:22:56 crc kubenswrapper[4779]: I0320 16:22:56.200022 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" event={"ID":"451fc579-db57-4b36-a775-6d2986de3efc","Type":"ContainerDied","Data":"69f71fe86c460e6ddfed9470ce53b0c78d6258579872b569994718a9cc92ee8d"} Mar 20 16:22:56 crc kubenswrapper[4779]: I0320 16:22:56.200483 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" event={"ID":"451fc579-db57-4b36-a775-6d2986de3efc","Type":"ContainerStarted","Data":"0d4a4381088e0c6e4439f94508ac137582cc59c538515a9138610168fa6ff57d"} Mar 20 16:22:56 crc kubenswrapper[4779]: I0320 16:22:56.200511 4779 scope.go:117] "RemoveContainer" containerID="4a7db66155ab4c3a0ae7729cfdb28e2b784d98e18fea6cc1043b584edc741c1f" Mar 20 16:24:00 crc kubenswrapper[4779]: I0320 16:24:00.143811 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567064-8775c"] Mar 20 16:24:00 crc kubenswrapper[4779]: E0320 16:24:00.144839 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4" containerName="extract-content" Mar 20 16:24:00 crc kubenswrapper[4779]: I0320 16:24:00.144856 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4" containerName="extract-content" Mar 20 16:24:00 crc kubenswrapper[4779]: E0320 16:24:00.144876 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b681c5dc-c089-4921-b334-1817f709ca6f" containerName="oc" Mar 20 16:24:00 crc kubenswrapper[4779]: I0320 16:24:00.144884 4779 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b681c5dc-c089-4921-b334-1817f709ca6f" containerName="oc" Mar 20 16:24:00 crc kubenswrapper[4779]: E0320 16:24:00.144912 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4" containerName="registry-server" Mar 20 16:24:00 crc kubenswrapper[4779]: I0320 16:24:00.144919 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4" containerName="registry-server" Mar 20 16:24:00 crc kubenswrapper[4779]: E0320 16:24:00.144949 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4" containerName="extract-utilities" Mar 20 16:24:00 crc kubenswrapper[4779]: I0320 16:24:00.144956 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4" containerName="extract-utilities" Mar 20 16:24:00 crc kubenswrapper[4779]: I0320 16:24:00.145191 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="b681c5dc-c089-4921-b334-1817f709ca6f" containerName="oc" Mar 20 16:24:00 crc kubenswrapper[4779]: I0320 16:24:00.145207 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ed7e3da-6ce0-40ae-8a11-6c8f00b9d5d4" containerName="registry-server" Mar 20 16:24:00 crc kubenswrapper[4779]: I0320 16:24:00.145932 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567064-8775c" Mar 20 16:24:00 crc kubenswrapper[4779]: I0320 16:24:00.147582 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-92t9d" Mar 20 16:24:00 crc kubenswrapper[4779]: I0320 16:24:00.149279 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:24:00 crc kubenswrapper[4779]: I0320 16:24:00.149566 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:24:00 crc kubenswrapper[4779]: I0320 16:24:00.152352 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567064-8775c"] Mar 20 16:24:00 crc kubenswrapper[4779]: I0320 16:24:00.196796 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrc5n\" (UniqueName: \"kubernetes.io/projected/00295003-7d10-44ad-a4a9-f77ad233896f-kube-api-access-rrc5n\") pod \"auto-csr-approver-29567064-8775c\" (UID: \"00295003-7d10-44ad-a4a9-f77ad233896f\") " pod="openshift-infra/auto-csr-approver-29567064-8775c" Mar 20 16:24:00 crc kubenswrapper[4779]: I0320 16:24:00.298424 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrc5n\" (UniqueName: \"kubernetes.io/projected/00295003-7d10-44ad-a4a9-f77ad233896f-kube-api-access-rrc5n\") pod \"auto-csr-approver-29567064-8775c\" (UID: \"00295003-7d10-44ad-a4a9-f77ad233896f\") " pod="openshift-infra/auto-csr-approver-29567064-8775c" Mar 20 16:24:00 crc kubenswrapper[4779]: I0320 16:24:00.319529 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrc5n\" (UniqueName: \"kubernetes.io/projected/00295003-7d10-44ad-a4a9-f77ad233896f-kube-api-access-rrc5n\") pod \"auto-csr-approver-29567064-8775c\" (UID: \"00295003-7d10-44ad-a4a9-f77ad233896f\") " 
pod="openshift-infra/auto-csr-approver-29567064-8775c" Mar 20 16:24:00 crc kubenswrapper[4779]: I0320 16:24:00.466397 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567064-8775c" Mar 20 16:24:00 crc kubenswrapper[4779]: I0320 16:24:00.924808 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567064-8775c"] Mar 20 16:24:01 crc kubenswrapper[4779]: I0320 16:24:01.868282 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567064-8775c" event={"ID":"00295003-7d10-44ad-a4a9-f77ad233896f","Type":"ContainerStarted","Data":"ee6c0d79f09dc795199dc105f37eaff69a3400c7b767a3fea19f3849572acbc1"} Mar 20 16:24:02 crc kubenswrapper[4779]: I0320 16:24:02.881730 4779 generic.go:334] "Generic (PLEG): container finished" podID="00295003-7d10-44ad-a4a9-f77ad233896f" containerID="2fea61cbcb277f267c820de5912eb2e153b9ab6348df4f067c4a083f76795624" exitCode=0 Mar 20 16:24:02 crc kubenswrapper[4779]: I0320 16:24:02.881821 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567064-8775c" event={"ID":"00295003-7d10-44ad-a4a9-f77ad233896f","Type":"ContainerDied","Data":"2fea61cbcb277f267c820de5912eb2e153b9ab6348df4f067c4a083f76795624"} Mar 20 16:24:04 crc kubenswrapper[4779]: I0320 16:24:04.331643 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567064-8775c" Mar 20 16:24:04 crc kubenswrapper[4779]: I0320 16:24:04.393770 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrc5n\" (UniqueName: \"kubernetes.io/projected/00295003-7d10-44ad-a4a9-f77ad233896f-kube-api-access-rrc5n\") pod \"00295003-7d10-44ad-a4a9-f77ad233896f\" (UID: \"00295003-7d10-44ad-a4a9-f77ad233896f\") " Mar 20 16:24:04 crc kubenswrapper[4779]: I0320 16:24:04.399643 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00295003-7d10-44ad-a4a9-f77ad233896f-kube-api-access-rrc5n" (OuterVolumeSpecName: "kube-api-access-rrc5n") pod "00295003-7d10-44ad-a4a9-f77ad233896f" (UID: "00295003-7d10-44ad-a4a9-f77ad233896f"). InnerVolumeSpecName "kube-api-access-rrc5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:24:04 crc kubenswrapper[4779]: I0320 16:24:04.497211 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrc5n\" (UniqueName: \"kubernetes.io/projected/00295003-7d10-44ad-a4a9-f77ad233896f-kube-api-access-rrc5n\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:04 crc kubenswrapper[4779]: I0320 16:24:04.909398 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567064-8775c" event={"ID":"00295003-7d10-44ad-a4a9-f77ad233896f","Type":"ContainerDied","Data":"ee6c0d79f09dc795199dc105f37eaff69a3400c7b767a3fea19f3849572acbc1"} Mar 20 16:24:04 crc kubenswrapper[4779]: I0320 16:24:04.909477 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee6c0d79f09dc795199dc105f37eaff69a3400c7b767a3fea19f3849572acbc1" Mar 20 16:24:04 crc kubenswrapper[4779]: I0320 16:24:04.909543 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567064-8775c" Mar 20 16:24:05 crc kubenswrapper[4779]: I0320 16:24:05.405489 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567058-8tgdn"] Mar 20 16:24:05 crc kubenswrapper[4779]: I0320 16:24:05.413978 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567058-8tgdn"] Mar 20 16:24:05 crc kubenswrapper[4779]: I0320 16:24:05.822073 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea1dfcb0-6dfd-45f1-babd-e3cb75577774" path="/var/lib/kubelet/pods/ea1dfcb0-6dfd-45f1-babd-e3cb75577774/volumes" Mar 20 16:24:45 crc kubenswrapper[4779]: I0320 16:24:45.024087 4779 scope.go:117] "RemoveContainer" containerID="1b23a40afc236e0483a4ee53b47415d85ce4c33381d22bacbb71add04cfe2b3c" Mar 20 16:24:55 crc kubenswrapper[4779]: I0320 16:24:55.149917 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:24:55 crc kubenswrapper[4779]: I0320 16:24:55.150539 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:25:25 crc kubenswrapper[4779]: I0320 16:25:25.150066 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:25:25 crc kubenswrapper[4779]: 
I0320 16:25:25.152225 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:25:55 crc kubenswrapper[4779]: I0320 16:25:55.150387 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:25:55 crc kubenswrapper[4779]: I0320 16:25:55.150918 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:25:55 crc kubenswrapper[4779]: I0320 16:25:55.150954 4779 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" Mar 20 16:25:55 crc kubenswrapper[4779]: I0320 16:25:55.151811 4779 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0d4a4381088e0c6e4439f94508ac137582cc59c538515a9138610168fa6ff57d"} pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 16:25:55 crc kubenswrapper[4779]: I0320 16:25:55.151870 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" 
containerName="machine-config-daemon" containerID="cri-o://0d4a4381088e0c6e4439f94508ac137582cc59c538515a9138610168fa6ff57d" gracePeriod=600 Mar 20 16:25:55 crc kubenswrapper[4779]: E0320 16:25:55.275662 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:25:55 crc kubenswrapper[4779]: I0320 16:25:55.988762 4779 generic.go:334] "Generic (PLEG): container finished" podID="451fc579-db57-4b36-a775-6d2986de3efc" containerID="0d4a4381088e0c6e4439f94508ac137582cc59c538515a9138610168fa6ff57d" exitCode=0 Mar 20 16:25:55 crc kubenswrapper[4779]: I0320 16:25:55.988807 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" event={"ID":"451fc579-db57-4b36-a775-6d2986de3efc","Type":"ContainerDied","Data":"0d4a4381088e0c6e4439f94508ac137582cc59c538515a9138610168fa6ff57d"} Mar 20 16:25:55 crc kubenswrapper[4779]: I0320 16:25:55.988845 4779 scope.go:117] "RemoveContainer" containerID="69f71fe86c460e6ddfed9470ce53b0c78d6258579872b569994718a9cc92ee8d" Mar 20 16:25:55 crc kubenswrapper[4779]: I0320 16:25:55.989612 4779 scope.go:117] "RemoveContainer" containerID="0d4a4381088e0c6e4439f94508ac137582cc59c538515a9138610168fa6ff57d" Mar 20 16:25:55 crc kubenswrapper[4779]: E0320 16:25:55.990062 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:26:00 crc kubenswrapper[4779]: I0320 16:26:00.152862 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567066-hcrr8"] Mar 20 16:26:00 crc kubenswrapper[4779]: E0320 16:26:00.163878 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00295003-7d10-44ad-a4a9-f77ad233896f" containerName="oc" Mar 20 16:26:00 crc kubenswrapper[4779]: I0320 16:26:00.163903 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="00295003-7d10-44ad-a4a9-f77ad233896f" containerName="oc" Mar 20 16:26:00 crc kubenswrapper[4779]: I0320 16:26:00.164159 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="00295003-7d10-44ad-a4a9-f77ad233896f" containerName="oc" Mar 20 16:26:00 crc kubenswrapper[4779]: I0320 16:26:00.164811 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567066-hcrr8"] Mar 20 16:26:00 crc kubenswrapper[4779]: I0320 16:26:00.164896 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567066-hcrr8" Mar 20 16:26:00 crc kubenswrapper[4779]: I0320 16:26:00.172075 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:26:00 crc kubenswrapper[4779]: I0320 16:26:00.172187 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-92t9d" Mar 20 16:26:00 crc kubenswrapper[4779]: I0320 16:26:00.172448 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:26:00 crc kubenswrapper[4779]: I0320 16:26:00.269286 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj5jm\" (UniqueName: \"kubernetes.io/projected/ab9d8c0b-d333-4ca9-a665-b50c361263ab-kube-api-access-lj5jm\") pod \"auto-csr-approver-29567066-hcrr8\" (UID: \"ab9d8c0b-d333-4ca9-a665-b50c361263ab\") " pod="openshift-infra/auto-csr-approver-29567066-hcrr8" Mar 20 16:26:00 crc kubenswrapper[4779]: I0320 16:26:00.372692 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj5jm\" (UniqueName: \"kubernetes.io/projected/ab9d8c0b-d333-4ca9-a665-b50c361263ab-kube-api-access-lj5jm\") pod \"auto-csr-approver-29567066-hcrr8\" (UID: \"ab9d8c0b-d333-4ca9-a665-b50c361263ab\") " pod="openshift-infra/auto-csr-approver-29567066-hcrr8" Mar 20 16:26:00 crc kubenswrapper[4779]: I0320 16:26:00.398199 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj5jm\" (UniqueName: \"kubernetes.io/projected/ab9d8c0b-d333-4ca9-a665-b50c361263ab-kube-api-access-lj5jm\") pod \"auto-csr-approver-29567066-hcrr8\" (UID: \"ab9d8c0b-d333-4ca9-a665-b50c361263ab\") " pod="openshift-infra/auto-csr-approver-29567066-hcrr8" Mar 20 16:26:00 crc kubenswrapper[4779]: I0320 16:26:00.484624 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567066-hcrr8" Mar 20 16:26:00 crc kubenswrapper[4779]: I0320 16:26:00.977073 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567066-hcrr8"] Mar 20 16:26:01 crc kubenswrapper[4779]: I0320 16:26:01.045390 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567066-hcrr8" event={"ID":"ab9d8c0b-d333-4ca9-a665-b50c361263ab","Type":"ContainerStarted","Data":"0802201dc473c9436329b289d04e1bc8c78dd04e43deed895fb4ee91656a7fed"} Mar 20 16:26:01 crc kubenswrapper[4779]: I0320 16:26:01.047270 4779 generic.go:334] "Generic (PLEG): container finished" podID="19f31371-34ad-444f-98e9-1cd99dbe6b24" containerID="5d4ba101a6482e04872e63355fa816358d04d94b9b3241bfd1a4371f899c4dad" exitCode=1 Mar 20 16:26:01 crc kubenswrapper[4779]: I0320 16:26:01.047301 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"19f31371-34ad-444f-98e9-1cd99dbe6b24","Type":"ContainerDied","Data":"5d4ba101a6482e04872e63355fa816358d04d94b9b3241bfd1a4371f899c4dad"} Mar 20 16:26:02 crc kubenswrapper[4779]: I0320 16:26:02.402872 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 20 16:26:02 crc kubenswrapper[4779]: I0320 16:26:02.410656 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19f31371-34ad-444f-98e9-1cd99dbe6b24-config-data\") pod \"19f31371-34ad-444f-98e9-1cd99dbe6b24\" (UID: \"19f31371-34ad-444f-98e9-1cd99dbe6b24\") " Mar 20 16:26:02 crc kubenswrapper[4779]: I0320 16:26:02.411124 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/19f31371-34ad-444f-98e9-1cd99dbe6b24-openstack-config\") pod \"19f31371-34ad-444f-98e9-1cd99dbe6b24\" (UID: \"19f31371-34ad-444f-98e9-1cd99dbe6b24\") " Mar 20 16:26:02 crc kubenswrapper[4779]: I0320 16:26:02.411211 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/19f31371-34ad-444f-98e9-1cd99dbe6b24-ca-certs\") pod \"19f31371-34ad-444f-98e9-1cd99dbe6b24\" (UID: \"19f31371-34ad-444f-98e9-1cd99dbe6b24\") " Mar 20 16:26:02 crc kubenswrapper[4779]: I0320 16:26:02.411259 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19f31371-34ad-444f-98e9-1cd99dbe6b24-ssh-key\") pod \"19f31371-34ad-444f-98e9-1cd99dbe6b24\" (UID: \"19f31371-34ad-444f-98e9-1cd99dbe6b24\") " Mar 20 16:26:02 crc kubenswrapper[4779]: I0320 16:26:02.411602 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19f31371-34ad-444f-98e9-1cd99dbe6b24-config-data" (OuterVolumeSpecName: "config-data") pod "19f31371-34ad-444f-98e9-1cd99dbe6b24" (UID: "19f31371-34ad-444f-98e9-1cd99dbe6b24"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:26:02 crc kubenswrapper[4779]: I0320 16:26:02.412350 4779 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19f31371-34ad-444f-98e9-1cd99dbe6b24-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:26:02 crc kubenswrapper[4779]: I0320 16:26:02.457165 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19f31371-34ad-444f-98e9-1cd99dbe6b24-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "19f31371-34ad-444f-98e9-1cd99dbe6b24" (UID: "19f31371-34ad-444f-98e9-1cd99dbe6b24"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:26:02 crc kubenswrapper[4779]: I0320 16:26:02.474530 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19f31371-34ad-444f-98e9-1cd99dbe6b24-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "19f31371-34ad-444f-98e9-1cd99dbe6b24" (UID: "19f31371-34ad-444f-98e9-1cd99dbe6b24"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:26:02 crc kubenswrapper[4779]: I0320 16:26:02.474886 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19f31371-34ad-444f-98e9-1cd99dbe6b24-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "19f31371-34ad-444f-98e9-1cd99dbe6b24" (UID: "19f31371-34ad-444f-98e9-1cd99dbe6b24"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:26:02 crc kubenswrapper[4779]: I0320 16:26:02.513291 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/19f31371-34ad-444f-98e9-1cd99dbe6b24-test-operator-ephemeral-workdir\") pod \"19f31371-34ad-444f-98e9-1cd99dbe6b24\" (UID: \"19f31371-34ad-444f-98e9-1cd99dbe6b24\") " Mar 20 16:26:02 crc kubenswrapper[4779]: I0320 16:26:02.513340 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/19f31371-34ad-444f-98e9-1cd99dbe6b24-openstack-config-secret\") pod \"19f31371-34ad-444f-98e9-1cd99dbe6b24\" (UID: \"19f31371-34ad-444f-98e9-1cd99dbe6b24\") " Mar 20 16:26:02 crc kubenswrapper[4779]: I0320 16:26:02.513377 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"19f31371-34ad-444f-98e9-1cd99dbe6b24\" (UID: \"19f31371-34ad-444f-98e9-1cd99dbe6b24\") " Mar 20 16:26:02 crc kubenswrapper[4779]: I0320 16:26:02.513486 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6p8q\" (UniqueName: \"kubernetes.io/projected/19f31371-34ad-444f-98e9-1cd99dbe6b24-kube-api-access-n6p8q\") pod \"19f31371-34ad-444f-98e9-1cd99dbe6b24\" (UID: \"19f31371-34ad-444f-98e9-1cd99dbe6b24\") " Mar 20 16:26:02 crc kubenswrapper[4779]: I0320 16:26:02.513508 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/19f31371-34ad-444f-98e9-1cd99dbe6b24-test-operator-ephemeral-temporary\") pod \"19f31371-34ad-444f-98e9-1cd99dbe6b24\" (UID: \"19f31371-34ad-444f-98e9-1cd99dbe6b24\") " Mar 20 16:26:02 crc kubenswrapper[4779]: I0320 16:26:02.513776 4779 reconciler_common.go:293] 
"Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/19f31371-34ad-444f-98e9-1cd99dbe6b24-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 20 16:26:02 crc kubenswrapper[4779]: I0320 16:26:02.513791 4779 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19f31371-34ad-444f-98e9-1cd99dbe6b24-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 20 16:26:02 crc kubenswrapper[4779]: I0320 16:26:02.513801 4779 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/19f31371-34ad-444f-98e9-1cd99dbe6b24-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:26:02 crc kubenswrapper[4779]: I0320 16:26:02.514287 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19f31371-34ad-444f-98e9-1cd99dbe6b24-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "19f31371-34ad-444f-98e9-1cd99dbe6b24" (UID: "19f31371-34ad-444f-98e9-1cd99dbe6b24"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:26:02 crc kubenswrapper[4779]: I0320 16:26:02.525741 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19f31371-34ad-444f-98e9-1cd99dbe6b24-kube-api-access-n6p8q" (OuterVolumeSpecName: "kube-api-access-n6p8q") pod "19f31371-34ad-444f-98e9-1cd99dbe6b24" (UID: "19f31371-34ad-444f-98e9-1cd99dbe6b24"). InnerVolumeSpecName "kube-api-access-n6p8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:26:02 crc kubenswrapper[4779]: I0320 16:26:02.526521 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "test-operator-logs") pod "19f31371-34ad-444f-98e9-1cd99dbe6b24" (UID: "19f31371-34ad-444f-98e9-1cd99dbe6b24"). 
InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 16:26:02 crc kubenswrapper[4779]: I0320 16:26:02.552835 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19f31371-34ad-444f-98e9-1cd99dbe6b24-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "19f31371-34ad-444f-98e9-1cd99dbe6b24" (UID: "19f31371-34ad-444f-98e9-1cd99dbe6b24"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:26:02 crc kubenswrapper[4779]: I0320 16:26:02.586413 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19f31371-34ad-444f-98e9-1cd99dbe6b24-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "19f31371-34ad-444f-98e9-1cd99dbe6b24" (UID: "19f31371-34ad-444f-98e9-1cd99dbe6b24"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:26:02 crc kubenswrapper[4779]: I0320 16:26:02.615358 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6p8q\" (UniqueName: \"kubernetes.io/projected/19f31371-34ad-444f-98e9-1cd99dbe6b24-kube-api-access-n6p8q\") on node \"crc\" DevicePath \"\"" Mar 20 16:26:02 crc kubenswrapper[4779]: I0320 16:26:02.615400 4779 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/19f31371-34ad-444f-98e9-1cd99dbe6b24-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 20 16:26:02 crc kubenswrapper[4779]: I0320 16:26:02.615418 4779 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/19f31371-34ad-444f-98e9-1cd99dbe6b24-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 20 16:26:02 crc kubenswrapper[4779]: I0320 16:26:02.615429 4779 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/19f31371-34ad-444f-98e9-1cd99dbe6b24-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 20 16:26:02 crc kubenswrapper[4779]: I0320 16:26:02.615465 4779 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 20 16:26:02 crc kubenswrapper[4779]: I0320 16:26:02.635133 4779 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 20 16:26:02 crc kubenswrapper[4779]: I0320 16:26:02.717939 4779 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Mar 20 16:26:03 crc kubenswrapper[4779]: 
I0320 16:26:03.070556 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"19f31371-34ad-444f-98e9-1cd99dbe6b24","Type":"ContainerDied","Data":"09ba77b763ee18b7dfaeaa5bb088ddd34639878195494a2002ab0d4119961ee7"} Mar 20 16:26:03 crc kubenswrapper[4779]: I0320 16:26:03.070731 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09ba77b763ee18b7dfaeaa5bb088ddd34639878195494a2002ab0d4119961ee7" Mar 20 16:26:03 crc kubenswrapper[4779]: I0320 16:26:03.070616 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 20 16:26:03 crc kubenswrapper[4779]: I0320 16:26:03.073040 4779 generic.go:334] "Generic (PLEG): container finished" podID="ab9d8c0b-d333-4ca9-a665-b50c361263ab" containerID="c1de01b0b87f5a208b53f1fc764b952ef0fefc9ba06a9c47d898a6d2b9284567" exitCode=0 Mar 20 16:26:03 crc kubenswrapper[4779]: I0320 16:26:03.073319 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567066-hcrr8" event={"ID":"ab9d8c0b-d333-4ca9-a665-b50c361263ab","Type":"ContainerDied","Data":"c1de01b0b87f5a208b53f1fc764b952ef0fefc9ba06a9c47d898a6d2b9284567"} Mar 20 16:26:04 crc kubenswrapper[4779]: I0320 16:26:04.399197 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567066-hcrr8" Mar 20 16:26:04 crc kubenswrapper[4779]: I0320 16:26:04.552588 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lj5jm\" (UniqueName: \"kubernetes.io/projected/ab9d8c0b-d333-4ca9-a665-b50c361263ab-kube-api-access-lj5jm\") pod \"ab9d8c0b-d333-4ca9-a665-b50c361263ab\" (UID: \"ab9d8c0b-d333-4ca9-a665-b50c361263ab\") " Mar 20 16:26:04 crc kubenswrapper[4779]: I0320 16:26:04.559955 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab9d8c0b-d333-4ca9-a665-b50c361263ab-kube-api-access-lj5jm" (OuterVolumeSpecName: "kube-api-access-lj5jm") pod "ab9d8c0b-d333-4ca9-a665-b50c361263ab" (UID: "ab9d8c0b-d333-4ca9-a665-b50c361263ab"). InnerVolumeSpecName "kube-api-access-lj5jm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:26:04 crc kubenswrapper[4779]: I0320 16:26:04.655696 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lj5jm\" (UniqueName: \"kubernetes.io/projected/ab9d8c0b-d333-4ca9-a665-b50c361263ab-kube-api-access-lj5jm\") on node \"crc\" DevicePath \"\"" Mar 20 16:26:05 crc kubenswrapper[4779]: I0320 16:26:05.090785 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567066-hcrr8" event={"ID":"ab9d8c0b-d333-4ca9-a665-b50c361263ab","Type":"ContainerDied","Data":"0802201dc473c9436329b289d04e1bc8c78dd04e43deed895fb4ee91656a7fed"} Mar 20 16:26:05 crc kubenswrapper[4779]: I0320 16:26:05.090837 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0802201dc473c9436329b289d04e1bc8c78dd04e43deed895fb4ee91656a7fed" Mar 20 16:26:05 crc kubenswrapper[4779]: I0320 16:26:05.090967 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567066-hcrr8" Mar 20 16:26:05 crc kubenswrapper[4779]: I0320 16:26:05.468867 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567060-plhzb"] Mar 20 16:26:05 crc kubenswrapper[4779]: I0320 16:26:05.478147 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567060-plhzb"] Mar 20 16:26:05 crc kubenswrapper[4779]: I0320 16:26:05.820544 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7bca85e-0c5f-40fa-9d27-291671d6ca7f" path="/var/lib/kubelet/pods/f7bca85e-0c5f-40fa-9d27-291671d6ca7f/volumes" Mar 20 16:26:09 crc kubenswrapper[4779]: I0320 16:26:09.808974 4779 scope.go:117] "RemoveContainer" containerID="0d4a4381088e0c6e4439f94508ac137582cc59c538515a9138610168fa6ff57d" Mar 20 16:26:09 crc kubenswrapper[4779]: E0320 16:26:09.809726 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:26:12 crc kubenswrapper[4779]: I0320 16:26:12.855286 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 20 16:26:12 crc kubenswrapper[4779]: E0320 16:26:12.856237 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab9d8c0b-d333-4ca9-a665-b50c361263ab" containerName="oc" Mar 20 16:26:12 crc kubenswrapper[4779]: I0320 16:26:12.856250 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab9d8c0b-d333-4ca9-a665-b50c361263ab" containerName="oc" Mar 20 16:26:12 crc kubenswrapper[4779]: E0320 16:26:12.856267 4779 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="19f31371-34ad-444f-98e9-1cd99dbe6b24" containerName="tempest-tests-tempest-tests-runner" Mar 20 16:26:12 crc kubenswrapper[4779]: I0320 16:26:12.856275 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="19f31371-34ad-444f-98e9-1cd99dbe6b24" containerName="tempest-tests-tempest-tests-runner" Mar 20 16:26:12 crc kubenswrapper[4779]: I0320 16:26:12.856509 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="19f31371-34ad-444f-98e9-1cd99dbe6b24" containerName="tempest-tests-tempest-tests-runner" Mar 20 16:26:12 crc kubenswrapper[4779]: I0320 16:26:12.856538 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab9d8c0b-d333-4ca9-a665-b50c361263ab" containerName="oc" Mar 20 16:26:12 crc kubenswrapper[4779]: I0320 16:26:12.857222 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 16:26:12 crc kubenswrapper[4779]: I0320 16:26:12.859177 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-tkd2f" Mar 20 16:26:12 crc kubenswrapper[4779]: I0320 16:26:12.865439 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 20 16:26:13 crc kubenswrapper[4779]: I0320 16:26:13.019803 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgwfd\" (UniqueName: \"kubernetes.io/projected/4ca8f9c7-d49d-4a13-8b2d-99c06523ce13-kube-api-access-bgwfd\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4ca8f9c7-d49d-4a13-8b2d-99c06523ce13\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 16:26:13 crc kubenswrapper[4779]: I0320 16:26:13.019938 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4ca8f9c7-d49d-4a13-8b2d-99c06523ce13\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 16:26:13 crc kubenswrapper[4779]: I0320 16:26:13.121724 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgwfd\" (UniqueName: \"kubernetes.io/projected/4ca8f9c7-d49d-4a13-8b2d-99c06523ce13-kube-api-access-bgwfd\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4ca8f9c7-d49d-4a13-8b2d-99c06523ce13\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 16:26:13 crc kubenswrapper[4779]: I0320 16:26:13.121796 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4ca8f9c7-d49d-4a13-8b2d-99c06523ce13\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 16:26:13 crc kubenswrapper[4779]: I0320 16:26:13.122360 4779 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4ca8f9c7-d49d-4a13-8b2d-99c06523ce13\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 16:26:13 crc kubenswrapper[4779]: I0320 16:26:13.140100 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgwfd\" (UniqueName: \"kubernetes.io/projected/4ca8f9c7-d49d-4a13-8b2d-99c06523ce13-kube-api-access-bgwfd\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4ca8f9c7-d49d-4a13-8b2d-99c06523ce13\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 
16:26:13 crc kubenswrapper[4779]: I0320 16:26:13.148762 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4ca8f9c7-d49d-4a13-8b2d-99c06523ce13\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 16:26:13 crc kubenswrapper[4779]: I0320 16:26:13.182373 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 16:26:13 crc kubenswrapper[4779]: I0320 16:26:13.662496 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 20 16:26:14 crc kubenswrapper[4779]: I0320 16:26:14.164987 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"4ca8f9c7-d49d-4a13-8b2d-99c06523ce13","Type":"ContainerStarted","Data":"c9c411598e8b4df6083e4cd20f6891362bf124276489567f01111be8c8bf88aa"} Mar 20 16:26:15 crc kubenswrapper[4779]: I0320 16:26:15.175058 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"4ca8f9c7-d49d-4a13-8b2d-99c06523ce13","Type":"ContainerStarted","Data":"0760c8d20ecea09f702c0bd7f23492250a13ed3ec3491082e65ad26d7c828bc2"} Mar 20 16:26:15 crc kubenswrapper[4779]: I0320 16:26:15.189743 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.253763946 podStartE2EDuration="3.189727943s" podCreationTimestamp="2026-03-20 16:26:12 +0000 UTC" firstStartedPulling="2026-03-20 16:26:13.663430485 +0000 UTC m=+3790.625946315" lastFinishedPulling="2026-03-20 16:26:14.599394512 +0000 UTC m=+3791.561910312" observedRunningTime="2026-03-20 
16:26:15.188475302 +0000 UTC m=+3792.150991102" watchObservedRunningTime="2026-03-20 16:26:15.189727943 +0000 UTC m=+3792.152243743" Mar 20 16:26:22 crc kubenswrapper[4779]: I0320 16:26:22.808860 4779 scope.go:117] "RemoveContainer" containerID="0d4a4381088e0c6e4439f94508ac137582cc59c538515a9138610168fa6ff57d" Mar 20 16:26:22 crc kubenswrapper[4779]: E0320 16:26:22.809672 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:26:34 crc kubenswrapper[4779]: I0320 16:26:34.809617 4779 scope.go:117] "RemoveContainer" containerID="0d4a4381088e0c6e4439f94508ac137582cc59c538515a9138610168fa6ff57d" Mar 20 16:26:34 crc kubenswrapper[4779]: E0320 16:26:34.810933 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:26:45 crc kubenswrapper[4779]: I0320 16:26:45.120766 4779 scope.go:117] "RemoveContainer" containerID="fae44409707f43a0ddb78f7126a653009e991b10ad77afa4c9754fc964db7ac5" Mar 20 16:26:45 crc kubenswrapper[4779]: I0320 16:26:45.811577 4779 scope.go:117] "RemoveContainer" containerID="0d4a4381088e0c6e4439f94508ac137582cc59c538515a9138610168fa6ff57d" Mar 20 16:26:45 crc kubenswrapper[4779]: E0320 16:26:45.812164 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:26:56 crc kubenswrapper[4779]: I0320 16:26:56.809755 4779 scope.go:117] "RemoveContainer" containerID="0d4a4381088e0c6e4439f94508ac137582cc59c538515a9138610168fa6ff57d" Mar 20 16:26:56 crc kubenswrapper[4779]: E0320 16:26:56.810534 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:27:06 crc kubenswrapper[4779]: I0320 16:27:06.375536 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7ggbl/must-gather-2bwvn"] Mar 20 16:27:06 crc kubenswrapper[4779]: I0320 16:27:06.377843 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7ggbl/must-gather-2bwvn" Mar 20 16:27:06 crc kubenswrapper[4779]: I0320 16:27:06.379774 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7ggbl"/"kube-root-ca.crt" Mar 20 16:27:06 crc kubenswrapper[4779]: I0320 16:27:06.379987 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7ggbl"/"openshift-service-ca.crt" Mar 20 16:27:06 crc kubenswrapper[4779]: I0320 16:27:06.385355 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-7ggbl"/"default-dockercfg-svht8" Mar 20 16:27:06 crc kubenswrapper[4779]: I0320 16:27:06.399414 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7ggbl/must-gather-2bwvn"] Mar 20 16:27:06 crc kubenswrapper[4779]: I0320 16:27:06.543633 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c-must-gather-output\") pod \"must-gather-2bwvn\" (UID: \"9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c\") " pod="openshift-must-gather-7ggbl/must-gather-2bwvn" Mar 20 16:27:06 crc kubenswrapper[4779]: I0320 16:27:06.543753 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr98k\" (UniqueName: \"kubernetes.io/projected/9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c-kube-api-access-sr98k\") pod \"must-gather-2bwvn\" (UID: \"9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c\") " pod="openshift-must-gather-7ggbl/must-gather-2bwvn" Mar 20 16:27:06 crc kubenswrapper[4779]: I0320 16:27:06.646978 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c-must-gather-output\") pod \"must-gather-2bwvn\" (UID: \"9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c\") " 
pod="openshift-must-gather-7ggbl/must-gather-2bwvn" Mar 20 16:27:06 crc kubenswrapper[4779]: I0320 16:27:06.647162 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr98k\" (UniqueName: \"kubernetes.io/projected/9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c-kube-api-access-sr98k\") pod \"must-gather-2bwvn\" (UID: \"9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c\") " pod="openshift-must-gather-7ggbl/must-gather-2bwvn" Mar 20 16:27:06 crc kubenswrapper[4779]: I0320 16:27:06.647541 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c-must-gather-output\") pod \"must-gather-2bwvn\" (UID: \"9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c\") " pod="openshift-must-gather-7ggbl/must-gather-2bwvn" Mar 20 16:27:06 crc kubenswrapper[4779]: I0320 16:27:06.669929 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr98k\" (UniqueName: \"kubernetes.io/projected/9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c-kube-api-access-sr98k\") pod \"must-gather-2bwvn\" (UID: \"9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c\") " pod="openshift-must-gather-7ggbl/must-gather-2bwvn" Mar 20 16:27:06 crc kubenswrapper[4779]: I0320 16:27:06.694690 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7ggbl/must-gather-2bwvn" Mar 20 16:27:07 crc kubenswrapper[4779]: I0320 16:27:07.187660 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7ggbl/must-gather-2bwvn"] Mar 20 16:27:07 crc kubenswrapper[4779]: I0320 16:27:07.189274 4779 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 16:27:07 crc kubenswrapper[4779]: I0320 16:27:07.681718 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7ggbl/must-gather-2bwvn" event={"ID":"9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c","Type":"ContainerStarted","Data":"85b04b8974c12ed77ce842f724fa34fb5cf2a99cfdb11517e7ffa0671742b9d7"} Mar 20 16:27:09 crc kubenswrapper[4779]: I0320 16:27:09.809234 4779 scope.go:117] "RemoveContainer" containerID="0d4a4381088e0c6e4439f94508ac137582cc59c538515a9138610168fa6ff57d" Mar 20 16:27:09 crc kubenswrapper[4779]: E0320 16:27:09.809758 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:27:11 crc kubenswrapper[4779]: I0320 16:27:11.734065 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7ggbl/must-gather-2bwvn" event={"ID":"9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c","Type":"ContainerStarted","Data":"737aed034886a78b00cdcf32aed770952f445adb6dbe6e5836e16f37266b1d06"} Mar 20 16:27:11 crc kubenswrapper[4779]: I0320 16:27:11.734725 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7ggbl/must-gather-2bwvn" 
event={"ID":"9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c","Type":"ContainerStarted","Data":"fe4cad63e5b135c7b584bc228aecaada01604dfefe621edb03e7b3c572776615"} Mar 20 16:27:11 crc kubenswrapper[4779]: I0320 16:27:11.757216 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7ggbl/must-gather-2bwvn" podStartSLOduration=2.202368759 podStartE2EDuration="5.757193886s" podCreationTimestamp="2026-03-20 16:27:06 +0000 UTC" firstStartedPulling="2026-03-20 16:27:07.18907394 +0000 UTC m=+3844.151589730" lastFinishedPulling="2026-03-20 16:27:10.743899057 +0000 UTC m=+3847.706414857" observedRunningTime="2026-03-20 16:27:11.747315646 +0000 UTC m=+3848.709831476" watchObservedRunningTime="2026-03-20 16:27:11.757193886 +0000 UTC m=+3848.719709686" Mar 20 16:27:14 crc kubenswrapper[4779]: I0320 16:27:14.801321 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7ggbl/crc-debug-ghgkz"] Mar 20 16:27:14 crc kubenswrapper[4779]: I0320 16:27:14.803668 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7ggbl/crc-debug-ghgkz" Mar 20 16:27:14 crc kubenswrapper[4779]: I0320 16:27:14.924487 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k69vp\" (UniqueName: \"kubernetes.io/projected/8a73d031-8c9d-49ea-ab0f-fe7ce39a2f11-kube-api-access-k69vp\") pod \"crc-debug-ghgkz\" (UID: \"8a73d031-8c9d-49ea-ab0f-fe7ce39a2f11\") " pod="openshift-must-gather-7ggbl/crc-debug-ghgkz" Mar 20 16:27:14 crc kubenswrapper[4779]: I0320 16:27:14.924637 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a73d031-8c9d-49ea-ab0f-fe7ce39a2f11-host\") pod \"crc-debug-ghgkz\" (UID: \"8a73d031-8c9d-49ea-ab0f-fe7ce39a2f11\") " pod="openshift-must-gather-7ggbl/crc-debug-ghgkz" Mar 20 16:27:15 crc kubenswrapper[4779]: I0320 16:27:15.026816 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k69vp\" (UniqueName: \"kubernetes.io/projected/8a73d031-8c9d-49ea-ab0f-fe7ce39a2f11-kube-api-access-k69vp\") pod \"crc-debug-ghgkz\" (UID: \"8a73d031-8c9d-49ea-ab0f-fe7ce39a2f11\") " pod="openshift-must-gather-7ggbl/crc-debug-ghgkz" Mar 20 16:27:15 crc kubenswrapper[4779]: I0320 16:27:15.027061 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a73d031-8c9d-49ea-ab0f-fe7ce39a2f11-host\") pod \"crc-debug-ghgkz\" (UID: \"8a73d031-8c9d-49ea-ab0f-fe7ce39a2f11\") " pod="openshift-must-gather-7ggbl/crc-debug-ghgkz" Mar 20 16:27:15 crc kubenswrapper[4779]: I0320 16:27:15.027299 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a73d031-8c9d-49ea-ab0f-fe7ce39a2f11-host\") pod \"crc-debug-ghgkz\" (UID: \"8a73d031-8c9d-49ea-ab0f-fe7ce39a2f11\") " pod="openshift-must-gather-7ggbl/crc-debug-ghgkz" Mar 20 16:27:15 crc 
kubenswrapper[4779]: I0320 16:27:15.049094 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k69vp\" (UniqueName: \"kubernetes.io/projected/8a73d031-8c9d-49ea-ab0f-fe7ce39a2f11-kube-api-access-k69vp\") pod \"crc-debug-ghgkz\" (UID: \"8a73d031-8c9d-49ea-ab0f-fe7ce39a2f11\") " pod="openshift-must-gather-7ggbl/crc-debug-ghgkz" Mar 20 16:27:15 crc kubenswrapper[4779]: I0320 16:27:15.127345 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7ggbl/crc-debug-ghgkz" Mar 20 16:27:15 crc kubenswrapper[4779]: I0320 16:27:15.765072 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7ggbl/crc-debug-ghgkz" event={"ID":"8a73d031-8c9d-49ea-ab0f-fe7ce39a2f11","Type":"ContainerStarted","Data":"257ffc590720b9ad9c84f55a5dfa71d0c42c17e04008aa91d928eaf73861e033"} Mar 20 16:27:24 crc kubenswrapper[4779]: I0320 16:27:24.152075 4779 scope.go:117] "RemoveContainer" containerID="0d4a4381088e0c6e4439f94508ac137582cc59c538515a9138610168fa6ff57d" Mar 20 16:27:24 crc kubenswrapper[4779]: E0320 16:27:24.159379 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:27:26 crc kubenswrapper[4779]: I0320 16:27:26.248387 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7ggbl/crc-debug-ghgkz" event={"ID":"8a73d031-8c9d-49ea-ab0f-fe7ce39a2f11","Type":"ContainerStarted","Data":"dc2c481eedb0e438cd366e28499039e3d402d9a3291b8b26359084f35e94ce59"} Mar 20 16:27:26 crc kubenswrapper[4779]: I0320 16:27:26.277885 4779 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-must-gather-7ggbl/crc-debug-ghgkz" podStartSLOduration=1.6821353669999999 podStartE2EDuration="12.277866894s" podCreationTimestamp="2026-03-20 16:27:14 +0000 UTC" firstStartedPulling="2026-03-20 16:27:15.164262714 +0000 UTC m=+3852.126778514" lastFinishedPulling="2026-03-20 16:27:25.759994241 +0000 UTC m=+3862.722510041" observedRunningTime="2026-03-20 16:27:26.26747622 +0000 UTC m=+3863.229992020" watchObservedRunningTime="2026-03-20 16:27:26.277866894 +0000 UTC m=+3863.240382684" Mar 20 16:27:39 crc kubenswrapper[4779]: I0320 16:27:39.809187 4779 scope.go:117] "RemoveContainer" containerID="0d4a4381088e0c6e4439f94508ac137582cc59c538515a9138610168fa6ff57d" Mar 20 16:27:39 crc kubenswrapper[4779]: E0320 16:27:39.810343 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:27:52 crc kubenswrapper[4779]: I0320 16:27:52.808432 4779 scope.go:117] "RemoveContainer" containerID="0d4a4381088e0c6e4439f94508ac137582cc59c538515a9138610168fa6ff57d" Mar 20 16:27:52 crc kubenswrapper[4779]: E0320 16:27:52.809305 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:28:00 crc kubenswrapper[4779]: I0320 16:28:00.160959 4779 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29567068-ztzgc"] Mar 20 16:28:00 crc kubenswrapper[4779]: I0320 16:28:00.163102 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567068-ztzgc" Mar 20 16:28:00 crc kubenswrapper[4779]: I0320 16:28:00.170753 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-92t9d" Mar 20 16:28:00 crc kubenswrapper[4779]: I0320 16:28:00.171144 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:28:00 crc kubenswrapper[4779]: I0320 16:28:00.171365 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:28:00 crc kubenswrapper[4779]: I0320 16:28:00.174189 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567068-ztzgc"] Mar 20 16:28:00 crc kubenswrapper[4779]: I0320 16:28:00.292698 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-756sr\" (UniqueName: \"kubernetes.io/projected/4338ac48-bf07-407c-a6fa-2826b7ff51cf-kube-api-access-756sr\") pod \"auto-csr-approver-29567068-ztzgc\" (UID: \"4338ac48-bf07-407c-a6fa-2826b7ff51cf\") " pod="openshift-infra/auto-csr-approver-29567068-ztzgc" Mar 20 16:28:00 crc kubenswrapper[4779]: I0320 16:28:00.394855 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-756sr\" (UniqueName: \"kubernetes.io/projected/4338ac48-bf07-407c-a6fa-2826b7ff51cf-kube-api-access-756sr\") pod \"auto-csr-approver-29567068-ztzgc\" (UID: \"4338ac48-bf07-407c-a6fa-2826b7ff51cf\") " pod="openshift-infra/auto-csr-approver-29567068-ztzgc" Mar 20 16:28:00 crc kubenswrapper[4779]: I0320 16:28:00.413152 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-756sr\" (UniqueName: 
\"kubernetes.io/projected/4338ac48-bf07-407c-a6fa-2826b7ff51cf-kube-api-access-756sr\") pod \"auto-csr-approver-29567068-ztzgc\" (UID: \"4338ac48-bf07-407c-a6fa-2826b7ff51cf\") " pod="openshift-infra/auto-csr-approver-29567068-ztzgc" Mar 20 16:28:00 crc kubenswrapper[4779]: I0320 16:28:00.502032 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567068-ztzgc" Mar 20 16:28:00 crc kubenswrapper[4779]: I0320 16:28:00.831853 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567068-ztzgc"] Mar 20 16:28:01 crc kubenswrapper[4779]: I0320 16:28:01.560246 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567068-ztzgc" event={"ID":"4338ac48-bf07-407c-a6fa-2826b7ff51cf","Type":"ContainerStarted","Data":"27e57abdbf1846d5437d4db9c44cf6b7acea0168cb341836a0563330b303da41"} Mar 20 16:28:02 crc kubenswrapper[4779]: I0320 16:28:02.570318 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567068-ztzgc" event={"ID":"4338ac48-bf07-407c-a6fa-2826b7ff51cf","Type":"ContainerStarted","Data":"86f451a0e6d9046e96eb3a71046429c635c0b90e42f88e777c9cf6383ce1fb0c"} Mar 20 16:28:02 crc kubenswrapper[4779]: I0320 16:28:02.588520 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567068-ztzgc" podStartSLOduration=1.214575852 podStartE2EDuration="2.588501693s" podCreationTimestamp="2026-03-20 16:28:00 +0000 UTC" firstStartedPulling="2026-03-20 16:28:00.876740944 +0000 UTC m=+3897.839256744" lastFinishedPulling="2026-03-20 16:28:02.250666785 +0000 UTC m=+3899.213182585" observedRunningTime="2026-03-20 16:28:02.586374941 +0000 UTC m=+3899.548890741" watchObservedRunningTime="2026-03-20 16:28:02.588501693 +0000 UTC m=+3899.551017493" Mar 20 16:28:03 crc kubenswrapper[4779]: I0320 16:28:03.581384 4779 generic.go:334] "Generic (PLEG): container 
finished" podID="4338ac48-bf07-407c-a6fa-2826b7ff51cf" containerID="86f451a0e6d9046e96eb3a71046429c635c0b90e42f88e777c9cf6383ce1fb0c" exitCode=0 Mar 20 16:28:03 crc kubenswrapper[4779]: I0320 16:28:03.581471 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567068-ztzgc" event={"ID":"4338ac48-bf07-407c-a6fa-2826b7ff51cf","Type":"ContainerDied","Data":"86f451a0e6d9046e96eb3a71046429c635c0b90e42f88e777c9cf6383ce1fb0c"} Mar 20 16:28:04 crc kubenswrapper[4779]: I0320 16:28:04.809887 4779 scope.go:117] "RemoveContainer" containerID="0d4a4381088e0c6e4439f94508ac137582cc59c538515a9138610168fa6ff57d" Mar 20 16:28:04 crc kubenswrapper[4779]: E0320 16:28:04.810469 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:28:04 crc kubenswrapper[4779]: I0320 16:28:04.978284 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567068-ztzgc" Mar 20 16:28:05 crc kubenswrapper[4779]: I0320 16:28:05.102514 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-756sr\" (UniqueName: \"kubernetes.io/projected/4338ac48-bf07-407c-a6fa-2826b7ff51cf-kube-api-access-756sr\") pod \"4338ac48-bf07-407c-a6fa-2826b7ff51cf\" (UID: \"4338ac48-bf07-407c-a6fa-2826b7ff51cf\") " Mar 20 16:28:05 crc kubenswrapper[4779]: I0320 16:28:05.121296 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4338ac48-bf07-407c-a6fa-2826b7ff51cf-kube-api-access-756sr" (OuterVolumeSpecName: "kube-api-access-756sr") pod "4338ac48-bf07-407c-a6fa-2826b7ff51cf" (UID: "4338ac48-bf07-407c-a6fa-2826b7ff51cf"). InnerVolumeSpecName "kube-api-access-756sr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:28:05 crc kubenswrapper[4779]: I0320 16:28:05.204642 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-756sr\" (UniqueName: \"kubernetes.io/projected/4338ac48-bf07-407c-a6fa-2826b7ff51cf-kube-api-access-756sr\") on node \"crc\" DevicePath \"\"" Mar 20 16:28:05 crc kubenswrapper[4779]: I0320 16:28:05.602015 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567068-ztzgc" event={"ID":"4338ac48-bf07-407c-a6fa-2826b7ff51cf","Type":"ContainerDied","Data":"27e57abdbf1846d5437d4db9c44cf6b7acea0168cb341836a0563330b303da41"} Mar 20 16:28:05 crc kubenswrapper[4779]: I0320 16:28:05.602292 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27e57abdbf1846d5437d4db9c44cf6b7acea0168cb341836a0563330b303da41" Mar 20 16:28:05 crc kubenswrapper[4779]: I0320 16:28:05.602341 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567068-ztzgc" Mar 20 16:28:05 crc kubenswrapper[4779]: I0320 16:28:05.661911 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567062-mcj2h"] Mar 20 16:28:05 crc kubenswrapper[4779]: I0320 16:28:05.670213 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567062-mcj2h"] Mar 20 16:28:05 crc kubenswrapper[4779]: I0320 16:28:05.821013 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b681c5dc-c089-4921-b334-1817f709ca6f" path="/var/lib/kubelet/pods/b681c5dc-c089-4921-b334-1817f709ca6f/volumes" Mar 20 16:28:07 crc kubenswrapper[4779]: I0320 16:28:07.620919 4779 generic.go:334] "Generic (PLEG): container finished" podID="8a73d031-8c9d-49ea-ab0f-fe7ce39a2f11" containerID="dc2c481eedb0e438cd366e28499039e3d402d9a3291b8b26359084f35e94ce59" exitCode=0 Mar 20 16:28:07 crc kubenswrapper[4779]: I0320 16:28:07.621889 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7ggbl/crc-debug-ghgkz" event={"ID":"8a73d031-8c9d-49ea-ab0f-fe7ce39a2f11","Type":"ContainerDied","Data":"dc2c481eedb0e438cd366e28499039e3d402d9a3291b8b26359084f35e94ce59"} Mar 20 16:28:08 crc kubenswrapper[4779]: I0320 16:28:08.751540 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7ggbl/crc-debug-ghgkz" Mar 20 16:28:08 crc kubenswrapper[4779]: I0320 16:28:08.789701 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7ggbl/crc-debug-ghgkz"] Mar 20 16:28:08 crc kubenswrapper[4779]: I0320 16:28:08.800790 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7ggbl/crc-debug-ghgkz"] Mar 20 16:28:08 crc kubenswrapper[4779]: I0320 16:28:08.887760 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k69vp\" (UniqueName: \"kubernetes.io/projected/8a73d031-8c9d-49ea-ab0f-fe7ce39a2f11-kube-api-access-k69vp\") pod \"8a73d031-8c9d-49ea-ab0f-fe7ce39a2f11\" (UID: \"8a73d031-8c9d-49ea-ab0f-fe7ce39a2f11\") " Mar 20 16:28:08 crc kubenswrapper[4779]: I0320 16:28:08.887918 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a73d031-8c9d-49ea-ab0f-fe7ce39a2f11-host\") pod \"8a73d031-8c9d-49ea-ab0f-fe7ce39a2f11\" (UID: \"8a73d031-8c9d-49ea-ab0f-fe7ce39a2f11\") " Mar 20 16:28:08 crc kubenswrapper[4779]: I0320 16:28:08.888035 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a73d031-8c9d-49ea-ab0f-fe7ce39a2f11-host" (OuterVolumeSpecName: "host") pod "8a73d031-8c9d-49ea-ab0f-fe7ce39a2f11" (UID: "8a73d031-8c9d-49ea-ab0f-fe7ce39a2f11"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:28:08 crc kubenswrapper[4779]: I0320 16:28:08.888430 4779 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a73d031-8c9d-49ea-ab0f-fe7ce39a2f11-host\") on node \"crc\" DevicePath \"\"" Mar 20 16:28:08 crc kubenswrapper[4779]: I0320 16:28:08.894059 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a73d031-8c9d-49ea-ab0f-fe7ce39a2f11-kube-api-access-k69vp" (OuterVolumeSpecName: "kube-api-access-k69vp") pod "8a73d031-8c9d-49ea-ab0f-fe7ce39a2f11" (UID: "8a73d031-8c9d-49ea-ab0f-fe7ce39a2f11"). InnerVolumeSpecName "kube-api-access-k69vp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:28:08 crc kubenswrapper[4779]: I0320 16:28:08.990340 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k69vp\" (UniqueName: \"kubernetes.io/projected/8a73d031-8c9d-49ea-ab0f-fe7ce39a2f11-kube-api-access-k69vp\") on node \"crc\" DevicePath \"\"" Mar 20 16:28:09 crc kubenswrapper[4779]: I0320 16:28:09.639057 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="257ffc590720b9ad9c84f55a5dfa71d0c42c17e04008aa91d928eaf73861e033" Mar 20 16:28:09 crc kubenswrapper[4779]: I0320 16:28:09.639123 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7ggbl/crc-debug-ghgkz" Mar 20 16:28:09 crc kubenswrapper[4779]: I0320 16:28:09.819295 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a73d031-8c9d-49ea-ab0f-fe7ce39a2f11" path="/var/lib/kubelet/pods/8a73d031-8c9d-49ea-ab0f-fe7ce39a2f11/volumes" Mar 20 16:28:09 crc kubenswrapper[4779]: I0320 16:28:09.966244 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7ggbl/crc-debug-qzthr"] Mar 20 16:28:09 crc kubenswrapper[4779]: E0320 16:28:09.966622 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4338ac48-bf07-407c-a6fa-2826b7ff51cf" containerName="oc" Mar 20 16:28:09 crc kubenswrapper[4779]: I0320 16:28:09.966642 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="4338ac48-bf07-407c-a6fa-2826b7ff51cf" containerName="oc" Mar 20 16:28:09 crc kubenswrapper[4779]: E0320 16:28:09.966661 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a73d031-8c9d-49ea-ab0f-fe7ce39a2f11" containerName="container-00" Mar 20 16:28:09 crc kubenswrapper[4779]: I0320 16:28:09.966667 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a73d031-8c9d-49ea-ab0f-fe7ce39a2f11" containerName="container-00" Mar 20 16:28:09 crc kubenswrapper[4779]: I0320 16:28:09.966854 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a73d031-8c9d-49ea-ab0f-fe7ce39a2f11" containerName="container-00" Mar 20 16:28:09 crc kubenswrapper[4779]: I0320 16:28:09.966872 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="4338ac48-bf07-407c-a6fa-2826b7ff51cf" containerName="oc" Mar 20 16:28:09 crc kubenswrapper[4779]: I0320 16:28:09.967489 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7ggbl/crc-debug-qzthr" Mar 20 16:28:10 crc kubenswrapper[4779]: I0320 16:28:10.112791 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zrdl\" (UniqueName: \"kubernetes.io/projected/57f8a675-80b0-450c-9aa2-8735b9e5520f-kube-api-access-7zrdl\") pod \"crc-debug-qzthr\" (UID: \"57f8a675-80b0-450c-9aa2-8735b9e5520f\") " pod="openshift-must-gather-7ggbl/crc-debug-qzthr" Mar 20 16:28:10 crc kubenswrapper[4779]: I0320 16:28:10.112970 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57f8a675-80b0-450c-9aa2-8735b9e5520f-host\") pod \"crc-debug-qzthr\" (UID: \"57f8a675-80b0-450c-9aa2-8735b9e5520f\") " pod="openshift-must-gather-7ggbl/crc-debug-qzthr" Mar 20 16:28:10 crc kubenswrapper[4779]: I0320 16:28:10.215350 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57f8a675-80b0-450c-9aa2-8735b9e5520f-host\") pod \"crc-debug-qzthr\" (UID: \"57f8a675-80b0-450c-9aa2-8735b9e5520f\") " pod="openshift-must-gather-7ggbl/crc-debug-qzthr" Mar 20 16:28:10 crc kubenswrapper[4779]: I0320 16:28:10.215454 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57f8a675-80b0-450c-9aa2-8735b9e5520f-host\") pod \"crc-debug-qzthr\" (UID: \"57f8a675-80b0-450c-9aa2-8735b9e5520f\") " pod="openshift-must-gather-7ggbl/crc-debug-qzthr" Mar 20 16:28:10 crc kubenswrapper[4779]: I0320 16:28:10.215789 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zrdl\" (UniqueName: \"kubernetes.io/projected/57f8a675-80b0-450c-9aa2-8735b9e5520f-kube-api-access-7zrdl\") pod \"crc-debug-qzthr\" (UID: \"57f8a675-80b0-450c-9aa2-8735b9e5520f\") " pod="openshift-must-gather-7ggbl/crc-debug-qzthr" Mar 20 16:28:10 crc 
kubenswrapper[4779]: I0320 16:28:10.235957 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zrdl\" (UniqueName: \"kubernetes.io/projected/57f8a675-80b0-450c-9aa2-8735b9e5520f-kube-api-access-7zrdl\") pod \"crc-debug-qzthr\" (UID: \"57f8a675-80b0-450c-9aa2-8735b9e5520f\") " pod="openshift-must-gather-7ggbl/crc-debug-qzthr" Mar 20 16:28:10 crc kubenswrapper[4779]: I0320 16:28:10.286926 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7ggbl/crc-debug-qzthr" Mar 20 16:28:10 crc kubenswrapper[4779]: I0320 16:28:10.649068 4779 generic.go:334] "Generic (PLEG): container finished" podID="57f8a675-80b0-450c-9aa2-8735b9e5520f" containerID="769d57dbe47869b954e0b703a8f455b32f521cee5fdb0fc1a802ec6747db5bb3" exitCode=0 Mar 20 16:28:10 crc kubenswrapper[4779]: I0320 16:28:10.649136 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7ggbl/crc-debug-qzthr" event={"ID":"57f8a675-80b0-450c-9aa2-8735b9e5520f","Type":"ContainerDied","Data":"769d57dbe47869b954e0b703a8f455b32f521cee5fdb0fc1a802ec6747db5bb3"} Mar 20 16:28:10 crc kubenswrapper[4779]: I0320 16:28:10.649194 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7ggbl/crc-debug-qzthr" event={"ID":"57f8a675-80b0-450c-9aa2-8735b9e5520f","Type":"ContainerStarted","Data":"f2f1bb35681be631a0228bd5958da612d8335139bebb23dd334745eb616b3c2a"} Mar 20 16:28:11 crc kubenswrapper[4779]: I0320 16:28:11.100532 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7ggbl/crc-debug-qzthr"] Mar 20 16:28:11 crc kubenswrapper[4779]: I0320 16:28:11.117651 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7ggbl/crc-debug-qzthr"] Mar 20 16:28:11 crc kubenswrapper[4779]: I0320 16:28:11.771494 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7ggbl/crc-debug-qzthr" Mar 20 16:28:11 crc kubenswrapper[4779]: I0320 16:28:11.846538 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zrdl\" (UniqueName: \"kubernetes.io/projected/57f8a675-80b0-450c-9aa2-8735b9e5520f-kube-api-access-7zrdl\") pod \"57f8a675-80b0-450c-9aa2-8735b9e5520f\" (UID: \"57f8a675-80b0-450c-9aa2-8735b9e5520f\") " Mar 20 16:28:11 crc kubenswrapper[4779]: I0320 16:28:11.846596 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57f8a675-80b0-450c-9aa2-8735b9e5520f-host\") pod \"57f8a675-80b0-450c-9aa2-8735b9e5520f\" (UID: \"57f8a675-80b0-450c-9aa2-8735b9e5520f\") " Mar 20 16:28:11 crc kubenswrapper[4779]: I0320 16:28:11.846753 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57f8a675-80b0-450c-9aa2-8735b9e5520f-host" (OuterVolumeSpecName: "host") pod "57f8a675-80b0-450c-9aa2-8735b9e5520f" (UID: "57f8a675-80b0-450c-9aa2-8735b9e5520f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:28:11 crc kubenswrapper[4779]: I0320 16:28:11.847266 4779 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57f8a675-80b0-450c-9aa2-8735b9e5520f-host\") on node \"crc\" DevicePath \"\"" Mar 20 16:28:11 crc kubenswrapper[4779]: I0320 16:28:11.852297 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57f8a675-80b0-450c-9aa2-8735b9e5520f-kube-api-access-7zrdl" (OuterVolumeSpecName: "kube-api-access-7zrdl") pod "57f8a675-80b0-450c-9aa2-8735b9e5520f" (UID: "57f8a675-80b0-450c-9aa2-8735b9e5520f"). InnerVolumeSpecName "kube-api-access-7zrdl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:28:11 crc kubenswrapper[4779]: I0320 16:28:11.949328 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zrdl\" (UniqueName: \"kubernetes.io/projected/57f8a675-80b0-450c-9aa2-8735b9e5520f-kube-api-access-7zrdl\") on node \"crc\" DevicePath \"\"" Mar 20 16:28:12 crc kubenswrapper[4779]: I0320 16:28:12.328719 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7ggbl/crc-debug-rtk6f"] Mar 20 16:28:12 crc kubenswrapper[4779]: E0320 16:28:12.331157 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57f8a675-80b0-450c-9aa2-8735b9e5520f" containerName="container-00" Mar 20 16:28:12 crc kubenswrapper[4779]: I0320 16:28:12.331249 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="57f8a675-80b0-450c-9aa2-8735b9e5520f" containerName="container-00" Mar 20 16:28:12 crc kubenswrapper[4779]: I0320 16:28:12.331744 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="57f8a675-80b0-450c-9aa2-8735b9e5520f" containerName="container-00" Mar 20 16:28:12 crc kubenswrapper[4779]: I0320 16:28:12.334199 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7ggbl/crc-debug-rtk6f" Mar 20 16:28:12 crc kubenswrapper[4779]: I0320 16:28:12.457885 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c157641-dc0d-4c52-a2b7-5bc0236ed1b4-host\") pod \"crc-debug-rtk6f\" (UID: \"0c157641-dc0d-4c52-a2b7-5bc0236ed1b4\") " pod="openshift-must-gather-7ggbl/crc-debug-rtk6f" Mar 20 16:28:12 crc kubenswrapper[4779]: I0320 16:28:12.458642 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss9nn\" (UniqueName: \"kubernetes.io/projected/0c157641-dc0d-4c52-a2b7-5bc0236ed1b4-kube-api-access-ss9nn\") pod \"crc-debug-rtk6f\" (UID: \"0c157641-dc0d-4c52-a2b7-5bc0236ed1b4\") " pod="openshift-must-gather-7ggbl/crc-debug-rtk6f" Mar 20 16:28:12 crc kubenswrapper[4779]: I0320 16:28:12.560104 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c157641-dc0d-4c52-a2b7-5bc0236ed1b4-host\") pod \"crc-debug-rtk6f\" (UID: \"0c157641-dc0d-4c52-a2b7-5bc0236ed1b4\") " pod="openshift-must-gather-7ggbl/crc-debug-rtk6f" Mar 20 16:28:12 crc kubenswrapper[4779]: I0320 16:28:12.560201 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c157641-dc0d-4c52-a2b7-5bc0236ed1b4-host\") pod \"crc-debug-rtk6f\" (UID: \"0c157641-dc0d-4c52-a2b7-5bc0236ed1b4\") " pod="openshift-must-gather-7ggbl/crc-debug-rtk6f" Mar 20 16:28:12 crc kubenswrapper[4779]: I0320 16:28:12.560232 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss9nn\" (UniqueName: \"kubernetes.io/projected/0c157641-dc0d-4c52-a2b7-5bc0236ed1b4-kube-api-access-ss9nn\") pod \"crc-debug-rtk6f\" (UID: \"0c157641-dc0d-4c52-a2b7-5bc0236ed1b4\") " pod="openshift-must-gather-7ggbl/crc-debug-rtk6f" Mar 20 16:28:12 crc 
kubenswrapper[4779]: I0320 16:28:12.582265 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss9nn\" (UniqueName: \"kubernetes.io/projected/0c157641-dc0d-4c52-a2b7-5bc0236ed1b4-kube-api-access-ss9nn\") pod \"crc-debug-rtk6f\" (UID: \"0c157641-dc0d-4c52-a2b7-5bc0236ed1b4\") " pod="openshift-must-gather-7ggbl/crc-debug-rtk6f" Mar 20 16:28:12 crc kubenswrapper[4779]: I0320 16:28:12.656299 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7ggbl/crc-debug-rtk6f" Mar 20 16:28:12 crc kubenswrapper[4779]: I0320 16:28:12.667833 4779 scope.go:117] "RemoveContainer" containerID="769d57dbe47869b954e0b703a8f455b32f521cee5fdb0fc1a802ec6747db5bb3" Mar 20 16:28:12 crc kubenswrapper[4779]: I0320 16:28:12.667872 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7ggbl/crc-debug-qzthr" Mar 20 16:28:12 crc kubenswrapper[4779]: W0320 16:28:12.697045 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c157641_dc0d_4c52_a2b7_5bc0236ed1b4.slice/crio-f2cbd24b795eda48f05869fb0923274f551d23ea7f3ce35e125933ddcb7a400b WatchSource:0}: Error finding container f2cbd24b795eda48f05869fb0923274f551d23ea7f3ce35e125933ddcb7a400b: Status 404 returned error can't find the container with id f2cbd24b795eda48f05869fb0923274f551d23ea7f3ce35e125933ddcb7a400b Mar 20 16:28:13 crc kubenswrapper[4779]: I0320 16:28:13.680249 4779 generic.go:334] "Generic (PLEG): container finished" podID="0c157641-dc0d-4c52-a2b7-5bc0236ed1b4" containerID="2e26ca35cae350efe308dc967f674ad74ce495f3d72ea897702ab3fe61534cba" exitCode=0 Mar 20 16:28:13 crc kubenswrapper[4779]: I0320 16:28:13.680558 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7ggbl/crc-debug-rtk6f" 
event={"ID":"0c157641-dc0d-4c52-a2b7-5bc0236ed1b4","Type":"ContainerDied","Data":"2e26ca35cae350efe308dc967f674ad74ce495f3d72ea897702ab3fe61534cba"} Mar 20 16:28:13 crc kubenswrapper[4779]: I0320 16:28:13.680588 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7ggbl/crc-debug-rtk6f" event={"ID":"0c157641-dc0d-4c52-a2b7-5bc0236ed1b4","Type":"ContainerStarted","Data":"f2cbd24b795eda48f05869fb0923274f551d23ea7f3ce35e125933ddcb7a400b"} Mar 20 16:28:13 crc kubenswrapper[4779]: I0320 16:28:13.725781 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7ggbl/crc-debug-rtk6f"] Mar 20 16:28:13 crc kubenswrapper[4779]: I0320 16:28:13.741186 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7ggbl/crc-debug-rtk6f"] Mar 20 16:28:13 crc kubenswrapper[4779]: I0320 16:28:13.818835 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57f8a675-80b0-450c-9aa2-8735b9e5520f" path="/var/lib/kubelet/pods/57f8a675-80b0-450c-9aa2-8735b9e5520f/volumes" Mar 20 16:28:15 crc kubenswrapper[4779]: I0320 16:28:15.389382 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7ggbl/crc-debug-rtk6f" Mar 20 16:28:15 crc kubenswrapper[4779]: I0320 16:28:15.476673 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss9nn\" (UniqueName: \"kubernetes.io/projected/0c157641-dc0d-4c52-a2b7-5bc0236ed1b4-kube-api-access-ss9nn\") pod \"0c157641-dc0d-4c52-a2b7-5bc0236ed1b4\" (UID: \"0c157641-dc0d-4c52-a2b7-5bc0236ed1b4\") " Mar 20 16:28:15 crc kubenswrapper[4779]: I0320 16:28:15.476805 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c157641-dc0d-4c52-a2b7-5bc0236ed1b4-host\") pod \"0c157641-dc0d-4c52-a2b7-5bc0236ed1b4\" (UID: \"0c157641-dc0d-4c52-a2b7-5bc0236ed1b4\") " Mar 20 16:28:15 crc kubenswrapper[4779]: I0320 16:28:15.477533 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c157641-dc0d-4c52-a2b7-5bc0236ed1b4-host" (OuterVolumeSpecName: "host") pod "0c157641-dc0d-4c52-a2b7-5bc0236ed1b4" (UID: "0c157641-dc0d-4c52-a2b7-5bc0236ed1b4"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:28:15 crc kubenswrapper[4779]: I0320 16:28:15.481760 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c157641-dc0d-4c52-a2b7-5bc0236ed1b4-kube-api-access-ss9nn" (OuterVolumeSpecName: "kube-api-access-ss9nn") pod "0c157641-dc0d-4c52-a2b7-5bc0236ed1b4" (UID: "0c157641-dc0d-4c52-a2b7-5bc0236ed1b4"). InnerVolumeSpecName "kube-api-access-ss9nn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:28:15 crc kubenswrapper[4779]: I0320 16:28:15.579891 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss9nn\" (UniqueName: \"kubernetes.io/projected/0c157641-dc0d-4c52-a2b7-5bc0236ed1b4-kube-api-access-ss9nn\") on node \"crc\" DevicePath \"\"" Mar 20 16:28:15 crc kubenswrapper[4779]: I0320 16:28:15.579927 4779 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c157641-dc0d-4c52-a2b7-5bc0236ed1b4-host\") on node \"crc\" DevicePath \"\"" Mar 20 16:28:15 crc kubenswrapper[4779]: I0320 16:28:15.780494 4779 scope.go:117] "RemoveContainer" containerID="2e26ca35cae350efe308dc967f674ad74ce495f3d72ea897702ab3fe61534cba" Mar 20 16:28:15 crc kubenswrapper[4779]: I0320 16:28:15.780536 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7ggbl/crc-debug-rtk6f" Mar 20 16:28:15 crc kubenswrapper[4779]: I0320 16:28:15.837939 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c157641-dc0d-4c52-a2b7-5bc0236ed1b4" path="/var/lib/kubelet/pods/0c157641-dc0d-4c52-a2b7-5bc0236ed1b4/volumes" Mar 20 16:28:18 crc kubenswrapper[4779]: I0320 16:28:18.809124 4779 scope.go:117] "RemoveContainer" containerID="0d4a4381088e0c6e4439f94508ac137582cc59c538515a9138610168fa6ff57d" Mar 20 16:28:18 crc kubenswrapper[4779]: E0320 16:28:18.809866 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:28:29 crc kubenswrapper[4779]: I0320 16:28:29.809619 4779 scope.go:117] "RemoveContainer" 
containerID="0d4a4381088e0c6e4439f94508ac137582cc59c538515a9138610168fa6ff57d" Mar 20 16:28:29 crc kubenswrapper[4779]: E0320 16:28:29.810390 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:28:41 crc kubenswrapper[4779]: I0320 16:28:41.809049 4779 scope.go:117] "RemoveContainer" containerID="0d4a4381088e0c6e4439f94508ac137582cc59c538515a9138610168fa6ff57d" Mar 20 16:28:41 crc kubenswrapper[4779]: E0320 16:28:41.809938 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:28:45 crc kubenswrapper[4779]: I0320 16:28:45.213766 4779 scope.go:117] "RemoveContainer" containerID="390821a777ce4156d0e9577380d9870d306381cecad54f94f96d7332023a12d5" Mar 20 16:28:49 crc kubenswrapper[4779]: I0320 16:28:49.485122 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5bfc6968f8-nhbpx_c0cbf17b-b63c-429d-8f23-694e5c0d566a/barbican-api/0.log" Mar 20 16:28:49 crc kubenswrapper[4779]: I0320 16:28:49.672585 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5bfc6968f8-nhbpx_c0cbf17b-b63c-429d-8f23-694e5c0d566a/barbican-api-log/0.log" Mar 20 16:28:49 crc kubenswrapper[4779]: I0320 16:28:49.784908 4779 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-557b5d954b-f5w4x_ed8a54dd-3146-4a2c-a0b9-ee9cbd3b1294/barbican-keystone-listener/0.log" Mar 20 16:28:49 crc kubenswrapper[4779]: I0320 16:28:49.825709 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-557b5d954b-f5w4x_ed8a54dd-3146-4a2c-a0b9-ee9cbd3b1294/barbican-keystone-listener-log/0.log" Mar 20 16:28:49 crc kubenswrapper[4779]: I0320 16:28:49.893068 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-57f6557d6c-cps7z_0d481333-1c1f-49eb-a304-cfb5861d2cf4/barbican-worker/0.log" Mar 20 16:28:49 crc kubenswrapper[4779]: I0320 16:28:49.991660 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-57f6557d6c-cps7z_0d481333-1c1f-49eb-a304-cfb5861d2cf4/barbican-worker-log/0.log" Mar 20 16:28:50 crc kubenswrapper[4779]: I0320 16:28:50.278048 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f6d5528f-e060-40ac-91c7-ac53aef84cb5/ceilometer-central-agent/0.log" Mar 20 16:28:50 crc kubenswrapper[4779]: I0320 16:28:50.328449 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-7kqb2_b4aee649-af38-4064-ba29-bf2837b4c652/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 16:28:50 crc kubenswrapper[4779]: I0320 16:28:50.384909 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f6d5528f-e060-40ac-91c7-ac53aef84cb5/ceilometer-notification-agent/0.log" Mar 20 16:28:50 crc kubenswrapper[4779]: I0320 16:28:50.448924 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f6d5528f-e060-40ac-91c7-ac53aef84cb5/proxy-httpd/0.log" Mar 20 16:28:50 crc kubenswrapper[4779]: I0320 16:28:50.472302 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f6d5528f-e060-40ac-91c7-ac53aef84cb5/sg-core/0.log" Mar 
20 16:28:50 crc kubenswrapper[4779]: I0320 16:28:50.630261 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_ee20d921-f263-4089-b0c5-06ebfed15478/cinder-api/0.log" Mar 20 16:28:50 crc kubenswrapper[4779]: I0320 16:28:50.675279 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_ee20d921-f263-4089-b0c5-06ebfed15478/cinder-api-log/0.log" Mar 20 16:28:50 crc kubenswrapper[4779]: I0320 16:28:50.917316 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7409303d-4c67-4f93-b2c8-c6be3336df88/cinder-scheduler/0.log" Mar 20 16:28:50 crc kubenswrapper[4779]: I0320 16:28:50.959244 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7409303d-4c67-4f93-b2c8-c6be3336df88/probe/0.log" Mar 20 16:28:51 crc kubenswrapper[4779]: I0320 16:28:51.157693 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-76p96_0fa051fc-8b6e-4d4b-b5f3-2a4660aafaeb/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 16:28:51 crc kubenswrapper[4779]: I0320 16:28:51.381067 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7fd9f947b7-44qdw_b3c10bd0-d7e5-42cb-8609-4a7692300f37/init/0.log" Mar 20 16:28:51 crc kubenswrapper[4779]: I0320 16:28:51.400467 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-6ckgf_5181a11a-08d8-4440-acc3-c8e6c65edded/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 16:28:51 crc kubenswrapper[4779]: I0320 16:28:51.552681 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7fd9f947b7-44qdw_b3c10bd0-d7e5-42cb-8609-4a7692300f37/init/0.log" Mar 20 16:28:51 crc kubenswrapper[4779]: I0320 16:28:51.598325 4779 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-7fd9f947b7-44qdw_b3c10bd0-d7e5-42cb-8609-4a7692300f37/dnsmasq-dns/0.log" Mar 20 16:28:51 crc kubenswrapper[4779]: I0320 16:28:51.749332 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-nkttv_14a567a6-af89-4bcb-84ae-630e3546007f/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 16:28:51 crc kubenswrapper[4779]: I0320 16:28:51.778871 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ba2590d6-15fb-4232-9bed-dc6a44e56241/glance-httpd/0.log" Mar 20 16:28:51 crc kubenswrapper[4779]: I0320 16:28:51.832633 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ba2590d6-15fb-4232-9bed-dc6a44e56241/glance-log/0.log" Mar 20 16:28:51 crc kubenswrapper[4779]: I0320 16:28:51.979133 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_b801f31f-e923-4529-9d41-299903b3d167/glance-log/0.log" Mar 20 16:28:51 crc kubenswrapper[4779]: I0320 16:28:51.988052 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_b801f31f-e923-4529-9d41-299903b3d167/glance-httpd/0.log" Mar 20 16:28:52 crc kubenswrapper[4779]: I0320 16:28:52.320331 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-dc78779bd-shkrw_dddea4aa-aba6-49f1-a8bc-6ce9850da26d/horizon/0.log" Mar 20 16:28:52 crc kubenswrapper[4779]: I0320 16:28:52.390848 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-qhsgs_3f3c88f1-f136-4908-bf87-9a114dc67cf7/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 16:28:52 crc kubenswrapper[4779]: I0320 16:28:52.614885 4779 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_horizon-dc78779bd-shkrw_dddea4aa-aba6-49f1-a8bc-6ce9850da26d/horizon-log/0.log" Mar 20 16:28:52 crc kubenswrapper[4779]: I0320 16:28:52.916381 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29567041-mqqts_1d9fcb2d-01f5-4d3b-8586-b0b700e68755/keystone-cron/0.log" Mar 20 16:28:53 crc kubenswrapper[4779]: I0320 16:28:53.028365 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6878c9db6b-r5g2t_20f41ef6-17fa-4c90-a902-d2f5efdf45f5/keystone-api/0.log" Mar 20 16:28:53 crc kubenswrapper[4779]: I0320 16:28:53.083916 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_6a048aaf-5953-4a6d-8aa3-631fe9ea027b/kube-state-metrics/0.log" Mar 20 16:28:53 crc kubenswrapper[4779]: I0320 16:28:53.092211 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-p6d88_b412995c-cebc-47ec-8ba2-9644a0f65c18/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 16:28:53 crc kubenswrapper[4779]: I0320 16:28:53.705992 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-854c6767b7-nlddt_b8707035-1583-426d-aa57-b5a23e7403b3/neutron-api/0.log" Mar 20 16:28:53 crc kubenswrapper[4779]: I0320 16:28:53.762954 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-854c6767b7-nlddt_b8707035-1583-426d-aa57-b5a23e7403b3/neutron-httpd/0.log" Mar 20 16:28:53 crc kubenswrapper[4779]: I0320 16:28:53.814415 4779 scope.go:117] "RemoveContainer" containerID="0d4a4381088e0c6e4439f94508ac137582cc59c538515a9138610168fa6ff57d" Mar 20 16:28:53 crc kubenswrapper[4779]: E0320 16:28:53.814868 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:28:54 crc kubenswrapper[4779]: I0320 16:28:54.097840 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnrwt_9bbeb643-62cf-4487-b133-c1c618fe49d7/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 16:28:54 crc kubenswrapper[4779]: I0320 16:28:54.584951 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-44j5q_6e8b60eb-d9f8-4956-91eb-9d0b760f4df1/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 16:28:54 crc kubenswrapper[4779]: I0320 16:28:54.643313 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_875afdcb-bded-47ea-98e0-2acabc9441ee/nova-cell0-conductor-conductor/0.log" Mar 20 16:28:54 crc kubenswrapper[4779]: I0320 16:28:54.719591 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_65cc86a8-23ad-428c-a20a-e932e6cfebd1/nova-api-log/0.log" Mar 20 16:28:54 crc kubenswrapper[4779]: I0320 16:28:54.976490 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_65cc86a8-23ad-428c-a20a-e932e6cfebd1/nova-api-api/0.log" Mar 20 16:28:54 crc kubenswrapper[4779]: I0320 16:28:54.982331 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_54560a34-b088-437c-8fbc-66c38e38a81a/nova-cell1-conductor-conductor/0.log" Mar 20 16:28:55 crc kubenswrapper[4779]: I0320 16:28:55.167489 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_4f67aa28-790a-4491-bac4-a64e60acf7ef/nova-cell1-novncproxy-novncproxy/0.log" Mar 20 16:28:55 crc kubenswrapper[4779]: I0320 16:28:55.504126 4779 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a790ab0f-1ec3-486a-95bf-932d6c088c08/nova-metadata-log/0.log" Mar 20 16:28:55 crc kubenswrapper[4779]: I0320 16:28:55.925897 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a790ab0f-1ec3-486a-95bf-932d6c088c08/nova-metadata-metadata/0.log" Mar 20 16:28:55 crc kubenswrapper[4779]: I0320 16:28:55.956420 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_77e9c453-fbe2-4ad3-a2eb-c11a21ed70e5/nova-scheduler-scheduler/0.log" Mar 20 16:28:55 crc kubenswrapper[4779]: I0320 16:28:55.973731 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_0bb35660-3a34-4c16-a943-4375cfe12246/mysql-bootstrap/0.log" Mar 20 16:28:56 crc kubenswrapper[4779]: I0320 16:28:56.175175 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_0bb35660-3a34-4c16-a943-4375cfe12246/mysql-bootstrap/0.log" Mar 20 16:28:56 crc kubenswrapper[4779]: I0320 16:28:56.222747 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_0bb35660-3a34-4c16-a943-4375cfe12246/galera/0.log" Mar 20 16:28:56 crc kubenswrapper[4779]: I0320 16:28:56.377403 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-tdlvd_69a85922-6ad1-4e70-bbc9-18a0cdc178f1/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 16:28:56 crc kubenswrapper[4779]: I0320 16:28:56.435433 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_9b8436ea-e4a7-4a6b-a6a2-d9282bda9696/mysql-bootstrap/0.log" Mar 20 16:28:56 crc kubenswrapper[4779]: I0320 16:28:56.629519 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_9b8436ea-e4a7-4a6b-a6a2-d9282bda9696/galera/0.log" Mar 20 16:28:56 crc kubenswrapper[4779]: I0320 16:28:56.679697 4779 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_9b8436ea-e4a7-4a6b-a6a2-d9282bda9696/mysql-bootstrap/0.log" Mar 20 16:28:56 crc kubenswrapper[4779]: I0320 16:28:56.814189 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_5191a987-7d86-4738-b0d2-e56edcff519e/openstackclient/0.log" Mar 20 16:28:56 crc kubenswrapper[4779]: I0320 16:28:56.947452 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-q6sf7_f27ef3ea-fc3e-47ed-8d5a-ddbda6e3927e/openstack-network-exporter/0.log" Mar 20 16:28:57 crc kubenswrapper[4779]: I0320 16:28:57.037413 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bvnhn_2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3/ovsdb-server-init/0.log" Mar 20 16:28:57 crc kubenswrapper[4779]: I0320 16:28:57.294140 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bvnhn_2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3/ovsdb-server-init/0.log" Mar 20 16:28:57 crc kubenswrapper[4779]: I0320 16:28:57.311581 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bvnhn_2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3/ovs-vswitchd/0.log" Mar 20 16:28:57 crc kubenswrapper[4779]: I0320 16:28:57.324571 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bvnhn_2eeb0ff9-1bf6-4e5b-a11f-c3d85df38cb3/ovsdb-server/0.log" Mar 20 16:28:57 crc kubenswrapper[4779]: I0320 16:28:57.561505 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-s9wt4_73a3debf-8de1-4321-b383-5bebca909a38/ovn-controller/0.log" Mar 20 16:28:57 crc kubenswrapper[4779]: I0320 16:28:57.736472 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-vlwk8_55a17de8-e79b-4949-8dfa-d5c4f1fd8917/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 16:28:57 crc kubenswrapper[4779]: 
I0320 16:28:57.765972 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_4c00c5fd-27bb-4e67-bbfa-374e073d15df/ovn-northd/0.log" Mar 20 16:28:57 crc kubenswrapper[4779]: I0320 16:28:57.767834 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_4c00c5fd-27bb-4e67-bbfa-374e073d15df/openstack-network-exporter/0.log" Mar 20 16:28:57 crc kubenswrapper[4779]: I0320 16:28:57.959776 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_09f74271-1a65-4b70-a927-a4dd7de65360/openstack-network-exporter/0.log" Mar 20 16:28:58 crc kubenswrapper[4779]: I0320 16:28:58.030995 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_09f74271-1a65-4b70-a927-a4dd7de65360/ovsdbserver-nb/0.log" Mar 20 16:28:58 crc kubenswrapper[4779]: I0320 16:28:58.177406 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_baa96376-1415-4a97-853e-cde55a1d6860/openstack-network-exporter/0.log" Mar 20 16:28:58 crc kubenswrapper[4779]: I0320 16:28:58.178450 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_baa96376-1415-4a97-853e-cde55a1d6860/ovsdbserver-sb/0.log" Mar 20 16:28:58 crc kubenswrapper[4779]: I0320 16:28:58.653988 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-c69fcdb9d-ntwph_a64425ab-a231-4dd1-818e-156d70a0864d/placement-api/0.log" Mar 20 16:28:58 crc kubenswrapper[4779]: I0320 16:28:58.662403 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_81f80e68-614e-43b0-966a-e487faa0db31/init-config-reloader/0.log" Mar 20 16:28:58 crc kubenswrapper[4779]: I0320 16:28:58.692564 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-c69fcdb9d-ntwph_a64425ab-a231-4dd1-818e-156d70a0864d/placement-log/0.log" Mar 20 16:28:58 crc kubenswrapper[4779]: I0320 16:28:58.879158 
4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_81f80e68-614e-43b0-966a-e487faa0db31/init-config-reloader/0.log" Mar 20 16:28:58 crc kubenswrapper[4779]: I0320 16:28:58.934304 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_81f80e68-614e-43b0-966a-e487faa0db31/config-reloader/0.log" Mar 20 16:28:58 crc kubenswrapper[4779]: I0320 16:28:58.942509 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_81f80e68-614e-43b0-966a-e487faa0db31/prometheus/0.log" Mar 20 16:28:59 crc kubenswrapper[4779]: I0320 16:28:59.030262 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_81f80e68-614e-43b0-966a-e487faa0db31/thanos-sidecar/0.log" Mar 20 16:28:59 crc kubenswrapper[4779]: I0320 16:28:59.227711 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a91eab41-69fe-44e7-a239-6956a6b18dd8/setup-container/0.log" Mar 20 16:28:59 crc kubenswrapper[4779]: I0320 16:28:59.372656 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a91eab41-69fe-44e7-a239-6956a6b18dd8/rabbitmq/0.log" Mar 20 16:28:59 crc kubenswrapper[4779]: I0320 16:28:59.390840 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a91eab41-69fe-44e7-a239-6956a6b18dd8/setup-container/0.log" Mar 20 16:28:59 crc kubenswrapper[4779]: I0320 16:28:59.458399 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_9bba48d4-65a1-44b2-b750-6a7f27613e63/setup-container/0.log" Mar 20 16:28:59 crc kubenswrapper[4779]: I0320 16:28:59.673229 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_9bba48d4-65a1-44b2-b750-6a7f27613e63/setup-container/0.log" Mar 20 16:28:59 crc kubenswrapper[4779]: I0320 16:28:59.691247 4779 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_9bba48d4-65a1-44b2-b750-6a7f27613e63/rabbitmq/0.log" Mar 20 16:28:59 crc kubenswrapper[4779]: I0320 16:28:59.750478 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-ct9xk_6f592faf-9ec7-4ae5-8e95-3b45d7de27ce/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 16:28:59 crc kubenswrapper[4779]: I0320 16:28:59.918402 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-zlfwz_c6abe891-2e21-41f3-a2eb-738a62807090/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 16:28:59 crc kubenswrapper[4779]: I0320 16:28:59.981359 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-62rks_e9abd3a1-8135-495d-b4f0-3f3569c40751/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 16:29:00 crc kubenswrapper[4779]: I0320 16:29:00.154156 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-px8s4_4448472a-99b1-4060-aab8-4301c9de37a2/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 16:29:00 crc kubenswrapper[4779]: I0320 16:29:00.257934 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-zggqb_a81a28a7-b38a-4491-ba2a-7de988a7e02f/ssh-known-hosts-edpm-deployment/0.log" Mar 20 16:29:00 crc kubenswrapper[4779]: I0320 16:29:00.415433 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5f8794546c-r7zls_ad4b673b-e65e-49d0-92bd-138da686c2eb/proxy-server/0.log" Mar 20 16:29:00 crc kubenswrapper[4779]: I0320 16:29:00.512273 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5f8794546c-r7zls_ad4b673b-e65e-49d0-92bd-138da686c2eb/proxy-httpd/0.log" Mar 20 16:29:00 crc kubenswrapper[4779]: I0320 
16:29:00.521963 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-s9ls7_8a65e85e-076d-42e4-88fb-5bb905893173/swift-ring-rebalance/0.log" Mar 20 16:29:00 crc kubenswrapper[4779]: I0320 16:29:00.698391 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b/account-auditor/0.log" Mar 20 16:29:00 crc kubenswrapper[4779]: I0320 16:29:00.766421 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b/account-replicator/0.log" Mar 20 16:29:00 crc kubenswrapper[4779]: I0320 16:29:00.789424 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b/account-reaper/0.log" Mar 20 16:29:00 crc kubenswrapper[4779]: I0320 16:29:00.843242 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b/account-server/0.log" Mar 20 16:29:00 crc kubenswrapper[4779]: I0320 16:29:00.920507 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b/container-auditor/0.log" Mar 20 16:29:00 crc kubenswrapper[4779]: I0320 16:29:00.985868 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b/container-replicator/0.log" Mar 20 16:29:01 crc kubenswrapper[4779]: I0320 16:29:01.054930 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b/container-updater/0.log" Mar 20 16:29:01 crc kubenswrapper[4779]: I0320 16:29:01.055035 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b/container-server/0.log" Mar 20 16:29:01 crc kubenswrapper[4779]: I0320 16:29:01.145317 4779 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_swift-storage-0_c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b/object-auditor/0.log" Mar 20 16:29:01 crc kubenswrapper[4779]: I0320 16:29:01.195412 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b/object-expirer/0.log" Mar 20 16:29:01 crc kubenswrapper[4779]: I0320 16:29:01.276315 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b/object-replicator/0.log" Mar 20 16:29:01 crc kubenswrapper[4779]: I0320 16:29:01.286480 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b/object-server/0.log" Mar 20 16:29:01 crc kubenswrapper[4779]: I0320 16:29:01.404742 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b/rsync/0.log" Mar 20 16:29:01 crc kubenswrapper[4779]: I0320 16:29:01.416310 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b/object-updater/0.log" Mar 20 16:29:01 crc kubenswrapper[4779]: I0320 16:29:01.550502 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8ecc2ee-cb89-426e-8ac8-3037c5ca5a1b/swift-recon-cron/0.log" Mar 20 16:29:01 crc kubenswrapper[4779]: I0320 16:29:01.863609 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_4ca8f9c7-d49d-4a13-8b2d-99c06523ce13/test-operator-logs-container/0.log" Mar 20 16:29:02 crc kubenswrapper[4779]: I0320 16:29:02.067387 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_19f31371-34ad-444f-98e9-1cd99dbe6b24/tempest-tests-tempest-tests-runner/0.log" Mar 20 16:29:02 crc kubenswrapper[4779]: I0320 16:29:02.096545 4779 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-7jplj_277792af-615f-44a7-b90a-3b5041ce1aa2/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 16:29:02 crc kubenswrapper[4779]: I0320 16:29:02.322950 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-9n27c_8e63bf33-6a78-4424-97d5-7eed781817d1/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 16:29:02 crc kubenswrapper[4779]: I0320 16:29:02.630905 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_b3d25ade-4950-40ca-8481-f52c063ff998/watcher-applier/0.log" Mar 20 16:29:02 crc kubenswrapper[4779]: I0320 16:29:02.763142 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_ad68dcd5-048d-44fe-a6d7-583068d7b361/watcher-api-log/0.log" Mar 20 16:29:03 crc kubenswrapper[4779]: I0320 16:29:03.320502 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_1ca673b5-8809-4a16-9534-12b58c0f3cb9/watcher-decision-engine/0.log" Mar 20 16:29:04 crc kubenswrapper[4779]: I0320 16:29:04.588919 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_ad68dcd5-048d-44fe-a6d7-583068d7b361/watcher-api/0.log" Mar 20 16:29:04 crc kubenswrapper[4779]: I0320 16:29:04.809943 4779 scope.go:117] "RemoveContainer" containerID="0d4a4381088e0c6e4439f94508ac137582cc59c538515a9138610168fa6ff57d" Mar 20 16:29:04 crc kubenswrapper[4779]: E0320 16:29:04.810194 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" 
podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:29:16 crc kubenswrapper[4779]: I0320 16:29:16.808497 4779 scope.go:117] "RemoveContainer" containerID="0d4a4381088e0c6e4439f94508ac137582cc59c538515a9138610168fa6ff57d" Mar 20 16:29:16 crc kubenswrapper[4779]: E0320 16:29:16.811006 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:29:18 crc kubenswrapper[4779]: I0320 16:29:18.605022 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_957258f0-7df6-4c8e-9474-8693ab779860/memcached/0.log" Mar 20 16:29:23 crc kubenswrapper[4779]: I0320 16:29:23.532332 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-llwh6"] Mar 20 16:29:23 crc kubenswrapper[4779]: E0320 16:29:23.533253 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c157641-dc0d-4c52-a2b7-5bc0236ed1b4" containerName="container-00" Mar 20 16:29:23 crc kubenswrapper[4779]: I0320 16:29:23.533267 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c157641-dc0d-4c52-a2b7-5bc0236ed1b4" containerName="container-00" Mar 20 16:29:23 crc kubenswrapper[4779]: I0320 16:29:23.533514 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c157641-dc0d-4c52-a2b7-5bc0236ed1b4" containerName="container-00" Mar 20 16:29:23 crc kubenswrapper[4779]: I0320 16:29:23.534910 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-llwh6" Mar 20 16:29:23 crc kubenswrapper[4779]: I0320 16:29:23.548182 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-llwh6"] Mar 20 16:29:23 crc kubenswrapper[4779]: I0320 16:29:23.641312 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3399eac0-7177-46f8-9054-67a3c8debbf7-utilities\") pod \"redhat-operators-llwh6\" (UID: \"3399eac0-7177-46f8-9054-67a3c8debbf7\") " pod="openshift-marketplace/redhat-operators-llwh6" Mar 20 16:29:23 crc kubenswrapper[4779]: I0320 16:29:23.641624 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3399eac0-7177-46f8-9054-67a3c8debbf7-catalog-content\") pod \"redhat-operators-llwh6\" (UID: \"3399eac0-7177-46f8-9054-67a3c8debbf7\") " pod="openshift-marketplace/redhat-operators-llwh6" Mar 20 16:29:23 crc kubenswrapper[4779]: I0320 16:29:23.641847 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgkk9\" (UniqueName: \"kubernetes.io/projected/3399eac0-7177-46f8-9054-67a3c8debbf7-kube-api-access-qgkk9\") pod \"redhat-operators-llwh6\" (UID: \"3399eac0-7177-46f8-9054-67a3c8debbf7\") " pod="openshift-marketplace/redhat-operators-llwh6" Mar 20 16:29:23 crc kubenswrapper[4779]: I0320 16:29:23.743972 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgkk9\" (UniqueName: \"kubernetes.io/projected/3399eac0-7177-46f8-9054-67a3c8debbf7-kube-api-access-qgkk9\") pod \"redhat-operators-llwh6\" (UID: \"3399eac0-7177-46f8-9054-67a3c8debbf7\") " pod="openshift-marketplace/redhat-operators-llwh6" Mar 20 16:29:23 crc kubenswrapper[4779]: I0320 16:29:23.744088 4779 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3399eac0-7177-46f8-9054-67a3c8debbf7-utilities\") pod \"redhat-operators-llwh6\" (UID: \"3399eac0-7177-46f8-9054-67a3c8debbf7\") " pod="openshift-marketplace/redhat-operators-llwh6" Mar 20 16:29:23 crc kubenswrapper[4779]: I0320 16:29:23.744178 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3399eac0-7177-46f8-9054-67a3c8debbf7-catalog-content\") pod \"redhat-operators-llwh6\" (UID: \"3399eac0-7177-46f8-9054-67a3c8debbf7\") " pod="openshift-marketplace/redhat-operators-llwh6" Mar 20 16:29:23 crc kubenswrapper[4779]: I0320 16:29:23.744689 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3399eac0-7177-46f8-9054-67a3c8debbf7-utilities\") pod \"redhat-operators-llwh6\" (UID: \"3399eac0-7177-46f8-9054-67a3c8debbf7\") " pod="openshift-marketplace/redhat-operators-llwh6" Mar 20 16:29:23 crc kubenswrapper[4779]: I0320 16:29:23.744707 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3399eac0-7177-46f8-9054-67a3c8debbf7-catalog-content\") pod \"redhat-operators-llwh6\" (UID: \"3399eac0-7177-46f8-9054-67a3c8debbf7\") " pod="openshift-marketplace/redhat-operators-llwh6" Mar 20 16:29:23 crc kubenswrapper[4779]: I0320 16:29:23.766073 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgkk9\" (UniqueName: \"kubernetes.io/projected/3399eac0-7177-46f8-9054-67a3c8debbf7-kube-api-access-qgkk9\") pod \"redhat-operators-llwh6\" (UID: \"3399eac0-7177-46f8-9054-67a3c8debbf7\") " pod="openshift-marketplace/redhat-operators-llwh6" Mar 20 16:29:23 crc kubenswrapper[4779]: I0320 16:29:23.855328 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-llwh6" Mar 20 16:29:24 crc kubenswrapper[4779]: I0320 16:29:24.454929 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-llwh6"] Mar 20 16:29:25 crc kubenswrapper[4779]: I0320 16:29:25.409002 4779 generic.go:334] "Generic (PLEG): container finished" podID="3399eac0-7177-46f8-9054-67a3c8debbf7" containerID="8189182c33ca07926b17525a752448b7897f36eb8a193d11491ac1ac7732ecfe" exitCode=0 Mar 20 16:29:25 crc kubenswrapper[4779]: I0320 16:29:25.409094 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-llwh6" event={"ID":"3399eac0-7177-46f8-9054-67a3c8debbf7","Type":"ContainerDied","Data":"8189182c33ca07926b17525a752448b7897f36eb8a193d11491ac1ac7732ecfe"} Mar 20 16:29:25 crc kubenswrapper[4779]: I0320 16:29:25.409449 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-llwh6" event={"ID":"3399eac0-7177-46f8-9054-67a3c8debbf7","Type":"ContainerStarted","Data":"17d28bb1bbbe2706a86862a97b55b697b874ba9b6096ccc114c645450c95c390"} Mar 20 16:29:26 crc kubenswrapper[4779]: I0320 16:29:26.423236 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-llwh6" event={"ID":"3399eac0-7177-46f8-9054-67a3c8debbf7","Type":"ContainerStarted","Data":"6f23aeb72ae88af77417b708a70e33838e966366e719912202ef259d1ec7f1ab"} Mar 20 16:29:31 crc kubenswrapper[4779]: I0320 16:29:31.809315 4779 scope.go:117] "RemoveContainer" containerID="0d4a4381088e0c6e4439f94508ac137582cc59c538515a9138610168fa6ff57d" Mar 20 16:29:31 crc kubenswrapper[4779]: E0320 16:29:31.809884 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:29:35 crc kubenswrapper[4779]: I0320 16:29:35.171007 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7a5d6df43e82bcd9f777035cdd0143ddec736ae8e19d633ac86bc7d1d1wfq6w_d0f47857-a2c3-4fe2-9a59-2b178811fda5/util/0.log" Mar 20 16:29:35 crc kubenswrapper[4779]: I0320 16:29:35.393960 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7a5d6df43e82bcd9f777035cdd0143ddec736ae8e19d633ac86bc7d1d1wfq6w_d0f47857-a2c3-4fe2-9a59-2b178811fda5/pull/0.log" Mar 20 16:29:35 crc kubenswrapper[4779]: I0320 16:29:35.412139 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7a5d6df43e82bcd9f777035cdd0143ddec736ae8e19d633ac86bc7d1d1wfq6w_d0f47857-a2c3-4fe2-9a59-2b178811fda5/util/0.log" Mar 20 16:29:35 crc kubenswrapper[4779]: I0320 16:29:35.433969 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7a5d6df43e82bcd9f777035cdd0143ddec736ae8e19d633ac86bc7d1d1wfq6w_d0f47857-a2c3-4fe2-9a59-2b178811fda5/pull/0.log" Mar 20 16:29:35 crc kubenswrapper[4779]: I0320 16:29:35.504919 4779 generic.go:334] "Generic (PLEG): container finished" podID="3399eac0-7177-46f8-9054-67a3c8debbf7" containerID="6f23aeb72ae88af77417b708a70e33838e966366e719912202ef259d1ec7f1ab" exitCode=0 Mar 20 16:29:35 crc kubenswrapper[4779]: I0320 16:29:35.504965 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-llwh6" event={"ID":"3399eac0-7177-46f8-9054-67a3c8debbf7","Type":"ContainerDied","Data":"6f23aeb72ae88af77417b708a70e33838e966366e719912202ef259d1ec7f1ab"} Mar 20 16:29:35 crc kubenswrapper[4779]: I0320 16:29:35.610431 4779 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_7a5d6df43e82bcd9f777035cdd0143ddec736ae8e19d633ac86bc7d1d1wfq6w_d0f47857-a2c3-4fe2-9a59-2b178811fda5/util/0.log" Mar 20 16:29:35 crc kubenswrapper[4779]: I0320 16:29:35.636897 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7a5d6df43e82bcd9f777035cdd0143ddec736ae8e19d633ac86bc7d1d1wfq6w_d0f47857-a2c3-4fe2-9a59-2b178811fda5/pull/0.log" Mar 20 16:29:35 crc kubenswrapper[4779]: I0320 16:29:35.693418 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7a5d6df43e82bcd9f777035cdd0143ddec736ae8e19d633ac86bc7d1d1wfq6w_d0f47857-a2c3-4fe2-9a59-2b178811fda5/extract/0.log" Mar 20 16:29:35 crc kubenswrapper[4779]: I0320 16:29:35.917310 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-t6fr6_c3e44c59-8e80-4e8c-956f-10b8091f819f/manager/0.log" Mar 20 16:29:36 crc kubenswrapper[4779]: I0320 16:29:36.167943 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-5tdlb_069366bd-aca3-43b9-9335-7d98fb20d4b7/manager/0.log" Mar 20 16:29:36 crc kubenswrapper[4779]: I0320 16:29:36.437347 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-pt5hw_ddee98c8-526f-459c-a9eb-ea96bc062ff5/manager/0.log" Mar 20 16:29:36 crc kubenswrapper[4779]: I0320 16:29:36.485848 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-qgxj7_679a5132-11e9-477f-954d-1eb244f67d9c/manager/0.log" Mar 20 16:29:36 crc kubenswrapper[4779]: I0320 16:29:36.515401 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-llwh6" 
event={"ID":"3399eac0-7177-46f8-9054-67a3c8debbf7","Type":"ContainerStarted","Data":"c6fa973c742723216030fb7f31d1c50ad523d1767b59ca6a63b770d4b64e7215"} Mar 20 16:29:36 crc kubenswrapper[4779]: I0320 16:29:36.532617 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-llwh6" podStartSLOduration=2.745514109 podStartE2EDuration="13.532598557s" podCreationTimestamp="2026-03-20 16:29:23 +0000 UTC" firstStartedPulling="2026-03-20 16:29:25.411047893 +0000 UTC m=+3982.373563693" lastFinishedPulling="2026-03-20 16:29:36.198132341 +0000 UTC m=+3993.160648141" observedRunningTime="2026-03-20 16:29:36.530843574 +0000 UTC m=+3993.493359374" watchObservedRunningTime="2026-03-20 16:29:36.532598557 +0000 UTC m=+3993.495114357" Mar 20 16:29:36 crc kubenswrapper[4779]: I0320 16:29:36.847168 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-c8w7p_3b1813e8-5330-4de2-ad05-1f56fcc1cfac/manager/0.log" Mar 20 16:29:38 crc kubenswrapper[4779]: I0320 16:29:37.227963 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-t5d5w_3fd922a6-0fbd-4458-b83c-b599e2988c7a/manager/0.log" Mar 20 16:29:38 crc kubenswrapper[4779]: I0320 16:29:38.257513 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-qck9t_d61cd2c5-5321-418f-b506-a14210a24e95/manager/0.log" Mar 20 16:29:38 crc kubenswrapper[4779]: I0320 16:29:38.486243 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-n5x2d_edf58803-5b67-47e2-a6f1-4998820acc34/manager/0.log" Mar 20 16:29:38 crc kubenswrapper[4779]: I0320 16:29:38.576861 4779 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-gqx5g_ccef47f8-908b-4765-abf1-d218024c98bf/manager/0.log" Mar 20 16:29:38 crc kubenswrapper[4779]: I0320 16:29:38.669919 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-ggsnk_3ca155df-e0dd-42b3-92bf-98a7e8037f02/manager/0.log" Mar 20 16:29:38 crc kubenswrapper[4779]: I0320 16:29:38.909214 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-zgd77_84524c2d-5bf3-42ff-ac4e-7d5aea8c9772/manager/0.log" Mar 20 16:29:39 crc kubenswrapper[4779]: I0320 16:29:39.082951 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-8pcbl_b2a2462d-b0d7-43e7-a7ef-93a8a4e84113/manager/0.log" Mar 20 16:29:39 crc kubenswrapper[4779]: I0320 16:29:39.245252 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-cqzxh_4ee9a64d-677e-4d45-8032-f33a4e91ee2c/manager/0.log" Mar 20 16:29:39 crc kubenswrapper[4779]: I0320 16:29:39.315864 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-h5k2b_87fb9e77-cea0-478a-8cd9-c8c15d42065e/manager/0.log" Mar 20 16:29:39 crc kubenswrapper[4779]: I0320 16:29:39.521722 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-qlcwt_22109be9-adb4-4573-9075-501c52043d47/manager/0.log" Mar 20 16:29:39 crc kubenswrapper[4779]: I0320 16:29:39.702392 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-65c968fb4-5gx4d_8b904c14-dc4f-41a0-afab-990ff14e74f7/operator/0.log" Mar 20 16:29:40 crc kubenswrapper[4779]: I0320 16:29:40.042494 4779 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-2nnq6_cf023fa6-f2a8-411b-820c-6b3822b8ce68/registry-server/0.log" Mar 20 16:29:40 crc kubenswrapper[4779]: I0320 16:29:40.324691 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-qbcd4_720d6c6c-aa5e-4fb3-b7a5-148224dff316/manager/0.log" Mar 20 16:29:40 crc kubenswrapper[4779]: I0320 16:29:40.421669 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-xg8mw_67a92119-db58-49a6-a7da-c63c135d6956/manager/0.log" Mar 20 16:29:40 crc kubenswrapper[4779]: I0320 16:29:40.725715 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-mq9mf_7de4ed63-6a6f-420f-a769-eeafd4a87eef/operator/0.log" Mar 20 16:29:40 crc kubenswrapper[4779]: I0320 16:29:40.956941 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-pm9mf_6d5f5fdf-8757-47c9-9c64-376367ba2bfb/manager/0.log" Mar 20 16:29:41 crc kubenswrapper[4779]: I0320 16:29:41.107701 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6b5784dbfc-h2nhw_39438f44-0aea-412c-84e3-4d013dadd573/manager/0.log" Mar 20 16:29:41 crc kubenswrapper[4779]: I0320 16:29:41.206304 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-xtbhm_54f4f092-31d9-47b3-a791-b84c228f4024/manager/0.log" Mar 20 16:29:41 crc kubenswrapper[4779]: I0320 16:29:41.309082 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-7vzrl_a72e6f4d-3661-4b7e-92dc-7443c65304e6/manager/0.log" Mar 20 16:29:41 crc kubenswrapper[4779]: I0320 16:29:41.471668 4779 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-78b4b86d76-x75t6_f5288943-ed82-43b2-ab12-bab71febb2d9/manager/0.log" Mar 20 16:29:43 crc kubenswrapper[4779]: I0320 16:29:43.815947 4779 scope.go:117] "RemoveContainer" containerID="0d4a4381088e0c6e4439f94508ac137582cc59c538515a9138610168fa6ff57d" Mar 20 16:29:43 crc kubenswrapper[4779]: E0320 16:29:43.817789 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:29:43 crc kubenswrapper[4779]: I0320 16:29:43.855528 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-llwh6" Mar 20 16:29:43 crc kubenswrapper[4779]: I0320 16:29:43.855770 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-llwh6" Mar 20 16:29:44 crc kubenswrapper[4779]: I0320 16:29:44.906989 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-llwh6" podUID="3399eac0-7177-46f8-9054-67a3c8debbf7" containerName="registry-server" probeResult="failure" output=< Mar 20 16:29:44 crc kubenswrapper[4779]: timeout: failed to connect service ":50051" within 1s Mar 20 16:29:44 crc kubenswrapper[4779]: > Mar 20 16:29:54 crc kubenswrapper[4779]: I0320 16:29:54.899854 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-llwh6" podUID="3399eac0-7177-46f8-9054-67a3c8debbf7" containerName="registry-server" probeResult="failure" output=< Mar 20 16:29:54 crc kubenswrapper[4779]: timeout: failed to connect service ":50051" 
within 1s Mar 20 16:29:54 crc kubenswrapper[4779]: > Mar 20 16:29:55 crc kubenswrapper[4779]: I0320 16:29:55.808631 4779 scope.go:117] "RemoveContainer" containerID="0d4a4381088e0c6e4439f94508ac137582cc59c538515a9138610168fa6ff57d" Mar 20 16:29:55 crc kubenswrapper[4779]: E0320 16:29:55.809202 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:30:00 crc kubenswrapper[4779]: I0320 16:30:00.177007 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567070-hflcg"] Mar 20 16:30:00 crc kubenswrapper[4779]: I0320 16:30:00.178742 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-hflcg" Mar 20 16:30:00 crc kubenswrapper[4779]: I0320 16:30:00.183805 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 16:30:00 crc kubenswrapper[4779]: I0320 16:30:00.187172 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 16:30:00 crc kubenswrapper[4779]: I0320 16:30:00.284612 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tbp5\" (UniqueName: \"kubernetes.io/projected/7e1f7276-938d-45ba-ae75-aba2113a04ef-kube-api-access-7tbp5\") pod \"collect-profiles-29567070-hflcg\" (UID: \"7e1f7276-938d-45ba-ae75-aba2113a04ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-hflcg" Mar 20 16:30:00 crc kubenswrapper[4779]: I0320 16:30:00.284672 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e1f7276-938d-45ba-ae75-aba2113a04ef-config-volume\") pod \"collect-profiles-29567070-hflcg\" (UID: \"7e1f7276-938d-45ba-ae75-aba2113a04ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-hflcg" Mar 20 16:30:00 crc kubenswrapper[4779]: I0320 16:30:00.284758 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e1f7276-938d-45ba-ae75-aba2113a04ef-secret-volume\") pod \"collect-profiles-29567070-hflcg\" (UID: \"7e1f7276-938d-45ba-ae75-aba2113a04ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-hflcg" Mar 20 16:30:00 crc kubenswrapper[4779]: I0320 16:30:00.292655 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29567070-hflcg"] Mar 20 16:30:00 crc kubenswrapper[4779]: I0320 16:30:00.314850 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567070-qwnmr"] Mar 20 16:30:00 crc kubenswrapper[4779]: I0320 16:30:00.317174 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567070-qwnmr" Mar 20 16:30:00 crc kubenswrapper[4779]: I0320 16:30:00.320100 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-92t9d" Mar 20 16:30:00 crc kubenswrapper[4779]: I0320 16:30:00.770418 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:30:00 crc kubenswrapper[4779]: I0320 16:30:00.772074 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:30:00 crc kubenswrapper[4779]: I0320 16:30:00.783270 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tbp5\" (UniqueName: \"kubernetes.io/projected/7e1f7276-938d-45ba-ae75-aba2113a04ef-kube-api-access-7tbp5\") pod \"collect-profiles-29567070-hflcg\" (UID: \"7e1f7276-938d-45ba-ae75-aba2113a04ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-hflcg" Mar 20 16:30:00 crc kubenswrapper[4779]: I0320 16:30:00.784776 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e1f7276-938d-45ba-ae75-aba2113a04ef-config-volume\") pod \"collect-profiles-29567070-hflcg\" (UID: \"7e1f7276-938d-45ba-ae75-aba2113a04ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-hflcg" Mar 20 16:30:00 crc kubenswrapper[4779]: I0320 16:30:00.785023 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" 
(UniqueName: \"kubernetes.io/secret/7e1f7276-938d-45ba-ae75-aba2113a04ef-secret-volume\") pod \"collect-profiles-29567070-hflcg\" (UID: \"7e1f7276-938d-45ba-ae75-aba2113a04ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-hflcg" Mar 20 16:30:00 crc kubenswrapper[4779]: I0320 16:30:00.786640 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr5sd\" (UniqueName: \"kubernetes.io/projected/e3b97042-5c7d-47fe-a06f-83ed3bd32049-kube-api-access-xr5sd\") pod \"auto-csr-approver-29567070-qwnmr\" (UID: \"e3b97042-5c7d-47fe-a06f-83ed3bd32049\") " pod="openshift-infra/auto-csr-approver-29567070-qwnmr" Mar 20 16:30:00 crc kubenswrapper[4779]: I0320 16:30:00.786693 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567070-qwnmr"] Mar 20 16:30:00 crc kubenswrapper[4779]: I0320 16:30:00.788255 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e1f7276-938d-45ba-ae75-aba2113a04ef-config-volume\") pod \"collect-profiles-29567070-hflcg\" (UID: \"7e1f7276-938d-45ba-ae75-aba2113a04ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-hflcg" Mar 20 16:30:00 crc kubenswrapper[4779]: I0320 16:30:00.815247 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e1f7276-938d-45ba-ae75-aba2113a04ef-secret-volume\") pod \"collect-profiles-29567070-hflcg\" (UID: \"7e1f7276-938d-45ba-ae75-aba2113a04ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-hflcg" Mar 20 16:30:00 crc kubenswrapper[4779]: I0320 16:30:00.847663 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tbp5\" (UniqueName: \"kubernetes.io/projected/7e1f7276-938d-45ba-ae75-aba2113a04ef-kube-api-access-7tbp5\") pod \"collect-profiles-29567070-hflcg\" (UID: 
\"7e1f7276-938d-45ba-ae75-aba2113a04ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-hflcg" Mar 20 16:30:00 crc kubenswrapper[4779]: I0320 16:30:00.888788 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr5sd\" (UniqueName: \"kubernetes.io/projected/e3b97042-5c7d-47fe-a06f-83ed3bd32049-kube-api-access-xr5sd\") pod \"auto-csr-approver-29567070-qwnmr\" (UID: \"e3b97042-5c7d-47fe-a06f-83ed3bd32049\") " pod="openshift-infra/auto-csr-approver-29567070-qwnmr" Mar 20 16:30:00 crc kubenswrapper[4779]: I0320 16:30:00.910151 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr5sd\" (UniqueName: \"kubernetes.io/projected/e3b97042-5c7d-47fe-a06f-83ed3bd32049-kube-api-access-xr5sd\") pod \"auto-csr-approver-29567070-qwnmr\" (UID: \"e3b97042-5c7d-47fe-a06f-83ed3bd32049\") " pod="openshift-infra/auto-csr-approver-29567070-qwnmr" Mar 20 16:30:01 crc kubenswrapper[4779]: I0320 16:30:01.106605 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-hflcg" Mar 20 16:30:01 crc kubenswrapper[4779]: I0320 16:30:01.208453 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567070-qwnmr" Mar 20 16:30:01 crc kubenswrapper[4779]: I0320 16:30:01.628393 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567070-hflcg"] Mar 20 16:30:01 crc kubenswrapper[4779]: W0320 16:30:01.637637 4779 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e1f7276_938d_45ba_ae75_aba2113a04ef.slice/crio-547896bc51d9924187a67f77bf92d713d13036155a06c619f6f713ab5b4b7ba0 WatchSource:0}: Error finding container 547896bc51d9924187a67f77bf92d713d13036155a06c619f6f713ab5b4b7ba0: Status 404 returned error can't find the container with id 547896bc51d9924187a67f77bf92d713d13036155a06c619f6f713ab5b4b7ba0 Mar 20 16:30:01 crc kubenswrapper[4779]: I0320 16:30:01.766837 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567070-qwnmr"] Mar 20 16:30:01 crc kubenswrapper[4779]: I0320 16:30:01.847309 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567070-qwnmr" event={"ID":"e3b97042-5c7d-47fe-a06f-83ed3bd32049","Type":"ContainerStarted","Data":"b37ac0dd8b29718ae3bcf09bf1a732459b70b20feaba7d4f71c6af2a33fa95e6"} Mar 20 16:30:01 crc kubenswrapper[4779]: I0320 16:30:01.849272 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-hflcg" event={"ID":"7e1f7276-938d-45ba-ae75-aba2113a04ef","Type":"ContainerStarted","Data":"a8dc99b18b6bce80f39f06e90d79d707cf16a787fd036f863ce57b0f8b810b43"} Mar 20 16:30:01 crc kubenswrapper[4779]: I0320 16:30:01.849317 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-hflcg" 
event={"ID":"7e1f7276-938d-45ba-ae75-aba2113a04ef","Type":"ContainerStarted","Data":"547896bc51d9924187a67f77bf92d713d13036155a06c619f6f713ab5b4b7ba0"} Mar 20 16:30:01 crc kubenswrapper[4779]: I0320 16:30:01.870021 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-hflcg" podStartSLOduration=1.870002916 podStartE2EDuration="1.870002916s" podCreationTimestamp="2026-03-20 16:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:30:01.863390964 +0000 UTC m=+4018.825906764" watchObservedRunningTime="2026-03-20 16:30:01.870002916 +0000 UTC m=+4018.832518716" Mar 20 16:30:02 crc kubenswrapper[4779]: I0320 16:30:02.859577 4779 generic.go:334] "Generic (PLEG): container finished" podID="7e1f7276-938d-45ba-ae75-aba2113a04ef" containerID="a8dc99b18b6bce80f39f06e90d79d707cf16a787fd036f863ce57b0f8b810b43" exitCode=0 Mar 20 16:30:02 crc kubenswrapper[4779]: I0320 16:30:02.859631 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-hflcg" event={"ID":"7e1f7276-938d-45ba-ae75-aba2113a04ef","Type":"ContainerDied","Data":"a8dc99b18b6bce80f39f06e90d79d707cf16a787fd036f863ce57b0f8b810b43"} Mar 20 16:30:04 crc kubenswrapper[4779]: I0320 16:30:04.313360 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-hflcg" Mar 20 16:30:04 crc kubenswrapper[4779]: I0320 16:30:04.356213 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tbp5\" (UniqueName: \"kubernetes.io/projected/7e1f7276-938d-45ba-ae75-aba2113a04ef-kube-api-access-7tbp5\") pod \"7e1f7276-938d-45ba-ae75-aba2113a04ef\" (UID: \"7e1f7276-938d-45ba-ae75-aba2113a04ef\") " Mar 20 16:30:04 crc kubenswrapper[4779]: I0320 16:30:04.356337 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e1f7276-938d-45ba-ae75-aba2113a04ef-secret-volume\") pod \"7e1f7276-938d-45ba-ae75-aba2113a04ef\" (UID: \"7e1f7276-938d-45ba-ae75-aba2113a04ef\") " Mar 20 16:30:04 crc kubenswrapper[4779]: I0320 16:30:04.356465 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e1f7276-938d-45ba-ae75-aba2113a04ef-config-volume\") pod \"7e1f7276-938d-45ba-ae75-aba2113a04ef\" (UID: \"7e1f7276-938d-45ba-ae75-aba2113a04ef\") " Mar 20 16:30:04 crc kubenswrapper[4779]: I0320 16:30:04.357199 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e1f7276-938d-45ba-ae75-aba2113a04ef-config-volume" (OuterVolumeSpecName: "config-volume") pod "7e1f7276-938d-45ba-ae75-aba2113a04ef" (UID: "7e1f7276-938d-45ba-ae75-aba2113a04ef"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:30:04 crc kubenswrapper[4779]: I0320 16:30:04.364780 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e1f7276-938d-45ba-ae75-aba2113a04ef-kube-api-access-7tbp5" (OuterVolumeSpecName: "kube-api-access-7tbp5") pod "7e1f7276-938d-45ba-ae75-aba2113a04ef" (UID: "7e1f7276-938d-45ba-ae75-aba2113a04ef"). 
InnerVolumeSpecName "kube-api-access-7tbp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:30:04 crc kubenswrapper[4779]: I0320 16:30:04.364918 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e1f7276-938d-45ba-ae75-aba2113a04ef-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7e1f7276-938d-45ba-ae75-aba2113a04ef" (UID: "7e1f7276-938d-45ba-ae75-aba2113a04ef"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:30:04 crc kubenswrapper[4779]: I0320 16:30:04.458309 4779 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e1f7276-938d-45ba-ae75-aba2113a04ef-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 16:30:04 crc kubenswrapper[4779]: I0320 16:30:04.458343 4779 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e1f7276-938d-45ba-ae75-aba2113a04ef-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 16:30:04 crc kubenswrapper[4779]: I0320 16:30:04.458353 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tbp5\" (UniqueName: \"kubernetes.io/projected/7e1f7276-938d-45ba-ae75-aba2113a04ef-kube-api-access-7tbp5\") on node \"crc\" DevicePath \"\"" Mar 20 16:30:04 crc kubenswrapper[4779]: I0320 16:30:04.703985 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567025-g6d8s"] Mar 20 16:30:04 crc kubenswrapper[4779]: I0320 16:30:04.712426 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567025-g6d8s"] Mar 20 16:30:04 crc kubenswrapper[4779]: I0320 16:30:04.880592 4779 generic.go:334] "Generic (PLEG): container finished" podID="e3b97042-5c7d-47fe-a06f-83ed3bd32049" containerID="6cbbf534388b2499e5be8799926a6c0dc9db2d3f53e1b0db1da60e1c3bf471d7" exitCode=0 Mar 20 
16:30:04 crc kubenswrapper[4779]: I0320 16:30:04.880678 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567070-qwnmr" event={"ID":"e3b97042-5c7d-47fe-a06f-83ed3bd32049","Type":"ContainerDied","Data":"6cbbf534388b2499e5be8799926a6c0dc9db2d3f53e1b0db1da60e1c3bf471d7"} Mar 20 16:30:04 crc kubenswrapper[4779]: I0320 16:30:04.882425 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-hflcg" event={"ID":"7e1f7276-938d-45ba-ae75-aba2113a04ef","Type":"ContainerDied","Data":"547896bc51d9924187a67f77bf92d713d13036155a06c619f6f713ab5b4b7ba0"} Mar 20 16:30:04 crc kubenswrapper[4779]: I0320 16:30:04.882456 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="547896bc51d9924187a67f77bf92d713d13036155a06c619f6f713ab5b4b7ba0" Mar 20 16:30:04 crc kubenswrapper[4779]: I0320 16:30:04.882482 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-hflcg" Mar 20 16:30:04 crc kubenswrapper[4779]: I0320 16:30:04.912842 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-llwh6" podUID="3399eac0-7177-46f8-9054-67a3c8debbf7" containerName="registry-server" probeResult="failure" output=< Mar 20 16:30:04 crc kubenswrapper[4779]: timeout: failed to connect service ":50051" within 1s Mar 20 16:30:04 crc kubenswrapper[4779]: > Mar 20 16:30:05 crc kubenswrapper[4779]: I0320 16:30:05.825410 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a850ec0-7410-472b-b4da-19f990f2187f" path="/var/lib/kubelet/pods/1a850ec0-7410-472b-b4da-19f990f2187f/volumes" Mar 20 16:30:06 crc kubenswrapper[4779]: I0320 16:30:06.066751 4779 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-cwvxv_849b44ce-b317-4a93-a453-4f36844fbff8/control-plane-machine-set-operator/0.log" Mar 20 16:30:06 crc kubenswrapper[4779]: I0320 16:30:06.109605 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-6mhgz_3987175b-df07-4970-9c38-fd7cc25a2586/kube-rbac-proxy/0.log" Mar 20 16:30:06 crc kubenswrapper[4779]: I0320 16:30:06.327469 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-6mhgz_3987175b-df07-4970-9c38-fd7cc25a2586/machine-api-operator/0.log" Mar 20 16:30:06 crc kubenswrapper[4779]: I0320 16:30:06.359226 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567070-qwnmr" Mar 20 16:30:06 crc kubenswrapper[4779]: I0320 16:30:06.506018 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xr5sd\" (UniqueName: \"kubernetes.io/projected/e3b97042-5c7d-47fe-a06f-83ed3bd32049-kube-api-access-xr5sd\") pod \"e3b97042-5c7d-47fe-a06f-83ed3bd32049\" (UID: \"e3b97042-5c7d-47fe-a06f-83ed3bd32049\") " Mar 20 16:30:06 crc kubenswrapper[4779]: I0320 16:30:06.513093 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3b97042-5c7d-47fe-a06f-83ed3bd32049-kube-api-access-xr5sd" (OuterVolumeSpecName: "kube-api-access-xr5sd") pod "e3b97042-5c7d-47fe-a06f-83ed3bd32049" (UID: "e3b97042-5c7d-47fe-a06f-83ed3bd32049"). InnerVolumeSpecName "kube-api-access-xr5sd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:30:06 crc kubenswrapper[4779]: I0320 16:30:06.608533 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xr5sd\" (UniqueName: \"kubernetes.io/projected/e3b97042-5c7d-47fe-a06f-83ed3bd32049-kube-api-access-xr5sd\") on node \"crc\" DevicePath \"\"" Mar 20 16:30:06 crc kubenswrapper[4779]: I0320 16:30:06.901276 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567070-qwnmr" event={"ID":"e3b97042-5c7d-47fe-a06f-83ed3bd32049","Type":"ContainerDied","Data":"b37ac0dd8b29718ae3bcf09bf1a732459b70b20feaba7d4f71c6af2a33fa95e6"} Mar 20 16:30:06 crc kubenswrapper[4779]: I0320 16:30:06.901553 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b37ac0dd8b29718ae3bcf09bf1a732459b70b20feaba7d4f71c6af2a33fa95e6" Mar 20 16:30:06 crc kubenswrapper[4779]: I0320 16:30:06.901351 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567070-qwnmr" Mar 20 16:30:07 crc kubenswrapper[4779]: I0320 16:30:07.414296 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567064-8775c"] Mar 20 16:30:07 crc kubenswrapper[4779]: I0320 16:30:07.423356 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567064-8775c"] Mar 20 16:30:07 crc kubenswrapper[4779]: I0320 16:30:07.830947 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00295003-7d10-44ad-a4a9-f77ad233896f" path="/var/lib/kubelet/pods/00295003-7d10-44ad-a4a9-f77ad233896f/volumes" Mar 20 16:30:10 crc kubenswrapper[4779]: I0320 16:30:10.809696 4779 scope.go:117] "RemoveContainer" containerID="0d4a4381088e0c6e4439f94508ac137582cc59c538515a9138610168fa6ff57d" Mar 20 16:30:10 crc kubenswrapper[4779]: E0320 16:30:10.810442 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:30:14 crc kubenswrapper[4779]: I0320 16:30:14.900732 4779 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-llwh6" podUID="3399eac0-7177-46f8-9054-67a3c8debbf7" containerName="registry-server" probeResult="failure" output=< Mar 20 16:30:14 crc kubenswrapper[4779]: timeout: failed to connect service ":50051" within 1s Mar 20 16:30:14 crc kubenswrapper[4779]: > Mar 20 16:30:18 crc kubenswrapper[4779]: I0320 16:30:18.335315 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-47n9f_7df1c661-b257-49ff-8a2c-faa47c38a23e/cert-manager-controller/0.log" Mar 20 16:30:18 crc kubenswrapper[4779]: I0320 16:30:18.539829 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-n9m75_1597fd09-4ed2-4ba6-95a8-25acd4b4bd4a/cert-manager-cainjector/0.log" Mar 20 16:30:18 crc kubenswrapper[4779]: I0320 16:30:18.648771 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-lzpdk_c3ecb063-b3b2-4585-935f-dbe6fd4147ac/cert-manager-webhook/0.log" Mar 20 16:30:21 crc kubenswrapper[4779]: I0320 16:30:21.809709 4779 scope.go:117] "RemoveContainer" containerID="0d4a4381088e0c6e4439f94508ac137582cc59c538515a9138610168fa6ff57d" Mar 20 16:30:21 crc kubenswrapper[4779]: E0320 16:30:21.810340 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:30:23 crc kubenswrapper[4779]: I0320 16:30:23.905548 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-llwh6" Mar 20 16:30:23 crc kubenswrapper[4779]: I0320 16:30:23.959881 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-llwh6" Mar 20 16:30:24 crc kubenswrapper[4779]: I0320 16:30:24.746580 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-llwh6"] Mar 20 16:30:25 crc kubenswrapper[4779]: I0320 16:30:25.227161 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-llwh6" podUID="3399eac0-7177-46f8-9054-67a3c8debbf7" containerName="registry-server" containerID="cri-o://c6fa973c742723216030fb7f31d1c50ad523d1767b59ca6a63b770d4b64e7215" gracePeriod=2 Mar 20 16:30:25 crc kubenswrapper[4779]: I0320 16:30:25.729970 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-llwh6" Mar 20 16:30:25 crc kubenswrapper[4779]: I0320 16:30:25.847893 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3399eac0-7177-46f8-9054-67a3c8debbf7-catalog-content\") pod \"3399eac0-7177-46f8-9054-67a3c8debbf7\" (UID: \"3399eac0-7177-46f8-9054-67a3c8debbf7\") " Mar 20 16:30:25 crc kubenswrapper[4779]: I0320 16:30:25.847936 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3399eac0-7177-46f8-9054-67a3c8debbf7-utilities\") pod \"3399eac0-7177-46f8-9054-67a3c8debbf7\" (UID: \"3399eac0-7177-46f8-9054-67a3c8debbf7\") " Mar 20 16:30:25 crc kubenswrapper[4779]: I0320 16:30:25.847975 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgkk9\" (UniqueName: \"kubernetes.io/projected/3399eac0-7177-46f8-9054-67a3c8debbf7-kube-api-access-qgkk9\") pod \"3399eac0-7177-46f8-9054-67a3c8debbf7\" (UID: \"3399eac0-7177-46f8-9054-67a3c8debbf7\") " Mar 20 16:30:25 crc kubenswrapper[4779]: I0320 16:30:25.848989 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3399eac0-7177-46f8-9054-67a3c8debbf7-utilities" (OuterVolumeSpecName: "utilities") pod "3399eac0-7177-46f8-9054-67a3c8debbf7" (UID: "3399eac0-7177-46f8-9054-67a3c8debbf7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:30:25 crc kubenswrapper[4779]: I0320 16:30:25.857448 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3399eac0-7177-46f8-9054-67a3c8debbf7-kube-api-access-qgkk9" (OuterVolumeSpecName: "kube-api-access-qgkk9") pod "3399eac0-7177-46f8-9054-67a3c8debbf7" (UID: "3399eac0-7177-46f8-9054-67a3c8debbf7"). InnerVolumeSpecName "kube-api-access-qgkk9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:30:25 crc kubenswrapper[4779]: I0320 16:30:25.950666 4779 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3399eac0-7177-46f8-9054-67a3c8debbf7-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:30:25 crc kubenswrapper[4779]: I0320 16:30:25.950813 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgkk9\" (UniqueName: \"kubernetes.io/projected/3399eac0-7177-46f8-9054-67a3c8debbf7-kube-api-access-qgkk9\") on node \"crc\" DevicePath \"\"" Mar 20 16:30:25 crc kubenswrapper[4779]: I0320 16:30:25.972798 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3399eac0-7177-46f8-9054-67a3c8debbf7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3399eac0-7177-46f8-9054-67a3c8debbf7" (UID: "3399eac0-7177-46f8-9054-67a3c8debbf7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:30:26 crc kubenswrapper[4779]: I0320 16:30:26.052816 4779 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3399eac0-7177-46f8-9054-67a3c8debbf7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:30:26 crc kubenswrapper[4779]: I0320 16:30:26.244282 4779 generic.go:334] "Generic (PLEG): container finished" podID="3399eac0-7177-46f8-9054-67a3c8debbf7" containerID="c6fa973c742723216030fb7f31d1c50ad523d1767b59ca6a63b770d4b64e7215" exitCode=0 Mar 20 16:30:26 crc kubenswrapper[4779]: I0320 16:30:26.244351 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-llwh6" event={"ID":"3399eac0-7177-46f8-9054-67a3c8debbf7","Type":"ContainerDied","Data":"c6fa973c742723216030fb7f31d1c50ad523d1767b59ca6a63b770d4b64e7215"} Mar 20 16:30:26 crc kubenswrapper[4779]: I0320 16:30:26.244380 4779 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-llwh6" event={"ID":"3399eac0-7177-46f8-9054-67a3c8debbf7","Type":"ContainerDied","Data":"17d28bb1bbbe2706a86862a97b55b697b874ba9b6096ccc114c645450c95c390"} Mar 20 16:30:26 crc kubenswrapper[4779]: I0320 16:30:26.244398 4779 scope.go:117] "RemoveContainer" containerID="c6fa973c742723216030fb7f31d1c50ad523d1767b59ca6a63b770d4b64e7215" Mar 20 16:30:26 crc kubenswrapper[4779]: I0320 16:30:26.244429 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-llwh6" Mar 20 16:30:26 crc kubenswrapper[4779]: I0320 16:30:26.279790 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-llwh6"] Mar 20 16:30:26 crc kubenswrapper[4779]: I0320 16:30:26.291653 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-llwh6"] Mar 20 16:30:26 crc kubenswrapper[4779]: I0320 16:30:26.292368 4779 scope.go:117] "RemoveContainer" containerID="6f23aeb72ae88af77417b708a70e33838e966366e719912202ef259d1ec7f1ab" Mar 20 16:30:26 crc kubenswrapper[4779]: I0320 16:30:26.316588 4779 scope.go:117] "RemoveContainer" containerID="8189182c33ca07926b17525a752448b7897f36eb8a193d11491ac1ac7732ecfe" Mar 20 16:30:26 crc kubenswrapper[4779]: I0320 16:30:26.357286 4779 scope.go:117] "RemoveContainer" containerID="c6fa973c742723216030fb7f31d1c50ad523d1767b59ca6a63b770d4b64e7215" Mar 20 16:30:26 crc kubenswrapper[4779]: E0320 16:30:26.357626 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6fa973c742723216030fb7f31d1c50ad523d1767b59ca6a63b770d4b64e7215\": container with ID starting with c6fa973c742723216030fb7f31d1c50ad523d1767b59ca6a63b770d4b64e7215 not found: ID does not exist" containerID="c6fa973c742723216030fb7f31d1c50ad523d1767b59ca6a63b770d4b64e7215" Mar 20 16:30:26 crc kubenswrapper[4779]: I0320 16:30:26.357666 4779 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6fa973c742723216030fb7f31d1c50ad523d1767b59ca6a63b770d4b64e7215"} err="failed to get container status \"c6fa973c742723216030fb7f31d1c50ad523d1767b59ca6a63b770d4b64e7215\": rpc error: code = NotFound desc = could not find container \"c6fa973c742723216030fb7f31d1c50ad523d1767b59ca6a63b770d4b64e7215\": container with ID starting with c6fa973c742723216030fb7f31d1c50ad523d1767b59ca6a63b770d4b64e7215 not found: ID does not exist" Mar 20 16:30:26 crc kubenswrapper[4779]: I0320 16:30:26.357690 4779 scope.go:117] "RemoveContainer" containerID="6f23aeb72ae88af77417b708a70e33838e966366e719912202ef259d1ec7f1ab" Mar 20 16:30:26 crc kubenswrapper[4779]: E0320 16:30:26.357919 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f23aeb72ae88af77417b708a70e33838e966366e719912202ef259d1ec7f1ab\": container with ID starting with 6f23aeb72ae88af77417b708a70e33838e966366e719912202ef259d1ec7f1ab not found: ID does not exist" containerID="6f23aeb72ae88af77417b708a70e33838e966366e719912202ef259d1ec7f1ab" Mar 20 16:30:26 crc kubenswrapper[4779]: I0320 16:30:26.357957 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f23aeb72ae88af77417b708a70e33838e966366e719912202ef259d1ec7f1ab"} err="failed to get container status \"6f23aeb72ae88af77417b708a70e33838e966366e719912202ef259d1ec7f1ab\": rpc error: code = NotFound desc = could not find container \"6f23aeb72ae88af77417b708a70e33838e966366e719912202ef259d1ec7f1ab\": container with ID starting with 6f23aeb72ae88af77417b708a70e33838e966366e719912202ef259d1ec7f1ab not found: ID does not exist" Mar 20 16:30:26 crc kubenswrapper[4779]: I0320 16:30:26.357984 4779 scope.go:117] "RemoveContainer" containerID="8189182c33ca07926b17525a752448b7897f36eb8a193d11491ac1ac7732ecfe" Mar 20 16:30:26 crc kubenswrapper[4779]: E0320 
16:30:26.358297 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8189182c33ca07926b17525a752448b7897f36eb8a193d11491ac1ac7732ecfe\": container with ID starting with 8189182c33ca07926b17525a752448b7897f36eb8a193d11491ac1ac7732ecfe not found: ID does not exist" containerID="8189182c33ca07926b17525a752448b7897f36eb8a193d11491ac1ac7732ecfe" Mar 20 16:30:26 crc kubenswrapper[4779]: I0320 16:30:26.358319 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8189182c33ca07926b17525a752448b7897f36eb8a193d11491ac1ac7732ecfe"} err="failed to get container status \"8189182c33ca07926b17525a752448b7897f36eb8a193d11491ac1ac7732ecfe\": rpc error: code = NotFound desc = could not find container \"8189182c33ca07926b17525a752448b7897f36eb8a193d11491ac1ac7732ecfe\": container with ID starting with 8189182c33ca07926b17525a752448b7897f36eb8a193d11491ac1ac7732ecfe not found: ID does not exist" Mar 20 16:30:27 crc kubenswrapper[4779]: I0320 16:30:27.820897 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3399eac0-7177-46f8-9054-67a3c8debbf7" path="/var/lib/kubelet/pods/3399eac0-7177-46f8-9054-67a3c8debbf7/volumes" Mar 20 16:30:31 crc kubenswrapper[4779]: I0320 16:30:31.505499 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-zjps6_03f98c5f-95c6-4714-a883-76c1eb8ff7f1/nmstate-console-plugin/0.log" Mar 20 16:30:31 crc kubenswrapper[4779]: I0320 16:30:31.695084 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-bg6f8_436fb992-2d18-438e-b2e9-489bb530c9ba/nmstate-handler/0.log" Mar 20 16:30:31 crc kubenswrapper[4779]: I0320 16:30:31.785609 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-x8nlg_35df524e-f5d4-4360-93e1-d76dde302069/kube-rbac-proxy/0.log" Mar 20 16:30:31 crc 
kubenswrapper[4779]: I0320 16:30:31.925888 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-x8nlg_35df524e-f5d4-4360-93e1-d76dde302069/nmstate-metrics/0.log" Mar 20 16:30:31 crc kubenswrapper[4779]: I0320 16:30:31.931940 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-s5ctz_504ba887-92f4-4a4f-b44c-c416944075fe/nmstate-operator/0.log" Mar 20 16:30:32 crc kubenswrapper[4779]: I0320 16:30:32.048390 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-nq7c2_f630bb2c-91d5-434f-9a06-b933086a65fc/nmstate-webhook/0.log" Mar 20 16:30:34 crc kubenswrapper[4779]: I0320 16:30:34.808408 4779 scope.go:117] "RemoveContainer" containerID="0d4a4381088e0c6e4439f94508ac137582cc59c538515a9138610168fa6ff57d" Mar 20 16:30:34 crc kubenswrapper[4779]: E0320 16:30:34.809125 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:30:45 crc kubenswrapper[4779]: I0320 16:30:45.363161 4779 scope.go:117] "RemoveContainer" containerID="7b2c3b08711e4854503ae839e1d3b2747112170ced23fc5279870d5de81a3953" Mar 20 16:30:45 crc kubenswrapper[4779]: I0320 16:30:45.394936 4779 scope.go:117] "RemoveContainer" containerID="2fea61cbcb277f267c820de5912eb2e153b9ab6348df4f067c4a083f76795624" Mar 20 16:30:45 crc kubenswrapper[4779]: I0320 16:30:45.616329 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-sqzkm_569c99c8-036a-41d3-9068-e893c1e067fb/prometheus-operator/0.log" Mar 20 16:30:45 crc 
kubenswrapper[4779]: I0320 16:30:45.794449 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-546b9f4886-f8l8t_aa1321a8-47aa-4e23-88a9-9fbc2eabe628/prometheus-operator-admission-webhook/0.log" Mar 20 16:30:45 crc kubenswrapper[4779]: I0320 16:30:45.814020 4779 scope.go:117] "RemoveContainer" containerID="0d4a4381088e0c6e4439f94508ac137582cc59c538515a9138610168fa6ff57d" Mar 20 16:30:45 crc kubenswrapper[4779]: E0320 16:30:45.814699 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" Mar 20 16:30:45 crc kubenswrapper[4779]: I0320 16:30:45.870753 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-546b9f4886-nxpvj_30bff49f-d894-4d30-84f6-b845f0f3dbe2/prometheus-operator-admission-webhook/0.log" Mar 20 16:30:46 crc kubenswrapper[4779]: I0320 16:30:46.026139 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-rv4tl_5f1f80bc-9361-4642-84e6-99e781ac2fad/operator/0.log" Mar 20 16:30:46 crc kubenswrapper[4779]: I0320 16:30:46.119203 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-b59bd5fff-kt6dg_8cf9bae4-474b-4dcd-9716-840649b33f8b/perses-operator/0.log" Mar 20 16:30:57 crc kubenswrapper[4779]: I0320 16:30:57.810726 4779 scope.go:117] "RemoveContainer" containerID="0d4a4381088e0c6e4439f94508ac137582cc59c538515a9138610168fa6ff57d" Mar 20 16:30:58 crc kubenswrapper[4779]: I0320 16:30:58.567254 4779 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" event={"ID":"451fc579-db57-4b36-a775-6d2986de3efc","Type":"ContainerStarted","Data":"bae06906235ec92fdf13286c3daf0b54ffa138e8c7e77f0062cb5ff877c3d7b6"} Mar 20 16:30:59 crc kubenswrapper[4779]: I0320 16:30:59.637302 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-d7l7m_6a069bff-512f-4d9a-931b-f558cea5f3b7/kube-rbac-proxy/0.log" Mar 20 16:30:59 crc kubenswrapper[4779]: I0320 16:30:59.783428 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-d7l7m_6a069bff-512f-4d9a-931b-f558cea5f3b7/controller/0.log" Mar 20 16:30:59 crc kubenswrapper[4779]: I0320 16:30:59.896899 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cgcmm_18e6cd6e-0782-4b72-a968-bc2e1e6d027b/cp-frr-files/0.log" Mar 20 16:31:00 crc kubenswrapper[4779]: I0320 16:31:00.055654 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cgcmm_18e6cd6e-0782-4b72-a968-bc2e1e6d027b/cp-frr-files/0.log" Mar 20 16:31:00 crc kubenswrapper[4779]: I0320 16:31:00.111809 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cgcmm_18e6cd6e-0782-4b72-a968-bc2e1e6d027b/cp-reloader/0.log" Mar 20 16:31:00 crc kubenswrapper[4779]: I0320 16:31:00.154616 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cgcmm_18e6cd6e-0782-4b72-a968-bc2e1e6d027b/cp-metrics/0.log" Mar 20 16:31:00 crc kubenswrapper[4779]: I0320 16:31:00.172157 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cgcmm_18e6cd6e-0782-4b72-a968-bc2e1e6d027b/cp-reloader/0.log" Mar 20 16:31:00 crc kubenswrapper[4779]: I0320 16:31:00.311572 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cgcmm_18e6cd6e-0782-4b72-a968-bc2e1e6d027b/cp-reloader/0.log" Mar 20 16:31:00 crc kubenswrapper[4779]: I0320 
16:31:00.326758 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cgcmm_18e6cd6e-0782-4b72-a968-bc2e1e6d027b/cp-frr-files/0.log" Mar 20 16:31:00 crc kubenswrapper[4779]: I0320 16:31:00.370397 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cgcmm_18e6cd6e-0782-4b72-a968-bc2e1e6d027b/cp-metrics/0.log" Mar 20 16:31:00 crc kubenswrapper[4779]: I0320 16:31:00.393859 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cgcmm_18e6cd6e-0782-4b72-a968-bc2e1e6d027b/cp-metrics/0.log" Mar 20 16:31:00 crc kubenswrapper[4779]: I0320 16:31:00.542491 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cgcmm_18e6cd6e-0782-4b72-a968-bc2e1e6d027b/cp-metrics/0.log" Mar 20 16:31:00 crc kubenswrapper[4779]: I0320 16:31:00.555019 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cgcmm_18e6cd6e-0782-4b72-a968-bc2e1e6d027b/cp-frr-files/0.log" Mar 20 16:31:00 crc kubenswrapper[4779]: I0320 16:31:00.559418 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cgcmm_18e6cd6e-0782-4b72-a968-bc2e1e6d027b/cp-reloader/0.log" Mar 20 16:31:00 crc kubenswrapper[4779]: I0320 16:31:00.559703 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cgcmm_18e6cd6e-0782-4b72-a968-bc2e1e6d027b/controller/0.log" Mar 20 16:31:00 crc kubenswrapper[4779]: I0320 16:31:00.757775 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cgcmm_18e6cd6e-0782-4b72-a968-bc2e1e6d027b/kube-rbac-proxy/0.log" Mar 20 16:31:00 crc kubenswrapper[4779]: I0320 16:31:00.762599 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cgcmm_18e6cd6e-0782-4b72-a968-bc2e1e6d027b/kube-rbac-proxy-frr/0.log" Mar 20 16:31:00 crc kubenswrapper[4779]: I0320 16:31:00.768206 4779 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-cgcmm_18e6cd6e-0782-4b72-a968-bc2e1e6d027b/frr-metrics/0.log" Mar 20 16:31:01 crc kubenswrapper[4779]: I0320 16:31:01.003001 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-7sp5x_78b75cd5-cd83-4a96-9762-95e11ae40119/frr-k8s-webhook-server/0.log" Mar 20 16:31:01 crc kubenswrapper[4779]: I0320 16:31:01.009400 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cgcmm_18e6cd6e-0782-4b72-a968-bc2e1e6d027b/reloader/0.log" Mar 20 16:31:01 crc kubenswrapper[4779]: I0320 16:31:01.667202 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-bdf54d65-pvrsc_174f21d6-ef49-44fa-934c-f9823ffcc511/manager/0.log" Mar 20 16:31:01 crc kubenswrapper[4779]: I0320 16:31:01.788160 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5d49844ddb-kx4nw_70059cf3-c8a9-4446-a88d-070f8e7bdf8d/webhook-server/0.log" Mar 20 16:31:01 crc kubenswrapper[4779]: I0320 16:31:01.891392 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8h6gl_38d17ea2-3f89-40c5-bf9d-e950220427b5/kube-rbac-proxy/0.log" Mar 20 16:31:02 crc kubenswrapper[4779]: I0320 16:31:02.394581 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cgcmm_18e6cd6e-0782-4b72-a968-bc2e1e6d027b/frr/0.log" Mar 20 16:31:02 crc kubenswrapper[4779]: I0320 16:31:02.475861 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8h6gl_38d17ea2-3f89-40c5-bf9d-e950220427b5/speaker/0.log" Mar 20 16:31:15 crc kubenswrapper[4779]: I0320 16:31:15.432708 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n_1bffdf90-d3df-4cfb-8484-a70e0c418f5c/util/0.log" Mar 20 16:31:15 crc kubenswrapper[4779]: I0320 
16:31:15.641948 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n_1bffdf90-d3df-4cfb-8484-a70e0c418f5c/pull/0.log" Mar 20 16:31:15 crc kubenswrapper[4779]: I0320 16:31:15.675785 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n_1bffdf90-d3df-4cfb-8484-a70e0c418f5c/util/0.log" Mar 20 16:31:15 crc kubenswrapper[4779]: I0320 16:31:15.720868 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n_1bffdf90-d3df-4cfb-8484-a70e0c418f5c/pull/0.log" Mar 20 16:31:15 crc kubenswrapper[4779]: I0320 16:31:15.887426 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n_1bffdf90-d3df-4cfb-8484-a70e0c418f5c/util/0.log" Mar 20 16:31:15 crc kubenswrapper[4779]: I0320 16:31:15.911195 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n_1bffdf90-d3df-4cfb-8484-a70e0c418f5c/extract/0.log" Mar 20 16:31:15 crc kubenswrapper[4779]: I0320 16:31:15.951267 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874nm79n_1bffdf90-d3df-4cfb-8484-a70e0c418f5c/pull/0.log" Mar 20 16:31:16 crc kubenswrapper[4779]: I0320 16:31:16.064096 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk_fae6f408-d45b-402b-9956-d903a90a49ac/util/0.log" Mar 20 16:31:16 crc kubenswrapper[4779]: I0320 16:31:16.235542 4779 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk_fae6f408-d45b-402b-9956-d903a90a49ac/pull/0.log" Mar 20 16:31:16 crc kubenswrapper[4779]: I0320 16:31:16.245415 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk_fae6f408-d45b-402b-9956-d903a90a49ac/util/0.log" Mar 20 16:31:16 crc kubenswrapper[4779]: I0320 16:31:16.297243 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk_fae6f408-d45b-402b-9956-d903a90a49ac/pull/0.log" Mar 20 16:31:16 crc kubenswrapper[4779]: I0320 16:31:16.582282 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk_fae6f408-d45b-402b-9956-d903a90a49ac/extract/0.log" Mar 20 16:31:16 crc kubenswrapper[4779]: I0320 16:31:16.655887 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk_fae6f408-d45b-402b-9956-d903a90a49ac/util/0.log" Mar 20 16:31:16 crc kubenswrapper[4779]: I0320 16:31:16.738023 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bcmvk_fae6f408-d45b-402b-9956-d903a90a49ac/pull/0.log" Mar 20 16:31:16 crc kubenswrapper[4779]: I0320 16:31:16.846053 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726m2pc5_11a586de-ae83-4a62-92c7-51c09b869b36/util/0.log" Mar 20 16:31:16 crc kubenswrapper[4779]: I0320 16:31:16.998326 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726m2pc5_11a586de-ae83-4a62-92c7-51c09b869b36/pull/0.log" Mar 20 
16:31:17 crc kubenswrapper[4779]: I0320 16:31:17.036304 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726m2pc5_11a586de-ae83-4a62-92c7-51c09b869b36/pull/0.log" Mar 20 16:31:17 crc kubenswrapper[4779]: I0320 16:31:17.036683 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726m2pc5_11a586de-ae83-4a62-92c7-51c09b869b36/util/0.log" Mar 20 16:31:17 crc kubenswrapper[4779]: I0320 16:31:17.190401 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726m2pc5_11a586de-ae83-4a62-92c7-51c09b869b36/extract/0.log" Mar 20 16:31:17 crc kubenswrapper[4779]: I0320 16:31:17.220260 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726m2pc5_11a586de-ae83-4a62-92c7-51c09b869b36/util/0.log" Mar 20 16:31:17 crc kubenswrapper[4779]: I0320 16:31:17.223616 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726m2pc5_11a586de-ae83-4a62-92c7-51c09b869b36/pull/0.log" Mar 20 16:31:17 crc kubenswrapper[4779]: I0320 16:31:17.385346 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cj6mh_ecd35186-bfd4-47da-b12b-8073d45e78f7/extract-utilities/0.log" Mar 20 16:31:17 crc kubenswrapper[4779]: I0320 16:31:17.572373 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cj6mh_ecd35186-bfd4-47da-b12b-8073d45e78f7/extract-utilities/0.log" Mar 20 16:31:17 crc kubenswrapper[4779]: I0320 16:31:17.573321 4779 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-cj6mh_ecd35186-bfd4-47da-b12b-8073d45e78f7/extract-content/0.log" Mar 20 16:31:17 crc kubenswrapper[4779]: I0320 16:31:17.580448 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cj6mh_ecd35186-bfd4-47da-b12b-8073d45e78f7/extract-content/0.log" Mar 20 16:31:17 crc kubenswrapper[4779]: I0320 16:31:17.768835 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cj6mh_ecd35186-bfd4-47da-b12b-8073d45e78f7/extract-utilities/0.log" Mar 20 16:31:17 crc kubenswrapper[4779]: I0320 16:31:17.803228 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cj6mh_ecd35186-bfd4-47da-b12b-8073d45e78f7/extract-content/0.log" Mar 20 16:31:17 crc kubenswrapper[4779]: I0320 16:31:17.995033 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-df7rk_9d3b26c1-8c12-4764-a415-0e7db617947b/extract-utilities/0.log" Mar 20 16:31:18 crc kubenswrapper[4779]: I0320 16:31:18.307695 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-df7rk_9d3b26c1-8c12-4764-a415-0e7db617947b/extract-content/0.log" Mar 20 16:31:18 crc kubenswrapper[4779]: I0320 16:31:18.321362 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-df7rk_9d3b26c1-8c12-4764-a415-0e7db617947b/extract-content/0.log" Mar 20 16:31:18 crc kubenswrapper[4779]: I0320 16:31:18.328290 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-df7rk_9d3b26c1-8c12-4764-a415-0e7db617947b/extract-utilities/0.log" Mar 20 16:31:18 crc kubenswrapper[4779]: I0320 16:31:18.332092 4779 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-cj6mh_ecd35186-bfd4-47da-b12b-8073d45e78f7/registry-server/0.log" Mar 20 16:31:18 crc kubenswrapper[4779]: I0320 16:31:18.459136 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-df7rk_9d3b26c1-8c12-4764-a415-0e7db617947b/extract-utilities/0.log" Mar 20 16:31:18 crc kubenswrapper[4779]: I0320 16:31:18.493944 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-df7rk_9d3b26c1-8c12-4764-a415-0e7db617947b/extract-content/0.log" Mar 20 16:31:18 crc kubenswrapper[4779]: I0320 16:31:18.731432 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-9wb77_7231ee4d-4e95-48bd-8f2a-be66eccaf6a1/marketplace-operator/0.log" Mar 20 16:31:18 crc kubenswrapper[4779]: I0320 16:31:18.794998 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m8pg5_11d8d242-c472-4367-9462-1d55253e19b0/extract-utilities/0.log" Mar 20 16:31:19 crc kubenswrapper[4779]: I0320 16:31:19.120978 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m8pg5_11d8d242-c472-4367-9462-1d55253e19b0/extract-content/0.log" Mar 20 16:31:19 crc kubenswrapper[4779]: I0320 16:31:19.126324 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m8pg5_11d8d242-c472-4367-9462-1d55253e19b0/extract-utilities/0.log" Mar 20 16:31:19 crc kubenswrapper[4779]: I0320 16:31:19.202554 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m8pg5_11d8d242-c472-4367-9462-1d55253e19b0/extract-content/0.log" Mar 20 16:31:19 crc kubenswrapper[4779]: I0320 16:31:19.202756 4779 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-df7rk_9d3b26c1-8c12-4764-a415-0e7db617947b/registry-server/0.log" Mar 20 16:31:19 crc kubenswrapper[4779]: I0320 16:31:19.370324 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m8pg5_11d8d242-c472-4367-9462-1d55253e19b0/extract-content/0.log" Mar 20 16:31:19 crc kubenswrapper[4779]: I0320 16:31:19.436981 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m8pg5_11d8d242-c472-4367-9462-1d55253e19b0/extract-utilities/0.log" Mar 20 16:31:19 crc kubenswrapper[4779]: I0320 16:31:19.588818 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m8pg5_11d8d242-c472-4367-9462-1d55253e19b0/registry-server/0.log" Mar 20 16:31:19 crc kubenswrapper[4779]: I0320 16:31:19.621304 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dmv6d_41c24f7d-d859-4eb5-b775-5893df11cf84/extract-utilities/0.log" Mar 20 16:31:20 crc kubenswrapper[4779]: I0320 16:31:20.248355 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dmv6d_41c24f7d-d859-4eb5-b775-5893df11cf84/extract-content/0.log" Mar 20 16:31:20 crc kubenswrapper[4779]: I0320 16:31:20.267801 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dmv6d_41c24f7d-d859-4eb5-b775-5893df11cf84/extract-utilities/0.log" Mar 20 16:31:20 crc kubenswrapper[4779]: I0320 16:31:20.417652 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dmv6d_41c24f7d-d859-4eb5-b775-5893df11cf84/extract-content/0.log" Mar 20 16:31:20 crc kubenswrapper[4779]: I0320 16:31:20.610317 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dmv6d_41c24f7d-d859-4eb5-b775-5893df11cf84/extract-utilities/0.log" 
Mar 20 16:31:20 crc kubenswrapper[4779]: I0320 16:31:20.653780 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dmv6d_41c24f7d-d859-4eb5-b775-5893df11cf84/extract-content/0.log" Mar 20 16:31:21 crc kubenswrapper[4779]: I0320 16:31:21.234055 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dmv6d_41c24f7d-d859-4eb5-b775-5893df11cf84/registry-server/0.log" Mar 20 16:31:33 crc kubenswrapper[4779]: I0320 16:31:33.998159 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-sqzkm_569c99c8-036a-41d3-9068-e893c1e067fb/prometheus-operator/0.log" Mar 20 16:31:34 crc kubenswrapper[4779]: I0320 16:31:34.022336 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-546b9f4886-f8l8t_aa1321a8-47aa-4e23-88a9-9fbc2eabe628/prometheus-operator-admission-webhook/0.log" Mar 20 16:31:34 crc kubenswrapper[4779]: I0320 16:31:34.050921 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-546b9f4886-nxpvj_30bff49f-d894-4d30-84f6-b845f0f3dbe2/prometheus-operator-admission-webhook/0.log" Mar 20 16:31:34 crc kubenswrapper[4779]: I0320 16:31:34.210977 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-rv4tl_5f1f80bc-9361-4642-84e6-99e781ac2fad/operator/0.log" Mar 20 16:31:34 crc kubenswrapper[4779]: I0320 16:31:34.224585 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-b59bd5fff-kt6dg_8cf9bae4-474b-4dcd-9716-840649b33f8b/perses-operator/0.log" Mar 20 16:31:46 crc kubenswrapper[4779]: E0320 16:31:46.795071 4779 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.219:33966->38.102.83.219:46679: write tcp 
38.102.83.219:33966->38.102.83.219:46679: write: broken pipe Mar 20 16:32:00 crc kubenswrapper[4779]: I0320 16:32:00.147734 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567072-j6vqr"] Mar 20 16:32:00 crc kubenswrapper[4779]: E0320 16:32:00.148707 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3399eac0-7177-46f8-9054-67a3c8debbf7" containerName="registry-server" Mar 20 16:32:00 crc kubenswrapper[4779]: I0320 16:32:00.148724 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="3399eac0-7177-46f8-9054-67a3c8debbf7" containerName="registry-server" Mar 20 16:32:00 crc kubenswrapper[4779]: E0320 16:32:00.148748 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b97042-5c7d-47fe-a06f-83ed3bd32049" containerName="oc" Mar 20 16:32:00 crc kubenswrapper[4779]: I0320 16:32:00.148755 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b97042-5c7d-47fe-a06f-83ed3bd32049" containerName="oc" Mar 20 16:32:00 crc kubenswrapper[4779]: E0320 16:32:00.148770 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3399eac0-7177-46f8-9054-67a3c8debbf7" containerName="extract-utilities" Mar 20 16:32:00 crc kubenswrapper[4779]: I0320 16:32:00.148777 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="3399eac0-7177-46f8-9054-67a3c8debbf7" containerName="extract-utilities" Mar 20 16:32:00 crc kubenswrapper[4779]: E0320 16:32:00.148805 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e1f7276-938d-45ba-ae75-aba2113a04ef" containerName="collect-profiles" Mar 20 16:32:00 crc kubenswrapper[4779]: I0320 16:32:00.148812 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e1f7276-938d-45ba-ae75-aba2113a04ef" containerName="collect-profiles" Mar 20 16:32:00 crc kubenswrapper[4779]: E0320 16:32:00.148869 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3399eac0-7177-46f8-9054-67a3c8debbf7" 
containerName="extract-content" Mar 20 16:32:00 crc kubenswrapper[4779]: I0320 16:32:00.148878 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="3399eac0-7177-46f8-9054-67a3c8debbf7" containerName="extract-content" Mar 20 16:32:00 crc kubenswrapper[4779]: I0320 16:32:00.149089 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3b97042-5c7d-47fe-a06f-83ed3bd32049" containerName="oc" Mar 20 16:32:00 crc kubenswrapper[4779]: I0320 16:32:00.149147 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e1f7276-938d-45ba-ae75-aba2113a04ef" containerName="collect-profiles" Mar 20 16:32:00 crc kubenswrapper[4779]: I0320 16:32:00.149164 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="3399eac0-7177-46f8-9054-67a3c8debbf7" containerName="registry-server" Mar 20 16:32:00 crc kubenswrapper[4779]: I0320 16:32:00.149973 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567072-j6vqr" Mar 20 16:32:00 crc kubenswrapper[4779]: I0320 16:32:00.152487 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:32:00 crc kubenswrapper[4779]: I0320 16:32:00.152497 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:32:00 crc kubenswrapper[4779]: I0320 16:32:00.156593 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-92t9d" Mar 20 16:32:00 crc kubenswrapper[4779]: I0320 16:32:00.156848 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567072-j6vqr"] Mar 20 16:32:00 crc kubenswrapper[4779]: I0320 16:32:00.229167 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sr2q\" (UniqueName: 
\"kubernetes.io/projected/dcb1fa3f-46e3-4dc9-82db-85ea92b968a7-kube-api-access-4sr2q\") pod \"auto-csr-approver-29567072-j6vqr\" (UID: \"dcb1fa3f-46e3-4dc9-82db-85ea92b968a7\") " pod="openshift-infra/auto-csr-approver-29567072-j6vqr" Mar 20 16:32:00 crc kubenswrapper[4779]: I0320 16:32:00.331542 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sr2q\" (UniqueName: \"kubernetes.io/projected/dcb1fa3f-46e3-4dc9-82db-85ea92b968a7-kube-api-access-4sr2q\") pod \"auto-csr-approver-29567072-j6vqr\" (UID: \"dcb1fa3f-46e3-4dc9-82db-85ea92b968a7\") " pod="openshift-infra/auto-csr-approver-29567072-j6vqr" Mar 20 16:32:00 crc kubenswrapper[4779]: I0320 16:32:00.725601 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sr2q\" (UniqueName: \"kubernetes.io/projected/dcb1fa3f-46e3-4dc9-82db-85ea92b968a7-kube-api-access-4sr2q\") pod \"auto-csr-approver-29567072-j6vqr\" (UID: \"dcb1fa3f-46e3-4dc9-82db-85ea92b968a7\") " pod="openshift-infra/auto-csr-approver-29567072-j6vqr" Mar 20 16:32:00 crc kubenswrapper[4779]: I0320 16:32:00.771519 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567072-j6vqr" Mar 20 16:32:01 crc kubenswrapper[4779]: I0320 16:32:01.266500 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567072-j6vqr"] Mar 20 16:32:02 crc kubenswrapper[4779]: I0320 16:32:02.085979 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567072-j6vqr" event={"ID":"dcb1fa3f-46e3-4dc9-82db-85ea92b968a7","Type":"ContainerStarted","Data":"7dfd6d6c09e0f1eff1e3d77469d1976015de8ae4716e3d20a066dc0c127b19df"} Mar 20 16:32:03 crc kubenswrapper[4779]: I0320 16:32:03.095266 4779 generic.go:334] "Generic (PLEG): container finished" podID="dcb1fa3f-46e3-4dc9-82db-85ea92b968a7" containerID="028583d34ececc028795dd2728810f010c8526f674d71e3f4fcfcee01f595d76" exitCode=0 Mar 20 16:32:03 crc kubenswrapper[4779]: I0320 16:32:03.095403 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567072-j6vqr" event={"ID":"dcb1fa3f-46e3-4dc9-82db-85ea92b968a7","Type":"ContainerDied","Data":"028583d34ececc028795dd2728810f010c8526f674d71e3f4fcfcee01f595d76"} Mar 20 16:32:04 crc kubenswrapper[4779]: I0320 16:32:04.511153 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567072-j6vqr" Mar 20 16:32:04 crc kubenswrapper[4779]: I0320 16:32:04.622320 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sr2q\" (UniqueName: \"kubernetes.io/projected/dcb1fa3f-46e3-4dc9-82db-85ea92b968a7-kube-api-access-4sr2q\") pod \"dcb1fa3f-46e3-4dc9-82db-85ea92b968a7\" (UID: \"dcb1fa3f-46e3-4dc9-82db-85ea92b968a7\") " Mar 20 16:32:04 crc kubenswrapper[4779]: I0320 16:32:04.629482 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcb1fa3f-46e3-4dc9-82db-85ea92b968a7-kube-api-access-4sr2q" (OuterVolumeSpecName: "kube-api-access-4sr2q") pod "dcb1fa3f-46e3-4dc9-82db-85ea92b968a7" (UID: "dcb1fa3f-46e3-4dc9-82db-85ea92b968a7"). InnerVolumeSpecName "kube-api-access-4sr2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:32:04 crc kubenswrapper[4779]: I0320 16:32:04.724726 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4sr2q\" (UniqueName: \"kubernetes.io/projected/dcb1fa3f-46e3-4dc9-82db-85ea92b968a7-kube-api-access-4sr2q\") on node \"crc\" DevicePath \"\"" Mar 20 16:32:05 crc kubenswrapper[4779]: I0320 16:32:05.114592 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567072-j6vqr" event={"ID":"dcb1fa3f-46e3-4dc9-82db-85ea92b968a7","Type":"ContainerDied","Data":"7dfd6d6c09e0f1eff1e3d77469d1976015de8ae4716e3d20a066dc0c127b19df"} Mar 20 16:32:05 crc kubenswrapper[4779]: I0320 16:32:05.114632 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7dfd6d6c09e0f1eff1e3d77469d1976015de8ae4716e3d20a066dc0c127b19df" Mar 20 16:32:05 crc kubenswrapper[4779]: I0320 16:32:05.114684 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567072-j6vqr" Mar 20 16:32:05 crc kubenswrapper[4779]: I0320 16:32:05.581286 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567066-hcrr8"] Mar 20 16:32:05 crc kubenswrapper[4779]: I0320 16:32:05.589986 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567066-hcrr8"] Mar 20 16:32:05 crc kubenswrapper[4779]: I0320 16:32:05.826074 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab9d8c0b-d333-4ca9-a665-b50c361263ab" path="/var/lib/kubelet/pods/ab9d8c0b-d333-4ca9-a665-b50c361263ab/volumes" Mar 20 16:32:45 crc kubenswrapper[4779]: I0320 16:32:45.530790 4779 scope.go:117] "RemoveContainer" containerID="c1de01b0b87f5a208b53f1fc764b952ef0fefc9ba06a9c47d898a6d2b9284567" Mar 20 16:33:05 crc kubenswrapper[4779]: I0320 16:33:05.768756 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8qsh5"] Mar 20 16:33:05 crc kubenswrapper[4779]: E0320 16:33:05.769815 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcb1fa3f-46e3-4dc9-82db-85ea92b968a7" containerName="oc" Mar 20 16:33:05 crc kubenswrapper[4779]: I0320 16:33:05.769828 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcb1fa3f-46e3-4dc9-82db-85ea92b968a7" containerName="oc" Mar 20 16:33:05 crc kubenswrapper[4779]: I0320 16:33:05.770043 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcb1fa3f-46e3-4dc9-82db-85ea92b968a7" containerName="oc" Mar 20 16:33:05 crc kubenswrapper[4779]: I0320 16:33:05.771591 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8qsh5" Mar 20 16:33:05 crc kubenswrapper[4779]: I0320 16:33:05.783507 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8qsh5"] Mar 20 16:33:05 crc kubenswrapper[4779]: I0320 16:33:05.876764 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/385e359d-507b-4187-bcde-8b0c9e32e3b5-catalog-content\") pod \"certified-operators-8qsh5\" (UID: \"385e359d-507b-4187-bcde-8b0c9e32e3b5\") " pod="openshift-marketplace/certified-operators-8qsh5" Mar 20 16:33:05 crc kubenswrapper[4779]: I0320 16:33:05.876858 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/385e359d-507b-4187-bcde-8b0c9e32e3b5-utilities\") pod \"certified-operators-8qsh5\" (UID: \"385e359d-507b-4187-bcde-8b0c9e32e3b5\") " pod="openshift-marketplace/certified-operators-8qsh5" Mar 20 16:33:05 crc kubenswrapper[4779]: I0320 16:33:05.876990 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntvvc\" (UniqueName: \"kubernetes.io/projected/385e359d-507b-4187-bcde-8b0c9e32e3b5-kube-api-access-ntvvc\") pod \"certified-operators-8qsh5\" (UID: \"385e359d-507b-4187-bcde-8b0c9e32e3b5\") " pod="openshift-marketplace/certified-operators-8qsh5" Mar 20 16:33:05 crc kubenswrapper[4779]: I0320 16:33:05.979081 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/385e359d-507b-4187-bcde-8b0c9e32e3b5-utilities\") pod \"certified-operators-8qsh5\" (UID: \"385e359d-507b-4187-bcde-8b0c9e32e3b5\") " pod="openshift-marketplace/certified-operators-8qsh5" Mar 20 16:33:05 crc kubenswrapper[4779]: I0320 16:33:05.979229 4779 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ntvvc\" (UniqueName: \"kubernetes.io/projected/385e359d-507b-4187-bcde-8b0c9e32e3b5-kube-api-access-ntvvc\") pod \"certified-operators-8qsh5\" (UID: \"385e359d-507b-4187-bcde-8b0c9e32e3b5\") " pod="openshift-marketplace/certified-operators-8qsh5" Mar 20 16:33:05 crc kubenswrapper[4779]: I0320 16:33:05.979311 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/385e359d-507b-4187-bcde-8b0c9e32e3b5-catalog-content\") pod \"certified-operators-8qsh5\" (UID: \"385e359d-507b-4187-bcde-8b0c9e32e3b5\") " pod="openshift-marketplace/certified-operators-8qsh5" Mar 20 16:33:05 crc kubenswrapper[4779]: I0320 16:33:05.979611 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/385e359d-507b-4187-bcde-8b0c9e32e3b5-utilities\") pod \"certified-operators-8qsh5\" (UID: \"385e359d-507b-4187-bcde-8b0c9e32e3b5\") " pod="openshift-marketplace/certified-operators-8qsh5" Mar 20 16:33:05 crc kubenswrapper[4779]: I0320 16:33:05.979896 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/385e359d-507b-4187-bcde-8b0c9e32e3b5-catalog-content\") pod \"certified-operators-8qsh5\" (UID: \"385e359d-507b-4187-bcde-8b0c9e32e3b5\") " pod="openshift-marketplace/certified-operators-8qsh5" Mar 20 16:33:05 crc kubenswrapper[4779]: I0320 16:33:05.999088 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntvvc\" (UniqueName: \"kubernetes.io/projected/385e359d-507b-4187-bcde-8b0c9e32e3b5-kube-api-access-ntvvc\") pod \"certified-operators-8qsh5\" (UID: \"385e359d-507b-4187-bcde-8b0c9e32e3b5\") " pod="openshift-marketplace/certified-operators-8qsh5" Mar 20 16:33:06 crc kubenswrapper[4779]: I0320 16:33:06.150641 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8qsh5" Mar 20 16:33:06 crc kubenswrapper[4779]: I0320 16:33:06.722388 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8qsh5"] Mar 20 16:33:07 crc kubenswrapper[4779]: I0320 16:33:07.743801 4779 generic.go:334] "Generic (PLEG): container finished" podID="385e359d-507b-4187-bcde-8b0c9e32e3b5" containerID="7dbad9b3e1531c1d66585d0e1eb9d034020a0db0d1147be1194a1396d107e9fd" exitCode=0 Mar 20 16:33:07 crc kubenswrapper[4779]: I0320 16:33:07.744050 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qsh5" event={"ID":"385e359d-507b-4187-bcde-8b0c9e32e3b5","Type":"ContainerDied","Data":"7dbad9b3e1531c1d66585d0e1eb9d034020a0db0d1147be1194a1396d107e9fd"} Mar 20 16:33:07 crc kubenswrapper[4779]: I0320 16:33:07.744073 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qsh5" event={"ID":"385e359d-507b-4187-bcde-8b0c9e32e3b5","Type":"ContainerStarted","Data":"60c0a424ee1c6d3c2fbf71e7534fd7520c871a9d2ff05cb4c1a5a2a5806ee732"} Mar 20 16:33:07 crc kubenswrapper[4779]: I0320 16:33:07.746893 4779 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 16:33:09 crc kubenswrapper[4779]: I0320 16:33:09.794542 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qsh5" event={"ID":"385e359d-507b-4187-bcde-8b0c9e32e3b5","Type":"ContainerStarted","Data":"f258c479b3f311c509ae839bd265afa1fd3e67e1e77e5f210a31a153f6384c11"} Mar 20 16:33:10 crc kubenswrapper[4779]: I0320 16:33:10.803677 4779 generic.go:334] "Generic (PLEG): container finished" podID="385e359d-507b-4187-bcde-8b0c9e32e3b5" containerID="f258c479b3f311c509ae839bd265afa1fd3e67e1e77e5f210a31a153f6384c11" exitCode=0 Mar 20 16:33:10 crc kubenswrapper[4779]: I0320 16:33:10.803744 4779 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-8qsh5" event={"ID":"385e359d-507b-4187-bcde-8b0c9e32e3b5","Type":"ContainerDied","Data":"f258c479b3f311c509ae839bd265afa1fd3e67e1e77e5f210a31a153f6384c11"} Mar 20 16:33:11 crc kubenswrapper[4779]: I0320 16:33:11.822034 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qsh5" event={"ID":"385e359d-507b-4187-bcde-8b0c9e32e3b5","Type":"ContainerStarted","Data":"dc92e7bf756bd9a49c29a066364fc5f28d54489f495170175dc4b5d712f0c295"} Mar 20 16:33:11 crc kubenswrapper[4779]: I0320 16:33:11.850819 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8qsh5" podStartSLOduration=3.212468444 podStartE2EDuration="6.850801025s" podCreationTimestamp="2026-03-20 16:33:05 +0000 UTC" firstStartedPulling="2026-03-20 16:33:07.746648689 +0000 UTC m=+4204.709164479" lastFinishedPulling="2026-03-20 16:33:11.38498126 +0000 UTC m=+4208.347497060" observedRunningTime="2026-03-20 16:33:11.845645219 +0000 UTC m=+4208.808161019" watchObservedRunningTime="2026-03-20 16:33:11.850801025 +0000 UTC m=+4208.813316825" Mar 20 16:33:16 crc kubenswrapper[4779]: I0320 16:33:16.151226 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8qsh5" Mar 20 16:33:16 crc kubenswrapper[4779]: I0320 16:33:16.151805 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8qsh5" Mar 20 16:33:16 crc kubenswrapper[4779]: I0320 16:33:16.204935 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8qsh5" Mar 20 16:33:16 crc kubenswrapper[4779]: I0320 16:33:16.903463 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8qsh5" Mar 20 16:33:16 crc kubenswrapper[4779]: I0320 16:33:16.955071 
4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8qsh5"] Mar 20 16:33:18 crc kubenswrapper[4779]: I0320 16:33:18.880024 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8qsh5" podUID="385e359d-507b-4187-bcde-8b0c9e32e3b5" containerName="registry-server" containerID="cri-o://dc92e7bf756bd9a49c29a066364fc5f28d54489f495170175dc4b5d712f0c295" gracePeriod=2 Mar 20 16:33:19 crc kubenswrapper[4779]: I0320 16:33:19.889311 4779 generic.go:334] "Generic (PLEG): container finished" podID="385e359d-507b-4187-bcde-8b0c9e32e3b5" containerID="dc92e7bf756bd9a49c29a066364fc5f28d54489f495170175dc4b5d712f0c295" exitCode=0 Mar 20 16:33:19 crc kubenswrapper[4779]: I0320 16:33:19.889475 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qsh5" event={"ID":"385e359d-507b-4187-bcde-8b0c9e32e3b5","Type":"ContainerDied","Data":"dc92e7bf756bd9a49c29a066364fc5f28d54489f495170175dc4b5d712f0c295"} Mar 20 16:33:20 crc kubenswrapper[4779]: I0320 16:33:20.014148 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8qsh5" Mar 20 16:33:20 crc kubenswrapper[4779]: I0320 16:33:20.140020 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/385e359d-507b-4187-bcde-8b0c9e32e3b5-catalog-content\") pod \"385e359d-507b-4187-bcde-8b0c9e32e3b5\" (UID: \"385e359d-507b-4187-bcde-8b0c9e32e3b5\") " Mar 20 16:33:20 crc kubenswrapper[4779]: I0320 16:33:20.140144 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/385e359d-507b-4187-bcde-8b0c9e32e3b5-utilities\") pod \"385e359d-507b-4187-bcde-8b0c9e32e3b5\" (UID: \"385e359d-507b-4187-bcde-8b0c9e32e3b5\") " Mar 20 16:33:20 crc kubenswrapper[4779]: I0320 16:33:20.140199 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntvvc\" (UniqueName: \"kubernetes.io/projected/385e359d-507b-4187-bcde-8b0c9e32e3b5-kube-api-access-ntvvc\") pod \"385e359d-507b-4187-bcde-8b0c9e32e3b5\" (UID: \"385e359d-507b-4187-bcde-8b0c9e32e3b5\") " Mar 20 16:33:20 crc kubenswrapper[4779]: I0320 16:33:20.141675 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/385e359d-507b-4187-bcde-8b0c9e32e3b5-utilities" (OuterVolumeSpecName: "utilities") pod "385e359d-507b-4187-bcde-8b0c9e32e3b5" (UID: "385e359d-507b-4187-bcde-8b0c9e32e3b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:33:20 crc kubenswrapper[4779]: I0320 16:33:20.145703 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/385e359d-507b-4187-bcde-8b0c9e32e3b5-kube-api-access-ntvvc" (OuterVolumeSpecName: "kube-api-access-ntvvc") pod "385e359d-507b-4187-bcde-8b0c9e32e3b5" (UID: "385e359d-507b-4187-bcde-8b0c9e32e3b5"). InnerVolumeSpecName "kube-api-access-ntvvc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:33:20 crc kubenswrapper[4779]: I0320 16:33:20.242204 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntvvc\" (UniqueName: \"kubernetes.io/projected/385e359d-507b-4187-bcde-8b0c9e32e3b5-kube-api-access-ntvvc\") on node \"crc\" DevicePath \"\"" Mar 20 16:33:20 crc kubenswrapper[4779]: I0320 16:33:20.242238 4779 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/385e359d-507b-4187-bcde-8b0c9e32e3b5-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:33:20 crc kubenswrapper[4779]: I0320 16:33:20.741695 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/385e359d-507b-4187-bcde-8b0c9e32e3b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "385e359d-507b-4187-bcde-8b0c9e32e3b5" (UID: "385e359d-507b-4187-bcde-8b0c9e32e3b5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:33:20 crc kubenswrapper[4779]: I0320 16:33:20.751058 4779 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/385e359d-507b-4187-bcde-8b0c9e32e3b5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:33:20 crc kubenswrapper[4779]: I0320 16:33:20.899005 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qsh5" event={"ID":"385e359d-507b-4187-bcde-8b0c9e32e3b5","Type":"ContainerDied","Data":"60c0a424ee1c6d3c2fbf71e7534fd7520c871a9d2ff05cb4c1a5a2a5806ee732"} Mar 20 16:33:20 crc kubenswrapper[4779]: I0320 16:33:20.899056 4779 scope.go:117] "RemoveContainer" containerID="dc92e7bf756bd9a49c29a066364fc5f28d54489f495170175dc4b5d712f0c295" Mar 20 16:33:20 crc kubenswrapper[4779]: I0320 16:33:20.900038 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8qsh5" Mar 20 16:33:20 crc kubenswrapper[4779]: I0320 16:33:20.921876 4779 scope.go:117] "RemoveContainer" containerID="f258c479b3f311c509ae839bd265afa1fd3e67e1e77e5f210a31a153f6384c11" Mar 20 16:33:20 crc kubenswrapper[4779]: I0320 16:33:20.946143 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8qsh5"] Mar 20 16:33:20 crc kubenswrapper[4779]: I0320 16:33:20.960127 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8qsh5"] Mar 20 16:33:20 crc kubenswrapper[4779]: I0320 16:33:20.969246 4779 scope.go:117] "RemoveContainer" containerID="7dbad9b3e1531c1d66585d0e1eb9d034020a0db0d1147be1194a1396d107e9fd" Mar 20 16:33:21 crc kubenswrapper[4779]: I0320 16:33:21.818969 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="385e359d-507b-4187-bcde-8b0c9e32e3b5" path="/var/lib/kubelet/pods/385e359d-507b-4187-bcde-8b0c9e32e3b5/volumes" Mar 20 16:33:24 crc kubenswrapper[4779]: I0320 16:33:24.937049 4779 generic.go:334] "Generic (PLEG): container finished" podID="9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c" containerID="fe4cad63e5b135c7b584bc228aecaada01604dfefe621edb03e7b3c572776615" exitCode=0 Mar 20 16:33:24 crc kubenswrapper[4779]: I0320 16:33:24.937158 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7ggbl/must-gather-2bwvn" event={"ID":"9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c","Type":"ContainerDied","Data":"fe4cad63e5b135c7b584bc228aecaada01604dfefe621edb03e7b3c572776615"} Mar 20 16:33:24 crc kubenswrapper[4779]: I0320 16:33:24.938143 4779 scope.go:117] "RemoveContainer" containerID="fe4cad63e5b135c7b584bc228aecaada01604dfefe621edb03e7b3c572776615" Mar 20 16:33:25 crc kubenswrapper[4779]: I0320 16:33:25.149418 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:33:25 crc kubenswrapper[4779]: I0320 16:33:25.149474 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:33:25 crc kubenswrapper[4779]: I0320 16:33:25.163412 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7ggbl_must-gather-2bwvn_9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c/gather/0.log" Mar 20 16:33:33 crc kubenswrapper[4779]: I0320 16:33:33.344301 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7ggbl/must-gather-2bwvn"] Mar 20 16:33:33 crc kubenswrapper[4779]: I0320 16:33:33.345163 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-7ggbl/must-gather-2bwvn" podUID="9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c" containerName="copy" containerID="cri-o://737aed034886a78b00cdcf32aed770952f445adb6dbe6e5836e16f37266b1d06" gracePeriod=2 Mar 20 16:33:33 crc kubenswrapper[4779]: I0320 16:33:33.358697 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7ggbl/must-gather-2bwvn"] Mar 20 16:33:33 crc kubenswrapper[4779]: I0320 16:33:33.827892 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7ggbl_must-gather-2bwvn_9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c/copy/0.log" Mar 20 16:33:33 crc kubenswrapper[4779]: I0320 16:33:33.830024 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7ggbl/must-gather-2bwvn" Mar 20 16:33:33 crc kubenswrapper[4779]: I0320 16:33:33.992770 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c-must-gather-output\") pod \"9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c\" (UID: \"9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c\") " Mar 20 16:33:33 crc kubenswrapper[4779]: I0320 16:33:33.993122 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sr98k\" (UniqueName: \"kubernetes.io/projected/9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c-kube-api-access-sr98k\") pod \"9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c\" (UID: \"9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c\") " Mar 20 16:33:34 crc kubenswrapper[4779]: I0320 16:33:34.004465 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c-kube-api-access-sr98k" (OuterVolumeSpecName: "kube-api-access-sr98k") pod "9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c" (UID: "9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c"). InnerVolumeSpecName "kube-api-access-sr98k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:33:34 crc kubenswrapper[4779]: I0320 16:33:34.018500 4779 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7ggbl_must-gather-2bwvn_9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c/copy/0.log" Mar 20 16:33:34 crc kubenswrapper[4779]: I0320 16:33:34.018965 4779 generic.go:334] "Generic (PLEG): container finished" podID="9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c" containerID="737aed034886a78b00cdcf32aed770952f445adb6dbe6e5836e16f37266b1d06" exitCode=143 Mar 20 16:33:34 crc kubenswrapper[4779]: I0320 16:33:34.019033 4779 scope.go:117] "RemoveContainer" containerID="737aed034886a78b00cdcf32aed770952f445adb6dbe6e5836e16f37266b1d06" Mar 20 16:33:34 crc kubenswrapper[4779]: I0320 16:33:34.019223 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7ggbl/must-gather-2bwvn" Mar 20 16:33:34 crc kubenswrapper[4779]: I0320 16:33:34.056173 4779 scope.go:117] "RemoveContainer" containerID="fe4cad63e5b135c7b584bc228aecaada01604dfefe621edb03e7b3c572776615" Mar 20 16:33:34 crc kubenswrapper[4779]: I0320 16:33:34.096606 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sr98k\" (UniqueName: \"kubernetes.io/projected/9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c-kube-api-access-sr98k\") on node \"crc\" DevicePath \"\"" Mar 20 16:33:34 crc kubenswrapper[4779]: I0320 16:33:34.147734 4779 scope.go:117] "RemoveContainer" containerID="737aed034886a78b00cdcf32aed770952f445adb6dbe6e5836e16f37266b1d06" Mar 20 16:33:34 crc kubenswrapper[4779]: E0320 16:33:34.148131 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"737aed034886a78b00cdcf32aed770952f445adb6dbe6e5836e16f37266b1d06\": container with ID starting with 737aed034886a78b00cdcf32aed770952f445adb6dbe6e5836e16f37266b1d06 not found: ID does not exist" 
containerID="737aed034886a78b00cdcf32aed770952f445adb6dbe6e5836e16f37266b1d06" Mar 20 16:33:34 crc kubenswrapper[4779]: I0320 16:33:34.148167 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"737aed034886a78b00cdcf32aed770952f445adb6dbe6e5836e16f37266b1d06"} err="failed to get container status \"737aed034886a78b00cdcf32aed770952f445adb6dbe6e5836e16f37266b1d06\": rpc error: code = NotFound desc = could not find container \"737aed034886a78b00cdcf32aed770952f445adb6dbe6e5836e16f37266b1d06\": container with ID starting with 737aed034886a78b00cdcf32aed770952f445adb6dbe6e5836e16f37266b1d06 not found: ID does not exist" Mar 20 16:33:34 crc kubenswrapper[4779]: I0320 16:33:34.148191 4779 scope.go:117] "RemoveContainer" containerID="fe4cad63e5b135c7b584bc228aecaada01604dfefe621edb03e7b3c572776615" Mar 20 16:33:34 crc kubenswrapper[4779]: E0320 16:33:34.149143 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe4cad63e5b135c7b584bc228aecaada01604dfefe621edb03e7b3c572776615\": container with ID starting with fe4cad63e5b135c7b584bc228aecaada01604dfefe621edb03e7b3c572776615 not found: ID does not exist" containerID="fe4cad63e5b135c7b584bc228aecaada01604dfefe621edb03e7b3c572776615" Mar 20 16:33:34 crc kubenswrapper[4779]: I0320 16:33:34.149178 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe4cad63e5b135c7b584bc228aecaada01604dfefe621edb03e7b3c572776615"} err="failed to get container status \"fe4cad63e5b135c7b584bc228aecaada01604dfefe621edb03e7b3c572776615\": rpc error: code = NotFound desc = could not find container \"fe4cad63e5b135c7b584bc228aecaada01604dfefe621edb03e7b3c572776615\": container with ID starting with fe4cad63e5b135c7b584bc228aecaada01604dfefe621edb03e7b3c572776615 not found: ID does not exist" Mar 20 16:33:34 crc kubenswrapper[4779]: I0320 16:33:34.194697 4779 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c" (UID: "9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:33:34 crc kubenswrapper[4779]: I0320 16:33:34.198160 4779 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 20 16:33:35 crc kubenswrapper[4779]: I0320 16:33:35.821708 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c" path="/var/lib/kubelet/pods/9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c/volumes" Mar 20 16:33:45 crc kubenswrapper[4779]: I0320 16:33:45.620584 4779 scope.go:117] "RemoveContainer" containerID="dc2c481eedb0e438cd366e28499039e3d402d9a3291b8b26359084f35e94ce59" Mar 20 16:33:55 crc kubenswrapper[4779]: I0320 16:33:55.150074 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:33:55 crc kubenswrapper[4779]: I0320 16:33:55.150656 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:34:00 crc kubenswrapper[4779]: I0320 16:34:00.143915 4779 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29567074-vgmt9"] Mar 20 16:34:00 crc kubenswrapper[4779]: E0320 16:34:00.144885 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="385e359d-507b-4187-bcde-8b0c9e32e3b5" containerName="extract-utilities" Mar 20 16:34:00 crc kubenswrapper[4779]: I0320 16:34:00.144900 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="385e359d-507b-4187-bcde-8b0c9e32e3b5" containerName="extract-utilities" Mar 20 16:34:00 crc kubenswrapper[4779]: E0320 16:34:00.144915 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="385e359d-507b-4187-bcde-8b0c9e32e3b5" containerName="registry-server" Mar 20 16:34:00 crc kubenswrapper[4779]: I0320 16:34:00.144921 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="385e359d-507b-4187-bcde-8b0c9e32e3b5" containerName="registry-server" Mar 20 16:34:00 crc kubenswrapper[4779]: E0320 16:34:00.144937 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c" containerName="copy" Mar 20 16:34:00 crc kubenswrapper[4779]: I0320 16:34:00.144942 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c" containerName="copy" Mar 20 16:34:00 crc kubenswrapper[4779]: E0320 16:34:00.144958 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c" containerName="gather" Mar 20 16:34:00 crc kubenswrapper[4779]: I0320 16:34:00.144964 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c" containerName="gather" Mar 20 16:34:00 crc kubenswrapper[4779]: E0320 16:34:00.144981 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="385e359d-507b-4187-bcde-8b0c9e32e3b5" containerName="extract-content" Mar 20 16:34:00 crc kubenswrapper[4779]: I0320 16:34:00.144987 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="385e359d-507b-4187-bcde-8b0c9e32e3b5" 
containerName="extract-content" Mar 20 16:34:00 crc kubenswrapper[4779]: I0320 16:34:00.145224 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c" containerName="copy" Mar 20 16:34:00 crc kubenswrapper[4779]: I0320 16:34:00.145247 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ada32f2-df2f-4462-af3f-7a0d0e0b6d7c" containerName="gather" Mar 20 16:34:00 crc kubenswrapper[4779]: I0320 16:34:00.145260 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="385e359d-507b-4187-bcde-8b0c9e32e3b5" containerName="registry-server" Mar 20 16:34:00 crc kubenswrapper[4779]: I0320 16:34:00.147470 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567074-vgmt9" Mar 20 16:34:00 crc kubenswrapper[4779]: I0320 16:34:00.149414 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:34:00 crc kubenswrapper[4779]: I0320 16:34:00.149717 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:34:00 crc kubenswrapper[4779]: I0320 16:34:00.150604 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-92t9d" Mar 20 16:34:00 crc kubenswrapper[4779]: I0320 16:34:00.153775 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567074-vgmt9"] Mar 20 16:34:00 crc kubenswrapper[4779]: I0320 16:34:00.319961 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bgwz\" (UniqueName: \"kubernetes.io/projected/ff96d823-2692-4f94-af13-4b22680bffa2-kube-api-access-2bgwz\") pod \"auto-csr-approver-29567074-vgmt9\" (UID: \"ff96d823-2692-4f94-af13-4b22680bffa2\") " pod="openshift-infra/auto-csr-approver-29567074-vgmt9" Mar 20 16:34:00 crc kubenswrapper[4779]: 
I0320 16:34:00.421988 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bgwz\" (UniqueName: \"kubernetes.io/projected/ff96d823-2692-4f94-af13-4b22680bffa2-kube-api-access-2bgwz\") pod \"auto-csr-approver-29567074-vgmt9\" (UID: \"ff96d823-2692-4f94-af13-4b22680bffa2\") " pod="openshift-infra/auto-csr-approver-29567074-vgmt9" Mar 20 16:34:00 crc kubenswrapper[4779]: I0320 16:34:00.441781 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bgwz\" (UniqueName: \"kubernetes.io/projected/ff96d823-2692-4f94-af13-4b22680bffa2-kube-api-access-2bgwz\") pod \"auto-csr-approver-29567074-vgmt9\" (UID: \"ff96d823-2692-4f94-af13-4b22680bffa2\") " pod="openshift-infra/auto-csr-approver-29567074-vgmt9" Mar 20 16:34:00 crc kubenswrapper[4779]: I0320 16:34:00.467038 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567074-vgmt9" Mar 20 16:34:00 crc kubenswrapper[4779]: I0320 16:34:00.923827 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567074-vgmt9"] Mar 20 16:34:01 crc kubenswrapper[4779]: I0320 16:34:01.266533 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567074-vgmt9" event={"ID":"ff96d823-2692-4f94-af13-4b22680bffa2","Type":"ContainerStarted","Data":"e22fb6fc546801388a90aa7e2652ceeb5b991c344b9734b724b9a7707340d7d8"} Mar 20 16:34:03 crc kubenswrapper[4779]: I0320 16:34:03.287530 4779 generic.go:334] "Generic (PLEG): container finished" podID="ff96d823-2692-4f94-af13-4b22680bffa2" containerID="e8e90b3a7a8aaa14e09b9b9945b24469c584f715f9bc6b1860fd38e3d8e57892" exitCode=0 Mar 20 16:34:03 crc kubenswrapper[4779]: I0320 16:34:03.287686 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567074-vgmt9" 
event={"ID":"ff96d823-2692-4f94-af13-4b22680bffa2","Type":"ContainerDied","Data":"e8e90b3a7a8aaa14e09b9b9945b24469c584f715f9bc6b1860fd38e3d8e57892"} Mar 20 16:34:04 crc kubenswrapper[4779]: I0320 16:34:04.655889 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567074-vgmt9" Mar 20 16:34:04 crc kubenswrapper[4779]: I0320 16:34:04.709016 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bgwz\" (UniqueName: \"kubernetes.io/projected/ff96d823-2692-4f94-af13-4b22680bffa2-kube-api-access-2bgwz\") pod \"ff96d823-2692-4f94-af13-4b22680bffa2\" (UID: \"ff96d823-2692-4f94-af13-4b22680bffa2\") " Mar 20 16:34:04 crc kubenswrapper[4779]: I0320 16:34:04.720670 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff96d823-2692-4f94-af13-4b22680bffa2-kube-api-access-2bgwz" (OuterVolumeSpecName: "kube-api-access-2bgwz") pod "ff96d823-2692-4f94-af13-4b22680bffa2" (UID: "ff96d823-2692-4f94-af13-4b22680bffa2"). InnerVolumeSpecName "kube-api-access-2bgwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:34:04 crc kubenswrapper[4779]: I0320 16:34:04.811501 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bgwz\" (UniqueName: \"kubernetes.io/projected/ff96d823-2692-4f94-af13-4b22680bffa2-kube-api-access-2bgwz\") on node \"crc\" DevicePath \"\"" Mar 20 16:34:05 crc kubenswrapper[4779]: I0320 16:34:05.310403 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567074-vgmt9" event={"ID":"ff96d823-2692-4f94-af13-4b22680bffa2","Type":"ContainerDied","Data":"e22fb6fc546801388a90aa7e2652ceeb5b991c344b9734b724b9a7707340d7d8"} Mar 20 16:34:05 crc kubenswrapper[4779]: I0320 16:34:05.310447 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567074-vgmt9" Mar 20 16:34:05 crc kubenswrapper[4779]: I0320 16:34:05.310468 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e22fb6fc546801388a90aa7e2652ceeb5b991c344b9734b724b9a7707340d7d8" Mar 20 16:34:05 crc kubenswrapper[4779]: I0320 16:34:05.720202 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567068-ztzgc"] Mar 20 16:34:05 crc kubenswrapper[4779]: I0320 16:34:05.729178 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567068-ztzgc"] Mar 20 16:34:05 crc kubenswrapper[4779]: I0320 16:34:05.825165 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4338ac48-bf07-407c-a6fa-2826b7ff51cf" path="/var/lib/kubelet/pods/4338ac48-bf07-407c-a6fa-2826b7ff51cf/volumes" Mar 20 16:34:25 crc kubenswrapper[4779]: I0320 16:34:25.150545 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:34:25 crc kubenswrapper[4779]: I0320 16:34:25.151247 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:34:25 crc kubenswrapper[4779]: I0320 16:34:25.151297 4779 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" Mar 20 16:34:25 crc kubenswrapper[4779]: I0320 16:34:25.152229 4779 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bae06906235ec92fdf13286c3daf0b54ffa138e8c7e77f0062cb5ff877c3d7b6"} pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 16:34:25 crc kubenswrapper[4779]: I0320 16:34:25.152305 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" containerID="cri-o://bae06906235ec92fdf13286c3daf0b54ffa138e8c7e77f0062cb5ff877c3d7b6" gracePeriod=600 Mar 20 16:34:25 crc kubenswrapper[4779]: I0320 16:34:25.483651 4779 generic.go:334] "Generic (PLEG): container finished" podID="451fc579-db57-4b36-a775-6d2986de3efc" containerID="bae06906235ec92fdf13286c3daf0b54ffa138e8c7e77f0062cb5ff877c3d7b6" exitCode=0 Mar 20 16:34:25 crc kubenswrapper[4779]: I0320 16:34:25.483863 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" event={"ID":"451fc579-db57-4b36-a775-6d2986de3efc","Type":"ContainerDied","Data":"bae06906235ec92fdf13286c3daf0b54ffa138e8c7e77f0062cb5ff877c3d7b6"} Mar 20 16:34:25 crc kubenswrapper[4779]: I0320 16:34:25.484009 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" event={"ID":"451fc579-db57-4b36-a775-6d2986de3efc","Type":"ContainerStarted","Data":"e3b1c6a073a5beafb71de9c08ba428f8b4804812454d79d9c05881f1ac273083"} Mar 20 16:34:25 crc kubenswrapper[4779]: I0320 16:34:25.484034 4779 scope.go:117] "RemoveContainer" containerID="0d4a4381088e0c6e4439f94508ac137582cc59c538515a9138610168fa6ff57d" Mar 20 16:34:30 crc kubenswrapper[4779]: I0320 16:34:30.491993 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-q55ss"] Mar 20 16:34:30 crc 
kubenswrapper[4779]: E0320 16:34:30.493103 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff96d823-2692-4f94-af13-4b22680bffa2" containerName="oc" Mar 20 16:34:30 crc kubenswrapper[4779]: I0320 16:34:30.493154 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff96d823-2692-4f94-af13-4b22680bffa2" containerName="oc" Mar 20 16:34:30 crc kubenswrapper[4779]: I0320 16:34:30.493351 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff96d823-2692-4f94-af13-4b22680bffa2" containerName="oc" Mar 20 16:34:30 crc kubenswrapper[4779]: I0320 16:34:30.496093 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q55ss" Mar 20 16:34:30 crc kubenswrapper[4779]: I0320 16:34:30.506865 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q55ss"] Mar 20 16:34:30 crc kubenswrapper[4779]: I0320 16:34:30.600456 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eabe3b81-00ac-49e9-8571-03bc7a8c9561-catalog-content\") pod \"community-operators-q55ss\" (UID: \"eabe3b81-00ac-49e9-8571-03bc7a8c9561\") " pod="openshift-marketplace/community-operators-q55ss" Mar 20 16:34:30 crc kubenswrapper[4779]: I0320 16:34:30.600520 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-565ml\" (UniqueName: \"kubernetes.io/projected/eabe3b81-00ac-49e9-8571-03bc7a8c9561-kube-api-access-565ml\") pod \"community-operators-q55ss\" (UID: \"eabe3b81-00ac-49e9-8571-03bc7a8c9561\") " pod="openshift-marketplace/community-operators-q55ss" Mar 20 16:34:30 crc kubenswrapper[4779]: I0320 16:34:30.600595 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/eabe3b81-00ac-49e9-8571-03bc7a8c9561-utilities\") pod \"community-operators-q55ss\" (UID: \"eabe3b81-00ac-49e9-8571-03bc7a8c9561\") " pod="openshift-marketplace/community-operators-q55ss" Mar 20 16:34:30 crc kubenswrapper[4779]: I0320 16:34:30.702292 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eabe3b81-00ac-49e9-8571-03bc7a8c9561-utilities\") pod \"community-operators-q55ss\" (UID: \"eabe3b81-00ac-49e9-8571-03bc7a8c9561\") " pod="openshift-marketplace/community-operators-q55ss" Mar 20 16:34:30 crc kubenswrapper[4779]: I0320 16:34:30.702415 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eabe3b81-00ac-49e9-8571-03bc7a8c9561-catalog-content\") pod \"community-operators-q55ss\" (UID: \"eabe3b81-00ac-49e9-8571-03bc7a8c9561\") " pod="openshift-marketplace/community-operators-q55ss" Mar 20 16:34:30 crc kubenswrapper[4779]: I0320 16:34:30.702447 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-565ml\" (UniqueName: \"kubernetes.io/projected/eabe3b81-00ac-49e9-8571-03bc7a8c9561-kube-api-access-565ml\") pod \"community-operators-q55ss\" (UID: \"eabe3b81-00ac-49e9-8571-03bc7a8c9561\") " pod="openshift-marketplace/community-operators-q55ss" Mar 20 16:34:30 crc kubenswrapper[4779]: I0320 16:34:30.702923 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eabe3b81-00ac-49e9-8571-03bc7a8c9561-utilities\") pod \"community-operators-q55ss\" (UID: \"eabe3b81-00ac-49e9-8571-03bc7a8c9561\") " pod="openshift-marketplace/community-operators-q55ss" Mar 20 16:34:30 crc kubenswrapper[4779]: I0320 16:34:30.706279 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/eabe3b81-00ac-49e9-8571-03bc7a8c9561-catalog-content\") pod \"community-operators-q55ss\" (UID: \"eabe3b81-00ac-49e9-8571-03bc7a8c9561\") " pod="openshift-marketplace/community-operators-q55ss" Mar 20 16:34:30 crc kubenswrapper[4779]: I0320 16:34:30.722499 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-565ml\" (UniqueName: \"kubernetes.io/projected/eabe3b81-00ac-49e9-8571-03bc7a8c9561-kube-api-access-565ml\") pod \"community-operators-q55ss\" (UID: \"eabe3b81-00ac-49e9-8571-03bc7a8c9561\") " pod="openshift-marketplace/community-operators-q55ss" Mar 20 16:34:30 crc kubenswrapper[4779]: I0320 16:34:30.835031 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q55ss" Mar 20 16:34:31 crc kubenswrapper[4779]: I0320 16:34:31.371341 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q55ss"] Mar 20 16:34:32 crc kubenswrapper[4779]: I0320 16:34:32.569847 4779 generic.go:334] "Generic (PLEG): container finished" podID="eabe3b81-00ac-49e9-8571-03bc7a8c9561" containerID="0f99d7414e928dd82c3d799ca8455cb653e6e521e9955a02d517a66eed648648" exitCode=0 Mar 20 16:34:32 crc kubenswrapper[4779]: I0320 16:34:32.569917 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q55ss" event={"ID":"eabe3b81-00ac-49e9-8571-03bc7a8c9561","Type":"ContainerDied","Data":"0f99d7414e928dd82c3d799ca8455cb653e6e521e9955a02d517a66eed648648"} Mar 20 16:34:32 crc kubenswrapper[4779]: I0320 16:34:32.571211 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q55ss" event={"ID":"eabe3b81-00ac-49e9-8571-03bc7a8c9561","Type":"ContainerStarted","Data":"e97fb5f1275464b049f8203b956f713f593a2e7a45ecb2613d5997f96d46d14f"} Mar 20 16:34:33 crc kubenswrapper[4779]: I0320 16:34:33.287255 4779 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-marketplace-j8wzv"] Mar 20 16:34:33 crc kubenswrapper[4779]: I0320 16:34:33.290014 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j8wzv" Mar 20 16:34:33 crc kubenswrapper[4779]: I0320 16:34:33.308302 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j8wzv"] Mar 20 16:34:33 crc kubenswrapper[4779]: I0320 16:34:33.351242 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sknxh\" (UniqueName: \"kubernetes.io/projected/5df520bd-0746-48f5-b615-0e3707572c01-kube-api-access-sknxh\") pod \"redhat-marketplace-j8wzv\" (UID: \"5df520bd-0746-48f5-b615-0e3707572c01\") " pod="openshift-marketplace/redhat-marketplace-j8wzv" Mar 20 16:34:33 crc kubenswrapper[4779]: I0320 16:34:33.351450 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5df520bd-0746-48f5-b615-0e3707572c01-utilities\") pod \"redhat-marketplace-j8wzv\" (UID: \"5df520bd-0746-48f5-b615-0e3707572c01\") " pod="openshift-marketplace/redhat-marketplace-j8wzv" Mar 20 16:34:33 crc kubenswrapper[4779]: I0320 16:34:33.351701 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5df520bd-0746-48f5-b615-0e3707572c01-catalog-content\") pod \"redhat-marketplace-j8wzv\" (UID: \"5df520bd-0746-48f5-b615-0e3707572c01\") " pod="openshift-marketplace/redhat-marketplace-j8wzv" Mar 20 16:34:33 crc kubenswrapper[4779]: I0320 16:34:33.455250 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5df520bd-0746-48f5-b615-0e3707572c01-utilities\") pod \"redhat-marketplace-j8wzv\" (UID: 
\"5df520bd-0746-48f5-b615-0e3707572c01\") " pod="openshift-marketplace/redhat-marketplace-j8wzv" Mar 20 16:34:33 crc kubenswrapper[4779]: I0320 16:34:33.455358 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5df520bd-0746-48f5-b615-0e3707572c01-catalog-content\") pod \"redhat-marketplace-j8wzv\" (UID: \"5df520bd-0746-48f5-b615-0e3707572c01\") " pod="openshift-marketplace/redhat-marketplace-j8wzv" Mar 20 16:34:33 crc kubenswrapper[4779]: I0320 16:34:33.455441 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sknxh\" (UniqueName: \"kubernetes.io/projected/5df520bd-0746-48f5-b615-0e3707572c01-kube-api-access-sknxh\") pod \"redhat-marketplace-j8wzv\" (UID: \"5df520bd-0746-48f5-b615-0e3707572c01\") " pod="openshift-marketplace/redhat-marketplace-j8wzv" Mar 20 16:34:33 crc kubenswrapper[4779]: I0320 16:34:33.456198 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5df520bd-0746-48f5-b615-0e3707572c01-utilities\") pod \"redhat-marketplace-j8wzv\" (UID: \"5df520bd-0746-48f5-b615-0e3707572c01\") " pod="openshift-marketplace/redhat-marketplace-j8wzv" Mar 20 16:34:33 crc kubenswrapper[4779]: I0320 16:34:33.456330 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5df520bd-0746-48f5-b615-0e3707572c01-catalog-content\") pod \"redhat-marketplace-j8wzv\" (UID: \"5df520bd-0746-48f5-b615-0e3707572c01\") " pod="openshift-marketplace/redhat-marketplace-j8wzv" Mar 20 16:34:33 crc kubenswrapper[4779]: I0320 16:34:33.827578 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sknxh\" (UniqueName: \"kubernetes.io/projected/5df520bd-0746-48f5-b615-0e3707572c01-kube-api-access-sknxh\") pod \"redhat-marketplace-j8wzv\" (UID: 
\"5df520bd-0746-48f5-b615-0e3707572c01\") " pod="openshift-marketplace/redhat-marketplace-j8wzv" Mar 20 16:34:33 crc kubenswrapper[4779]: I0320 16:34:33.912012 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j8wzv" Mar 20 16:34:34 crc kubenswrapper[4779]: I0320 16:34:34.387364 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j8wzv"] Mar 20 16:34:34 crc kubenswrapper[4779]: I0320 16:34:34.597490 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j8wzv" event={"ID":"5df520bd-0746-48f5-b615-0e3707572c01","Type":"ContainerStarted","Data":"20da044ea3c96172a881cfcc9124338b1644db8826a74b64e7ec454abc5d31af"} Mar 20 16:34:34 crc kubenswrapper[4779]: I0320 16:34:34.601844 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q55ss" event={"ID":"eabe3b81-00ac-49e9-8571-03bc7a8c9561","Type":"ContainerStarted","Data":"70d4086bf2d19ed3d132cad3da37172c43fba1695e9c58b6e702d40678883896"} Mar 20 16:34:35 crc kubenswrapper[4779]: I0320 16:34:35.611504 4779 generic.go:334] "Generic (PLEG): container finished" podID="5df520bd-0746-48f5-b615-0e3707572c01" containerID="8f08f1e146be3e8deab960374a3d23b245cffc56201e990e6d4d16eecaa3bd95" exitCode=0 Mar 20 16:34:35 crc kubenswrapper[4779]: I0320 16:34:35.611568 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j8wzv" event={"ID":"5df520bd-0746-48f5-b615-0e3707572c01","Type":"ContainerDied","Data":"8f08f1e146be3e8deab960374a3d23b245cffc56201e990e6d4d16eecaa3bd95"} Mar 20 16:34:36 crc kubenswrapper[4779]: I0320 16:34:36.622532 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j8wzv" 
event={"ID":"5df520bd-0746-48f5-b615-0e3707572c01","Type":"ContainerStarted","Data":"7252cd305a22367916eb1fc7479dd7213a994d09c39a3e1a1f90271b70d4301a"} Mar 20 16:34:36 crc kubenswrapper[4779]: I0320 16:34:36.626645 4779 generic.go:334] "Generic (PLEG): container finished" podID="eabe3b81-00ac-49e9-8571-03bc7a8c9561" containerID="70d4086bf2d19ed3d132cad3da37172c43fba1695e9c58b6e702d40678883896" exitCode=0 Mar 20 16:34:36 crc kubenswrapper[4779]: I0320 16:34:36.626675 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q55ss" event={"ID":"eabe3b81-00ac-49e9-8571-03bc7a8c9561","Type":"ContainerDied","Data":"70d4086bf2d19ed3d132cad3da37172c43fba1695e9c58b6e702d40678883896"} Mar 20 16:34:37 crc kubenswrapper[4779]: I0320 16:34:37.643134 4779 generic.go:334] "Generic (PLEG): container finished" podID="5df520bd-0746-48f5-b615-0e3707572c01" containerID="7252cd305a22367916eb1fc7479dd7213a994d09c39a3e1a1f90271b70d4301a" exitCode=0 Mar 20 16:34:37 crc kubenswrapper[4779]: I0320 16:34:37.643236 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j8wzv" event={"ID":"5df520bd-0746-48f5-b615-0e3707572c01","Type":"ContainerDied","Data":"7252cd305a22367916eb1fc7479dd7213a994d09c39a3e1a1f90271b70d4301a"} Mar 20 16:34:37 crc kubenswrapper[4779]: I0320 16:34:37.650284 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q55ss" event={"ID":"eabe3b81-00ac-49e9-8571-03bc7a8c9561","Type":"ContainerStarted","Data":"75da4dd431b5a3b19caf4a44d7b38520a5acbcf5571ddd33b84a39b68f22bb76"} Mar 20 16:34:38 crc kubenswrapper[4779]: I0320 16:34:38.672621 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j8wzv" event={"ID":"5df520bd-0746-48f5-b615-0e3707572c01","Type":"ContainerStarted","Data":"e111362ca2f132909893d664f6ea5ef8bf256d504f0fa8e2f4423fa9de6e5f71"} Mar 20 16:34:38 crc kubenswrapper[4779]: I0320 
16:34:38.694921 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q55ss" podStartSLOduration=4.198854413 podStartE2EDuration="8.694891703s" podCreationTimestamp="2026-03-20 16:34:30 +0000 UTC" firstStartedPulling="2026-03-20 16:34:32.571403577 +0000 UTC m=+4289.533919377" lastFinishedPulling="2026-03-20 16:34:37.067440867 +0000 UTC m=+4294.029956667" observedRunningTime="2026-03-20 16:34:37.682064776 +0000 UTC m=+4294.644580576" watchObservedRunningTime="2026-03-20 16:34:38.694891703 +0000 UTC m=+4295.657407493" Mar 20 16:34:38 crc kubenswrapper[4779]: I0320 16:34:38.698634 4779 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j8wzv" podStartSLOduration=3.237703778 podStartE2EDuration="5.698622504s" podCreationTimestamp="2026-03-20 16:34:33 +0000 UTC" firstStartedPulling="2026-03-20 16:34:35.613527466 +0000 UTC m=+4292.576043266" lastFinishedPulling="2026-03-20 16:34:38.074446192 +0000 UTC m=+4295.036961992" observedRunningTime="2026-03-20 16:34:38.68737889 +0000 UTC m=+4295.649894700" watchObservedRunningTime="2026-03-20 16:34:38.698622504 +0000 UTC m=+4295.661138304" Mar 20 16:34:40 crc kubenswrapper[4779]: I0320 16:34:40.835533 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-q55ss" Mar 20 16:34:40 crc kubenswrapper[4779]: I0320 16:34:40.836146 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-q55ss" Mar 20 16:34:40 crc kubenswrapper[4779]: I0320 16:34:40.885196 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q55ss" Mar 20 16:34:43 crc kubenswrapper[4779]: I0320 16:34:43.912523 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j8wzv" Mar 20 16:34:43 crc 
kubenswrapper[4779]: I0320 16:34:43.913139 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j8wzv" Mar 20 16:34:43 crc kubenswrapper[4779]: I0320 16:34:43.980428 4779 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j8wzv" Mar 20 16:34:44 crc kubenswrapper[4779]: I0320 16:34:44.767349 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j8wzv" Mar 20 16:34:44 crc kubenswrapper[4779]: I0320 16:34:44.878337 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j8wzv"] Mar 20 16:34:45 crc kubenswrapper[4779]: I0320 16:34:45.703407 4779 scope.go:117] "RemoveContainer" containerID="86f451a0e6d9046e96eb3a71046429c635c0b90e42f88e777c9cf6383ce1fb0c" Mar 20 16:34:46 crc kubenswrapper[4779]: I0320 16:34:46.741649 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j8wzv" podUID="5df520bd-0746-48f5-b615-0e3707572c01" containerName="registry-server" containerID="cri-o://e111362ca2f132909893d664f6ea5ef8bf256d504f0fa8e2f4423fa9de6e5f71" gracePeriod=2 Mar 20 16:34:47 crc kubenswrapper[4779]: I0320 16:34:47.214442 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j8wzv" Mar 20 16:34:47 crc kubenswrapper[4779]: I0320 16:34:47.332875 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5df520bd-0746-48f5-b615-0e3707572c01-catalog-content\") pod \"5df520bd-0746-48f5-b615-0e3707572c01\" (UID: \"5df520bd-0746-48f5-b615-0e3707572c01\") " Mar 20 16:34:47 crc kubenswrapper[4779]: I0320 16:34:47.333346 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5df520bd-0746-48f5-b615-0e3707572c01-utilities\") pod \"5df520bd-0746-48f5-b615-0e3707572c01\" (UID: \"5df520bd-0746-48f5-b615-0e3707572c01\") " Mar 20 16:34:47 crc kubenswrapper[4779]: I0320 16:34:47.333393 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sknxh\" (UniqueName: \"kubernetes.io/projected/5df520bd-0746-48f5-b615-0e3707572c01-kube-api-access-sknxh\") pod \"5df520bd-0746-48f5-b615-0e3707572c01\" (UID: \"5df520bd-0746-48f5-b615-0e3707572c01\") " Mar 20 16:34:47 crc kubenswrapper[4779]: I0320 16:34:47.334363 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5df520bd-0746-48f5-b615-0e3707572c01-utilities" (OuterVolumeSpecName: "utilities") pod "5df520bd-0746-48f5-b615-0e3707572c01" (UID: "5df520bd-0746-48f5-b615-0e3707572c01"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:34:47 crc kubenswrapper[4779]: I0320 16:34:47.339038 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5df520bd-0746-48f5-b615-0e3707572c01-kube-api-access-sknxh" (OuterVolumeSpecName: "kube-api-access-sknxh") pod "5df520bd-0746-48f5-b615-0e3707572c01" (UID: "5df520bd-0746-48f5-b615-0e3707572c01"). InnerVolumeSpecName "kube-api-access-sknxh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:34:47 crc kubenswrapper[4779]: I0320 16:34:47.365538 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5df520bd-0746-48f5-b615-0e3707572c01-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5df520bd-0746-48f5-b615-0e3707572c01" (UID: "5df520bd-0746-48f5-b615-0e3707572c01"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:34:47 crc kubenswrapper[4779]: I0320 16:34:47.436054 4779 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5df520bd-0746-48f5-b615-0e3707572c01-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:34:47 crc kubenswrapper[4779]: I0320 16:34:47.436087 4779 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5df520bd-0746-48f5-b615-0e3707572c01-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:34:47 crc kubenswrapper[4779]: I0320 16:34:47.436097 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sknxh\" (UniqueName: \"kubernetes.io/projected/5df520bd-0746-48f5-b615-0e3707572c01-kube-api-access-sknxh\") on node \"crc\" DevicePath \"\"" Mar 20 16:34:47 crc kubenswrapper[4779]: I0320 16:34:47.753037 4779 generic.go:334] "Generic (PLEG): container finished" podID="5df520bd-0746-48f5-b615-0e3707572c01" containerID="e111362ca2f132909893d664f6ea5ef8bf256d504f0fa8e2f4423fa9de6e5f71" exitCode=0 Mar 20 16:34:47 crc kubenswrapper[4779]: I0320 16:34:47.753086 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j8wzv" event={"ID":"5df520bd-0746-48f5-b615-0e3707572c01","Type":"ContainerDied","Data":"e111362ca2f132909893d664f6ea5ef8bf256d504f0fa8e2f4423fa9de6e5f71"} Mar 20 16:34:47 crc kubenswrapper[4779]: I0320 16:34:47.753152 4779 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j8wzv" Mar 20 16:34:47 crc kubenswrapper[4779]: I0320 16:34:47.753167 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j8wzv" event={"ID":"5df520bd-0746-48f5-b615-0e3707572c01","Type":"ContainerDied","Data":"20da044ea3c96172a881cfcc9124338b1644db8826a74b64e7ec454abc5d31af"} Mar 20 16:34:47 crc kubenswrapper[4779]: I0320 16:34:47.753196 4779 scope.go:117] "RemoveContainer" containerID="e111362ca2f132909893d664f6ea5ef8bf256d504f0fa8e2f4423fa9de6e5f71" Mar 20 16:34:47 crc kubenswrapper[4779]: I0320 16:34:47.784221 4779 scope.go:117] "RemoveContainer" containerID="7252cd305a22367916eb1fc7479dd7213a994d09c39a3e1a1f90271b70d4301a" Mar 20 16:34:47 crc kubenswrapper[4779]: I0320 16:34:47.795179 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j8wzv"] Mar 20 16:34:47 crc kubenswrapper[4779]: I0320 16:34:47.803564 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j8wzv"] Mar 20 16:34:47 crc kubenswrapper[4779]: I0320 16:34:47.803959 4779 scope.go:117] "RemoveContainer" containerID="8f08f1e146be3e8deab960374a3d23b245cffc56201e990e6d4d16eecaa3bd95" Mar 20 16:34:47 crc kubenswrapper[4779]: I0320 16:34:47.820482 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5df520bd-0746-48f5-b615-0e3707572c01" path="/var/lib/kubelet/pods/5df520bd-0746-48f5-b615-0e3707572c01/volumes" Mar 20 16:34:47 crc kubenswrapper[4779]: I0320 16:34:47.852986 4779 scope.go:117] "RemoveContainer" containerID="e111362ca2f132909893d664f6ea5ef8bf256d504f0fa8e2f4423fa9de6e5f71" Mar 20 16:34:47 crc kubenswrapper[4779]: E0320 16:34:47.853450 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e111362ca2f132909893d664f6ea5ef8bf256d504f0fa8e2f4423fa9de6e5f71\": container with ID 
starting with e111362ca2f132909893d664f6ea5ef8bf256d504f0fa8e2f4423fa9de6e5f71 not found: ID does not exist" containerID="e111362ca2f132909893d664f6ea5ef8bf256d504f0fa8e2f4423fa9de6e5f71" Mar 20 16:34:47 crc kubenswrapper[4779]: I0320 16:34:47.853480 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e111362ca2f132909893d664f6ea5ef8bf256d504f0fa8e2f4423fa9de6e5f71"} err="failed to get container status \"e111362ca2f132909893d664f6ea5ef8bf256d504f0fa8e2f4423fa9de6e5f71\": rpc error: code = NotFound desc = could not find container \"e111362ca2f132909893d664f6ea5ef8bf256d504f0fa8e2f4423fa9de6e5f71\": container with ID starting with e111362ca2f132909893d664f6ea5ef8bf256d504f0fa8e2f4423fa9de6e5f71 not found: ID does not exist" Mar 20 16:34:47 crc kubenswrapper[4779]: I0320 16:34:47.853503 4779 scope.go:117] "RemoveContainer" containerID="7252cd305a22367916eb1fc7479dd7213a994d09c39a3e1a1f90271b70d4301a" Mar 20 16:34:47 crc kubenswrapper[4779]: E0320 16:34:47.853890 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7252cd305a22367916eb1fc7479dd7213a994d09c39a3e1a1f90271b70d4301a\": container with ID starting with 7252cd305a22367916eb1fc7479dd7213a994d09c39a3e1a1f90271b70d4301a not found: ID does not exist" containerID="7252cd305a22367916eb1fc7479dd7213a994d09c39a3e1a1f90271b70d4301a" Mar 20 16:34:47 crc kubenswrapper[4779]: I0320 16:34:47.853933 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7252cd305a22367916eb1fc7479dd7213a994d09c39a3e1a1f90271b70d4301a"} err="failed to get container status \"7252cd305a22367916eb1fc7479dd7213a994d09c39a3e1a1f90271b70d4301a\": rpc error: code = NotFound desc = could not find container \"7252cd305a22367916eb1fc7479dd7213a994d09c39a3e1a1f90271b70d4301a\": container with ID starting with 7252cd305a22367916eb1fc7479dd7213a994d09c39a3e1a1f90271b70d4301a not found: 
ID does not exist" Mar 20 16:34:47 crc kubenswrapper[4779]: I0320 16:34:47.853959 4779 scope.go:117] "RemoveContainer" containerID="8f08f1e146be3e8deab960374a3d23b245cffc56201e990e6d4d16eecaa3bd95" Mar 20 16:34:47 crc kubenswrapper[4779]: E0320 16:34:47.854403 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f08f1e146be3e8deab960374a3d23b245cffc56201e990e6d4d16eecaa3bd95\": container with ID starting with 8f08f1e146be3e8deab960374a3d23b245cffc56201e990e6d4d16eecaa3bd95 not found: ID does not exist" containerID="8f08f1e146be3e8deab960374a3d23b245cffc56201e990e6d4d16eecaa3bd95" Mar 20 16:34:47 crc kubenswrapper[4779]: I0320 16:34:47.854456 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f08f1e146be3e8deab960374a3d23b245cffc56201e990e6d4d16eecaa3bd95"} err="failed to get container status \"8f08f1e146be3e8deab960374a3d23b245cffc56201e990e6d4d16eecaa3bd95\": rpc error: code = NotFound desc = could not find container \"8f08f1e146be3e8deab960374a3d23b245cffc56201e990e6d4d16eecaa3bd95\": container with ID starting with 8f08f1e146be3e8deab960374a3d23b245cffc56201e990e6d4d16eecaa3bd95 not found: ID does not exist" Mar 20 16:34:50 crc kubenswrapper[4779]: I0320 16:34:50.880333 4779 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q55ss" Mar 20 16:34:50 crc kubenswrapper[4779]: I0320 16:34:50.928552 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q55ss"] Mar 20 16:34:51 crc kubenswrapper[4779]: I0320 16:34:51.789065 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-q55ss" podUID="eabe3b81-00ac-49e9-8571-03bc7a8c9561" containerName="registry-server" containerID="cri-o://75da4dd431b5a3b19caf4a44d7b38520a5acbcf5571ddd33b84a39b68f22bb76" gracePeriod=2 Mar 20 
16:34:52 crc kubenswrapper[4779]: I0320 16:34:52.262384 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q55ss" Mar 20 16:34:52 crc kubenswrapper[4779]: I0320 16:34:52.348568 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eabe3b81-00ac-49e9-8571-03bc7a8c9561-catalog-content\") pod \"eabe3b81-00ac-49e9-8571-03bc7a8c9561\" (UID: \"eabe3b81-00ac-49e9-8571-03bc7a8c9561\") " Mar 20 16:34:52 crc kubenswrapper[4779]: I0320 16:34:52.348723 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-565ml\" (UniqueName: \"kubernetes.io/projected/eabe3b81-00ac-49e9-8571-03bc7a8c9561-kube-api-access-565ml\") pod \"eabe3b81-00ac-49e9-8571-03bc7a8c9561\" (UID: \"eabe3b81-00ac-49e9-8571-03bc7a8c9561\") " Mar 20 16:34:52 crc kubenswrapper[4779]: I0320 16:34:52.348764 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eabe3b81-00ac-49e9-8571-03bc7a8c9561-utilities\") pod \"eabe3b81-00ac-49e9-8571-03bc7a8c9561\" (UID: \"eabe3b81-00ac-49e9-8571-03bc7a8c9561\") " Mar 20 16:34:52 crc kubenswrapper[4779]: I0320 16:34:52.349803 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eabe3b81-00ac-49e9-8571-03bc7a8c9561-utilities" (OuterVolumeSpecName: "utilities") pod "eabe3b81-00ac-49e9-8571-03bc7a8c9561" (UID: "eabe3b81-00ac-49e9-8571-03bc7a8c9561"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:34:52 crc kubenswrapper[4779]: I0320 16:34:52.354319 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eabe3b81-00ac-49e9-8571-03bc7a8c9561-kube-api-access-565ml" (OuterVolumeSpecName: "kube-api-access-565ml") pod "eabe3b81-00ac-49e9-8571-03bc7a8c9561" (UID: "eabe3b81-00ac-49e9-8571-03bc7a8c9561"). InnerVolumeSpecName "kube-api-access-565ml". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:34:52 crc kubenswrapper[4779]: I0320 16:34:52.401185 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eabe3b81-00ac-49e9-8571-03bc7a8c9561-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eabe3b81-00ac-49e9-8571-03bc7a8c9561" (UID: "eabe3b81-00ac-49e9-8571-03bc7a8c9561"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:34:52 crc kubenswrapper[4779]: I0320 16:34:52.450566 4779 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eabe3b81-00ac-49e9-8571-03bc7a8c9561-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:34:52 crc kubenswrapper[4779]: I0320 16:34:52.450602 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-565ml\" (UniqueName: \"kubernetes.io/projected/eabe3b81-00ac-49e9-8571-03bc7a8c9561-kube-api-access-565ml\") on node \"crc\" DevicePath \"\"" Mar 20 16:34:52 crc kubenswrapper[4779]: I0320 16:34:52.450614 4779 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eabe3b81-00ac-49e9-8571-03bc7a8c9561-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:34:52 crc kubenswrapper[4779]: I0320 16:34:52.804765 4779 generic.go:334] "Generic (PLEG): container finished" podID="eabe3b81-00ac-49e9-8571-03bc7a8c9561" 
containerID="75da4dd431b5a3b19caf4a44d7b38520a5acbcf5571ddd33b84a39b68f22bb76" exitCode=0 Mar 20 16:34:52 crc kubenswrapper[4779]: I0320 16:34:52.804819 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q55ss" event={"ID":"eabe3b81-00ac-49e9-8571-03bc7a8c9561","Type":"ContainerDied","Data":"75da4dd431b5a3b19caf4a44d7b38520a5acbcf5571ddd33b84a39b68f22bb76"} Mar 20 16:34:52 crc kubenswrapper[4779]: I0320 16:34:52.804853 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q55ss" event={"ID":"eabe3b81-00ac-49e9-8571-03bc7a8c9561","Type":"ContainerDied","Data":"e97fb5f1275464b049f8203b956f713f593a2e7a45ecb2613d5997f96d46d14f"} Mar 20 16:34:52 crc kubenswrapper[4779]: I0320 16:34:52.804874 4779 scope.go:117] "RemoveContainer" containerID="75da4dd431b5a3b19caf4a44d7b38520a5acbcf5571ddd33b84a39b68f22bb76" Mar 20 16:34:52 crc kubenswrapper[4779]: I0320 16:34:52.805037 4779 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q55ss" Mar 20 16:34:52 crc kubenswrapper[4779]: I0320 16:34:52.835434 4779 scope.go:117] "RemoveContainer" containerID="70d4086bf2d19ed3d132cad3da37172c43fba1695e9c58b6e702d40678883896" Mar 20 16:34:52 crc kubenswrapper[4779]: I0320 16:34:52.841852 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q55ss"] Mar 20 16:34:52 crc kubenswrapper[4779]: I0320 16:34:52.850647 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-q55ss"] Mar 20 16:34:52 crc kubenswrapper[4779]: I0320 16:34:52.854905 4779 scope.go:117] "RemoveContainer" containerID="0f99d7414e928dd82c3d799ca8455cb653e6e521e9955a02d517a66eed648648" Mar 20 16:34:52 crc kubenswrapper[4779]: I0320 16:34:52.894968 4779 scope.go:117] "RemoveContainer" containerID="75da4dd431b5a3b19caf4a44d7b38520a5acbcf5571ddd33b84a39b68f22bb76" Mar 20 16:34:52 crc kubenswrapper[4779]: E0320 16:34:52.895681 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75da4dd431b5a3b19caf4a44d7b38520a5acbcf5571ddd33b84a39b68f22bb76\": container with ID starting with 75da4dd431b5a3b19caf4a44d7b38520a5acbcf5571ddd33b84a39b68f22bb76 not found: ID does not exist" containerID="75da4dd431b5a3b19caf4a44d7b38520a5acbcf5571ddd33b84a39b68f22bb76" Mar 20 16:34:52 crc kubenswrapper[4779]: I0320 16:34:52.895718 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75da4dd431b5a3b19caf4a44d7b38520a5acbcf5571ddd33b84a39b68f22bb76"} err="failed to get container status \"75da4dd431b5a3b19caf4a44d7b38520a5acbcf5571ddd33b84a39b68f22bb76\": rpc error: code = NotFound desc = could not find container \"75da4dd431b5a3b19caf4a44d7b38520a5acbcf5571ddd33b84a39b68f22bb76\": container with ID starting with 75da4dd431b5a3b19caf4a44d7b38520a5acbcf5571ddd33b84a39b68f22bb76 not 
found: ID does not exist" Mar 20 16:34:52 crc kubenswrapper[4779]: I0320 16:34:52.895740 4779 scope.go:117] "RemoveContainer" containerID="70d4086bf2d19ed3d132cad3da37172c43fba1695e9c58b6e702d40678883896" Mar 20 16:34:52 crc kubenswrapper[4779]: E0320 16:34:52.895992 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70d4086bf2d19ed3d132cad3da37172c43fba1695e9c58b6e702d40678883896\": container with ID starting with 70d4086bf2d19ed3d132cad3da37172c43fba1695e9c58b6e702d40678883896 not found: ID does not exist" containerID="70d4086bf2d19ed3d132cad3da37172c43fba1695e9c58b6e702d40678883896" Mar 20 16:34:52 crc kubenswrapper[4779]: I0320 16:34:52.896022 4779 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70d4086bf2d19ed3d132cad3da37172c43fba1695e9c58b6e702d40678883896"} err="failed to get container status \"70d4086bf2d19ed3d132cad3da37172c43fba1695e9c58b6e702d40678883896\": rpc error: code = NotFound desc = could not find container \"70d4086bf2d19ed3d132cad3da37172c43fba1695e9c58b6e702d40678883896\": container with ID starting with 70d4086bf2d19ed3d132cad3da37172c43fba1695e9c58b6e702d40678883896 not found: ID does not exist" Mar 20 16:34:52 crc kubenswrapper[4779]: I0320 16:34:52.896040 4779 scope.go:117] "RemoveContainer" containerID="0f99d7414e928dd82c3d799ca8455cb653e6e521e9955a02d517a66eed648648" Mar 20 16:34:52 crc kubenswrapper[4779]: E0320 16:34:52.896254 4779 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f99d7414e928dd82c3d799ca8455cb653e6e521e9955a02d517a66eed648648\": container with ID starting with 0f99d7414e928dd82c3d799ca8455cb653e6e521e9955a02d517a66eed648648 not found: ID does not exist" containerID="0f99d7414e928dd82c3d799ca8455cb653e6e521e9955a02d517a66eed648648" Mar 20 16:34:52 crc kubenswrapper[4779]: I0320 16:34:52.896285 4779 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f99d7414e928dd82c3d799ca8455cb653e6e521e9955a02d517a66eed648648"} err="failed to get container status \"0f99d7414e928dd82c3d799ca8455cb653e6e521e9955a02d517a66eed648648\": rpc error: code = NotFound desc = could not find container \"0f99d7414e928dd82c3d799ca8455cb653e6e521e9955a02d517a66eed648648\": container with ID starting with 0f99d7414e928dd82c3d799ca8455cb653e6e521e9955a02d517a66eed648648 not found: ID does not exist" Mar 20 16:34:53 crc kubenswrapper[4779]: I0320 16:34:53.818889 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eabe3b81-00ac-49e9-8571-03bc7a8c9561" path="/var/lib/kubelet/pods/eabe3b81-00ac-49e9-8571-03bc7a8c9561/volumes" Mar 20 16:36:00 crc kubenswrapper[4779]: I0320 16:36:00.175250 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567076-q4cmx"] Mar 20 16:36:00 crc kubenswrapper[4779]: E0320 16:36:00.176316 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eabe3b81-00ac-49e9-8571-03bc7a8c9561" containerName="extract-content" Mar 20 16:36:00 crc kubenswrapper[4779]: I0320 16:36:00.176335 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="eabe3b81-00ac-49e9-8571-03bc7a8c9561" containerName="extract-content" Mar 20 16:36:00 crc kubenswrapper[4779]: E0320 16:36:00.176347 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5df520bd-0746-48f5-b615-0e3707572c01" containerName="extract-utilities" Mar 20 16:36:00 crc kubenswrapper[4779]: I0320 16:36:00.176354 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="5df520bd-0746-48f5-b615-0e3707572c01" containerName="extract-utilities" Mar 20 16:36:00 crc kubenswrapper[4779]: E0320 16:36:00.176370 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5df520bd-0746-48f5-b615-0e3707572c01" containerName="registry-server" Mar 20 16:36:00 crc kubenswrapper[4779]: I0320 
16:36:00.176379 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="5df520bd-0746-48f5-b615-0e3707572c01" containerName="registry-server" Mar 20 16:36:00 crc kubenswrapper[4779]: E0320 16:36:00.176395 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eabe3b81-00ac-49e9-8571-03bc7a8c9561" containerName="extract-utilities" Mar 20 16:36:00 crc kubenswrapper[4779]: I0320 16:36:00.176402 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="eabe3b81-00ac-49e9-8571-03bc7a8c9561" containerName="extract-utilities" Mar 20 16:36:00 crc kubenswrapper[4779]: E0320 16:36:00.176453 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5df520bd-0746-48f5-b615-0e3707572c01" containerName="extract-content" Mar 20 16:36:00 crc kubenswrapper[4779]: I0320 16:36:00.176460 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="5df520bd-0746-48f5-b615-0e3707572c01" containerName="extract-content" Mar 20 16:36:00 crc kubenswrapper[4779]: E0320 16:36:00.176470 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eabe3b81-00ac-49e9-8571-03bc7a8c9561" containerName="registry-server" Mar 20 16:36:00 crc kubenswrapper[4779]: I0320 16:36:00.176477 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="eabe3b81-00ac-49e9-8571-03bc7a8c9561" containerName="registry-server" Mar 20 16:36:00 crc kubenswrapper[4779]: I0320 16:36:00.176696 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="5df520bd-0746-48f5-b615-0e3707572c01" containerName="registry-server" Mar 20 16:36:00 crc kubenswrapper[4779]: I0320 16:36:00.176710 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="eabe3b81-00ac-49e9-8571-03bc7a8c9561" containerName="registry-server" Mar 20 16:36:00 crc kubenswrapper[4779]: I0320 16:36:00.177533 4779 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567076-q4cmx"
Mar 20 16:36:00 crc kubenswrapper[4779]: I0320 16:36:00.180601 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:36:00 crc kubenswrapper[4779]: I0320 16:36:00.180855 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-92t9d"
Mar 20 16:36:00 crc kubenswrapper[4779]: I0320 16:36:00.190298 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:36:00 crc kubenswrapper[4779]: I0320 16:36:00.198973 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567076-q4cmx"]
Mar 20 16:36:00 crc kubenswrapper[4779]: I0320 16:36:00.335909 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrj65\" (UniqueName: \"kubernetes.io/projected/74d3c78f-071d-4d10-b283-efa4aa686af0-kube-api-access-rrj65\") pod \"auto-csr-approver-29567076-q4cmx\" (UID: \"74d3c78f-071d-4d10-b283-efa4aa686af0\") " pod="openshift-infra/auto-csr-approver-29567076-q4cmx"
Mar 20 16:36:00 crc kubenswrapper[4779]: I0320 16:36:00.437867 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrj65\" (UniqueName: \"kubernetes.io/projected/74d3c78f-071d-4d10-b283-efa4aa686af0-kube-api-access-rrj65\") pod \"auto-csr-approver-29567076-q4cmx\" (UID: \"74d3c78f-071d-4d10-b283-efa4aa686af0\") " pod="openshift-infra/auto-csr-approver-29567076-q4cmx"
Mar 20 16:36:00 crc kubenswrapper[4779]: I0320 16:36:00.459976 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrj65\" (UniqueName: \"kubernetes.io/projected/74d3c78f-071d-4d10-b283-efa4aa686af0-kube-api-access-rrj65\") pod \"auto-csr-approver-29567076-q4cmx\" (UID: \"74d3c78f-071d-4d10-b283-efa4aa686af0\") " pod="openshift-infra/auto-csr-approver-29567076-q4cmx"
Mar 20 16:36:00 crc kubenswrapper[4779]: I0320 16:36:00.495048 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567076-q4cmx"
Mar 20 16:36:00 crc kubenswrapper[4779]: I0320 16:36:00.929533 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567076-q4cmx"]
Mar 20 16:36:01 crc kubenswrapper[4779]: I0320 16:36:01.099571 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567076-q4cmx" event={"ID":"74d3c78f-071d-4d10-b283-efa4aa686af0","Type":"ContainerStarted","Data":"f7cc77bb904779394c77273548dc57a30139226f564500302dfedcea017a5e4b"}
Mar 20 16:36:03 crc kubenswrapper[4779]: I0320 16:36:03.117642 4779 generic.go:334] "Generic (PLEG): container finished" podID="74d3c78f-071d-4d10-b283-efa4aa686af0" containerID="61b772247cf89814f2930796c1574787cd6be6458d4625edd33f912814bb159b" exitCode=0
Mar 20 16:36:03 crc kubenswrapper[4779]: I0320 16:36:03.117734 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567076-q4cmx" event={"ID":"74d3c78f-071d-4d10-b283-efa4aa686af0","Type":"ContainerDied","Data":"61b772247cf89814f2930796c1574787cd6be6458d4625edd33f912814bb159b"}
Mar 20 16:36:04 crc kubenswrapper[4779]: I0320 16:36:04.592184 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567076-q4cmx"
Mar 20 16:36:04 crc kubenswrapper[4779]: I0320 16:36:04.720239 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrj65\" (UniqueName: \"kubernetes.io/projected/74d3c78f-071d-4d10-b283-efa4aa686af0-kube-api-access-rrj65\") pod \"74d3c78f-071d-4d10-b283-efa4aa686af0\" (UID: \"74d3c78f-071d-4d10-b283-efa4aa686af0\") "
Mar 20 16:36:04 crc kubenswrapper[4779]: I0320 16:36:04.732342 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74d3c78f-071d-4d10-b283-efa4aa686af0-kube-api-access-rrj65" (OuterVolumeSpecName: "kube-api-access-rrj65") pod "74d3c78f-071d-4d10-b283-efa4aa686af0" (UID: "74d3c78f-071d-4d10-b283-efa4aa686af0"). InnerVolumeSpecName "kube-api-access-rrj65". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:36:04 crc kubenswrapper[4779]: I0320 16:36:04.823730 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrj65\" (UniqueName: \"kubernetes.io/projected/74d3c78f-071d-4d10-b283-efa4aa686af0-kube-api-access-rrj65\") on node \"crc\" DevicePath \"\""
Mar 20 16:36:05 crc kubenswrapper[4779]: I0320 16:36:05.140184 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567076-q4cmx" event={"ID":"74d3c78f-071d-4d10-b283-efa4aa686af0","Type":"ContainerDied","Data":"f7cc77bb904779394c77273548dc57a30139226f564500302dfedcea017a5e4b"}
Mar 20 16:36:05 crc kubenswrapper[4779]: I0320 16:36:05.140259 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7cc77bb904779394c77273548dc57a30139226f564500302dfedcea017a5e4b"
Mar 20 16:36:05 crc kubenswrapper[4779]: I0320 16:36:05.140344 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567076-q4cmx"
Mar 20 16:36:05 crc kubenswrapper[4779]: I0320 16:36:05.689286 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567070-qwnmr"]
Mar 20 16:36:05 crc kubenswrapper[4779]: I0320 16:36:05.699434 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567070-qwnmr"]
Mar 20 16:36:05 crc kubenswrapper[4779]: I0320 16:36:05.823209 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3b97042-5c7d-47fe-a06f-83ed3bd32049" path="/var/lib/kubelet/pods/e3b97042-5c7d-47fe-a06f-83ed3bd32049/volumes"
Mar 20 16:36:25 crc kubenswrapper[4779]: I0320 16:36:25.150486 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:36:25 crc kubenswrapper[4779]: I0320 16:36:25.151050 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:36:45 crc kubenswrapper[4779]: I0320 16:36:45.822339 4779 scope.go:117] "RemoveContainer" containerID="6cbbf534388b2499e5be8799926a6c0dc9db2d3f53e1b0db1da60e1c3bf471d7"
Mar 20 16:36:55 crc kubenswrapper[4779]: I0320 16:36:55.150123 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:36:55 crc kubenswrapper[4779]: I0320 16:36:55.150643 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:37:25 crc kubenswrapper[4779]: I0320 16:37:25.150376 4779 patch_prober.go:28] interesting pod/machine-config-daemon-fs4qg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:37:25 crc kubenswrapper[4779]: I0320 16:37:25.151035 4779 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:37:25 crc kubenswrapper[4779]: I0320 16:37:25.151079 4779 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg"
Mar 20 16:37:25 crc kubenswrapper[4779]: I0320 16:37:25.151883 4779 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e3b1c6a073a5beafb71de9c08ba428f8b4804812454d79d9c05881f1ac273083"} pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 16:37:25 crc kubenswrapper[4779]: I0320 16:37:25.151941 4779 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc" containerName="machine-config-daemon" containerID="cri-o://e3b1c6a073a5beafb71de9c08ba428f8b4804812454d79d9c05881f1ac273083" gracePeriod=600
Mar 20 16:37:25 crc kubenswrapper[4779]: E0320 16:37:25.276437 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc"
Mar 20 16:37:26 crc kubenswrapper[4779]: I0320 16:37:26.138247 4779 generic.go:334] "Generic (PLEG): container finished" podID="451fc579-db57-4b36-a775-6d2986de3efc" containerID="e3b1c6a073a5beafb71de9c08ba428f8b4804812454d79d9c05881f1ac273083" exitCode=0
Mar 20 16:37:26 crc kubenswrapper[4779]: I0320 16:37:26.138291 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" event={"ID":"451fc579-db57-4b36-a775-6d2986de3efc","Type":"ContainerDied","Data":"e3b1c6a073a5beafb71de9c08ba428f8b4804812454d79d9c05881f1ac273083"}
Mar 20 16:37:26 crc kubenswrapper[4779]: I0320 16:37:26.138325 4779 scope.go:117] "RemoveContainer" containerID="bae06906235ec92fdf13286c3daf0b54ffa138e8c7e77f0062cb5ff877c3d7b6"
Mar 20 16:37:26 crc kubenswrapper[4779]: I0320 16:37:26.138912 4779 scope.go:117] "RemoveContainer" containerID="e3b1c6a073a5beafb71de9c08ba428f8b4804812454d79d9c05881f1ac273083"
Mar 20 16:37:26 crc kubenswrapper[4779]: E0320 16:37:26.139187 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc"
Mar 20 16:37:37 crc kubenswrapper[4779]: I0320 16:37:37.809443 4779 scope.go:117] "RemoveContainer" containerID="e3b1c6a073a5beafb71de9c08ba428f8b4804812454d79d9c05881f1ac273083"
Mar 20 16:37:37 crc kubenswrapper[4779]: E0320 16:37:37.810123 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc"
Mar 20 16:37:50 crc kubenswrapper[4779]: I0320 16:37:50.808853 4779 scope.go:117] "RemoveContainer" containerID="e3b1c6a073a5beafb71de9c08ba428f8b4804812454d79d9c05881f1ac273083"
Mar 20 16:37:50 crc kubenswrapper[4779]: E0320 16:37:50.809829 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc"
Mar 20 16:38:00 crc kubenswrapper[4779]: I0320 16:38:00.156076 4779 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567078-tpnjk"]
Mar 20 16:38:00 crc kubenswrapper[4779]: E0320 16:38:00.157045 4779 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d3c78f-071d-4d10-b283-efa4aa686af0" containerName="oc"
Mar 20 16:38:00 crc kubenswrapper[4779]: I0320 16:38:00.157061 4779 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d3c78f-071d-4d10-b283-efa4aa686af0" containerName="oc"
Mar 20 16:38:00 crc kubenswrapper[4779]: I0320 16:38:00.157270 4779 memory_manager.go:354] "RemoveStaleState removing state" podUID="74d3c78f-071d-4d10-b283-efa4aa686af0" containerName="oc"
Mar 20 16:38:00 crc kubenswrapper[4779]: I0320 16:38:00.157918 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567078-tpnjk"
Mar 20 16:38:00 crc kubenswrapper[4779]: I0320 16:38:00.160236 4779 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-92t9d"
Mar 20 16:38:00 crc kubenswrapper[4779]: I0320 16:38:00.160701 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:38:00 crc kubenswrapper[4779]: I0320 16:38:00.160714 4779 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:38:00 crc kubenswrapper[4779]: I0320 16:38:00.172880 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567078-tpnjk"]
Mar 20 16:38:00 crc kubenswrapper[4779]: I0320 16:38:00.272653 4779 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h666\" (UniqueName: \"kubernetes.io/projected/314fc6f1-cf4c-4c83-a11c-164cab79446d-kube-api-access-9h666\") pod \"auto-csr-approver-29567078-tpnjk\" (UID: \"314fc6f1-cf4c-4c83-a11c-164cab79446d\") " pod="openshift-infra/auto-csr-approver-29567078-tpnjk"
Mar 20 16:38:00 crc kubenswrapper[4779]: I0320 16:38:00.374900 4779 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h666\" (UniqueName: \"kubernetes.io/projected/314fc6f1-cf4c-4c83-a11c-164cab79446d-kube-api-access-9h666\") pod \"auto-csr-approver-29567078-tpnjk\" (UID: \"314fc6f1-cf4c-4c83-a11c-164cab79446d\") " pod="openshift-infra/auto-csr-approver-29567078-tpnjk"
Mar 20 16:38:00 crc kubenswrapper[4779]: I0320 16:38:00.397325 4779 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h666\" (UniqueName: \"kubernetes.io/projected/314fc6f1-cf4c-4c83-a11c-164cab79446d-kube-api-access-9h666\") pod \"auto-csr-approver-29567078-tpnjk\" (UID: \"314fc6f1-cf4c-4c83-a11c-164cab79446d\") " pod="openshift-infra/auto-csr-approver-29567078-tpnjk"
Mar 20 16:38:00 crc kubenswrapper[4779]: I0320 16:38:00.477169 4779 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567078-tpnjk"
Mar 20 16:38:00 crc kubenswrapper[4779]: I0320 16:38:00.918878 4779 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567078-tpnjk"]
Mar 20 16:38:01 crc kubenswrapper[4779]: I0320 16:38:01.505386 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567078-tpnjk" event={"ID":"314fc6f1-cf4c-4c83-a11c-164cab79446d","Type":"ContainerStarted","Data":"abf620cd9ae59eb26f4a553d3136c0e2dd2125233a62662e77421d60ad32e053"}
Mar 20 16:38:01 crc kubenswrapper[4779]: I0320 16:38:01.809717 4779 scope.go:117] "RemoveContainer" containerID="e3b1c6a073a5beafb71de9c08ba428f8b4804812454d79d9c05881f1ac273083"
Mar 20 16:38:01 crc kubenswrapper[4779]: E0320 16:38:01.809991 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc"
Mar 20 16:38:03 crc kubenswrapper[4779]: I0320 16:38:03.522319 4779 generic.go:334] "Generic (PLEG): container finished" podID="314fc6f1-cf4c-4c83-a11c-164cab79446d" containerID="929da75ff81292d26b44ad84ca58fa25a8fdc677b4bfc6fbb426df3b594cf11f" exitCode=0
Mar 20 16:38:03 crc kubenswrapper[4779]: I0320 16:38:03.522377 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567078-tpnjk" event={"ID":"314fc6f1-cf4c-4c83-a11c-164cab79446d","Type":"ContainerDied","Data":"929da75ff81292d26b44ad84ca58fa25a8fdc677b4bfc6fbb426df3b594cf11f"}
Mar 20 16:38:05 crc kubenswrapper[4779]: I0320 16:38:05.025820 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567078-tpnjk"
Mar 20 16:38:05 crc kubenswrapper[4779]: I0320 16:38:05.171554 4779 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9h666\" (UniqueName: \"kubernetes.io/projected/314fc6f1-cf4c-4c83-a11c-164cab79446d-kube-api-access-9h666\") pod \"314fc6f1-cf4c-4c83-a11c-164cab79446d\" (UID: \"314fc6f1-cf4c-4c83-a11c-164cab79446d\") "
Mar 20 16:38:05 crc kubenswrapper[4779]: I0320 16:38:05.177354 4779 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/314fc6f1-cf4c-4c83-a11c-164cab79446d-kube-api-access-9h666" (OuterVolumeSpecName: "kube-api-access-9h666") pod "314fc6f1-cf4c-4c83-a11c-164cab79446d" (UID: "314fc6f1-cf4c-4c83-a11c-164cab79446d"). InnerVolumeSpecName "kube-api-access-9h666". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:38:05 crc kubenswrapper[4779]: I0320 16:38:05.274022 4779 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9h666\" (UniqueName: \"kubernetes.io/projected/314fc6f1-cf4c-4c83-a11c-164cab79446d-kube-api-access-9h666\") on node \"crc\" DevicePath \"\""
Mar 20 16:38:05 crc kubenswrapper[4779]: I0320 16:38:05.541945 4779 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567078-tpnjk" event={"ID":"314fc6f1-cf4c-4c83-a11c-164cab79446d","Type":"ContainerDied","Data":"abf620cd9ae59eb26f4a553d3136c0e2dd2125233a62662e77421d60ad32e053"}
Mar 20 16:38:05 crc kubenswrapper[4779]: I0320 16:38:05.541992 4779 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abf620cd9ae59eb26f4a553d3136c0e2dd2125233a62662e77421d60ad32e053"
Mar 20 16:38:05 crc kubenswrapper[4779]: I0320 16:38:05.541996 4779 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567078-tpnjk"
Mar 20 16:38:06 crc kubenswrapper[4779]: I0320 16:38:06.109494 4779 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567072-j6vqr"]
Mar 20 16:38:06 crc kubenswrapper[4779]: I0320 16:38:06.119956 4779 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567072-j6vqr"]
Mar 20 16:38:07 crc kubenswrapper[4779]: I0320 16:38:07.822345 4779 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcb1fa3f-46e3-4dc9-82db-85ea92b968a7" path="/var/lib/kubelet/pods/dcb1fa3f-46e3-4dc9-82db-85ea92b968a7/volumes"
Mar 20 16:38:14 crc kubenswrapper[4779]: I0320 16:38:14.809369 4779 scope.go:117] "RemoveContainer" containerID="e3b1c6a073a5beafb71de9c08ba428f8b4804812454d79d9c05881f1ac273083"
Mar 20 16:38:14 crc kubenswrapper[4779]: E0320 16:38:14.810133 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc"
Mar 20 16:38:27 crc kubenswrapper[4779]: I0320 16:38:27.809283 4779 scope.go:117] "RemoveContainer" containerID="e3b1c6a073a5beafb71de9c08ba428f8b4804812454d79d9c05881f1ac273083"
Mar 20 16:38:27 crc kubenswrapper[4779]: E0320 16:38:27.810258 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc"
Mar 20 16:38:39 crc kubenswrapper[4779]: I0320 16:38:39.810212 4779 scope.go:117] "RemoveContainer" containerID="e3b1c6a073a5beafb71de9c08ba428f8b4804812454d79d9c05881f1ac273083"
Mar 20 16:38:39 crc kubenswrapper[4779]: E0320 16:38:39.811212 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc"
Mar 20 16:38:45 crc kubenswrapper[4779]: I0320 16:38:45.913267 4779 scope.go:117] "RemoveContainer" containerID="028583d34ececc028795dd2728810f010c8526f674d71e3f4fcfcee01f595d76"
Mar 20 16:38:50 crc kubenswrapper[4779]: I0320 16:38:50.809135 4779 scope.go:117] "RemoveContainer" containerID="e3b1c6a073a5beafb71de9c08ba428f8b4804812454d79d9c05881f1ac273083"
Mar 20 16:38:50 crc kubenswrapper[4779]: E0320 16:38:50.810057 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc"
Mar 20 16:39:03 crc kubenswrapper[4779]: I0320 16:39:03.815798 4779 scope.go:117] "RemoveContainer" containerID="e3b1c6a073a5beafb71de9c08ba428f8b4804812454d79d9c05881f1ac273083"
Mar 20 16:39:03 crc kubenswrapper[4779]: E0320 16:39:03.816595 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc"
Mar 20 16:39:15 crc kubenswrapper[4779]: I0320 16:39:15.809906 4779 scope.go:117] "RemoveContainer" containerID="e3b1c6a073a5beafb71de9c08ba428f8b4804812454d79d9c05881f1ac273083"
Mar 20 16:39:15 crc kubenswrapper[4779]: E0320 16:39:15.810806 4779 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fs4qg_openshift-machine-config-operator(451fc579-db57-4b36-a775-6d2986de3efc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fs4qg" podUID="451fc579-db57-4b36-a775-6d2986de3efc"